This week’s guest is former Senate staffer and Conservative Partnership Institute Policy Director Rachel Bovard.

Rachel and Inez discuss the contours of our new public-private partnership in tyranny, the novelty of the power woke tech companies wield across national borders, and whether there will ever be a revolt of the competent within the tech industry itself. They also talk about the influence of China on American companies, and whether today’s Millennials and Gen Z might be the last American generations to remember when shelves were fully stocked and stuff just worked.

High Noon is an intellectual download featuring conversations that make possible a free society. The podcast features interesting thinkers from all parts of the political spectrum to discuss the most controversial subjects of the day in a way that hopes to advance our common American future. Hosted by Inez Stepman of Independent Women’s Forum.


TRANSCRIPT

Inez Stepman:

Welcome to High Noon, where we talk about controversial subjects with interesting people. I’m so pleased to have Rachel Bovard back on High Noon. We had her on, if you recall, about a year ago with Vivek Ramaswamy, where they talked about the tyranny being imposed by tech companies. Of course, Rachel deserves her own episode because she has been truly the canary in the coal mine on this. She has been writing and speaking about this for years, to much scorn not only from folks on the left but from folks on the right as well. I think if anything has happened between the last time we spoke on High Noon, Rachel, and now, it’s that the fact that you are right about a lot of these things has become even more blatantly obvious. Going back a year ago, I think it was already pretty obvious, but you have been writing, as I said, and speaking about this for at least half a decade at this point.

For those who are not aware of Rachel’s background, she worked for Rand Paul in the Senate. She actually comes out of the libertarian side of the Senate, but has been working especially on tech policy for quite some time. She is also policy director over at CPI, that’s Conservative Partnership Institute, and the senior tech columnist at The Federalist. You can find her work at The American Mind. She’s a Lincoln fellow at the Claremont Institute as well. She’s just all over the place, but I think has been truly notable in teaching us all what it means to be right, and not to like being right, because that means that America has slipped deeper into a public-private partnership in tyranny, which is maybe how I would describe this. Feel free to steal that, Rachel.

Rachel Bovard:

I love it.

Inez Stepman:

Let’s kick off this discussion with this. Pretty recently, we got the news that there’s going to be a massive corporate lawsuit to try to enforce the bid that Elon Musk put in on Twitter. A lot of folks were really excited about this possibility of Elon Musk taking over Twitter. Maybe he still will be forced into taking over Twitter. It seemed like it was the last gasp of the idea that we might be able to solve the problem of tech censorship through free market means. We might be able to get our billionaire champion, Elon Musk, and put him up against Jeff Bezos, and we were going to watch them all rock ’em, sock ’em, fight for our basic cultural understanding of free speech. Now that Elon Musk is trying to pull out of that deal, have you seen people coming over to your side and saying, okay, well, if even Elon Musk could not save us in this, does this really have to be a public policy question? Is anybody convinced, or is everybody just still as dug in as they were before?

Rachel Bovard:

Yeah, our oligarchs failed us. I’m so distressed by the fact that the right’s billion-dollar champion failed. Who could have seen this coming?

The Elon Musk takeover, I think, sparked, or at least forced, a conversation that wasn’t really happening in tech policy. When you talk about the policy aspects of the tech debate, when you talk about Google, Amazon, Facebook, these companies are all lumped together because they are just massive. The cultural impact has been a separate discussion from the fact of how big they are, their market impact, and that’s driven a lot of the discussion on Capitol Hill. Twitter has always been a little bit separate in that regard because I think when most people think about social media censorship, Twitter goes to the top of the stack. They’re just so blatant compared to the other platforms, even though I think all of them engage in it.

It forced a discussion about the role that Twitter plays in our speech norms and how that’s developed over the last 10 years. It forced people to reckon with the fact that Twitter has escaped notice, I think, by a lot of the competition policy types on Capitol Hill because it is actually quite small compared to the other companies, but it has an outsize impact in shaping the national narrative. What we talk about on the news, what we talk about in leading newspapers, and increasingly, what happens in companies themselves, starts and ends on Twitter. You’ve seen people be fired over what they say on Twitter. You’ve seen companies change their entire way of doing business because of what happens on Twitter. The New York Times and NPR are, arguably, driven by what happens on Twitter. I think people were forced to, at least, recognize the fact that this has changed how we communicate, how we interact, and really has offline consequences for how we behave.

Now, has that provoked some sort of policy response? Yes and no. I think everybody recognizes that because of the platforms’ power, there needs to be some rules of the road here. For the right, we haven’t really seen our old remedies work. Elon Musk may or may not buy Twitter. He may not actually enforce some sort of cultural reckoning with Twitter and change the way it does business.

The other remedy is to build alternative platforms, and we haven’t really seen that take off either. The paramount example of this is Parler and what happened to Parler. But even with Truth Social, we aren’t seeing a runaway alternative to Twitter blow the doors off the marketplace. Twitter presents, I think, an interesting problem in the sense that we don’t really have a policy solution for a platform that, arguably, has had the most impact in changing the contours of how we talk to each other and how we interact.

Inez Stepman:

Yeah. It forces, I think, not just policymakers but ordinary Americans to confront what is essentially a question about what we want the shape of our regime to look like. I know from my own personal experience, being pretty much a down-the-line Reaganite, let’s say, five years ago, and then being convinced by you and others on some of these questions, at first, I really wanted to wall off a couple of these questions from the rest of the way that I thought about politics because it was like, “Okay, so this is a unique problem.”

Increasingly, that public-private communication, where the government signals to private companies… I think the best example of this is the vaccine mandate, where people were pretty certain that that mandate would not stand in the courts, but it stood long enough to send out the Bat-Signal to all of the companies, who then enforced a vaccine mandate on millions of Americans. Then the vaccine mandate was struck down, but it didn’t really matter because it had already been enforced.

We see that kind of coordination without formal coordination happening more and more. That problem seems to repeat itself now, especially in our cultural debates, where we see private companies acting either with the government or in concert with each other in a weird collusion that looks very different from the traditional antitrust context. It looks very different from the robber barons coordinating, vertical and horizontal integration, buying up everything, although there is some of that. We can talk about Google in a moment.

It seems like that kind of collusion or coordination is happening almost on the cultural level, and it’s starting to impact everything. We can’t wall it off from our politics. We are going to have to reshape our politics and our political response to what is essentially, and maybe you can correct me, I don’t think that there has been this kind of tyranny or this particular shape of censorship, for example, to pick just one of many problems. It’s not comparable in that way to the Soviet Union. To some extent, it comes out of China and their social credit system. But there, it’s ruthlessly enforced by the government. Here, we are, for the first time, as far as I can tell, confronting a question of private tyranny.

Rachel Bovard:

Yeah. No, I think that’s right, and I think that’s why it’s befuddled so many of us on the right who come from, again, that classical Reaganite, classical Friedman-esque type of economics, which actually does put the market and the government in two completely separate spheres. But here, you see this sort of collusion that our laws just aren’t prepared to deal with.

It’s interesting, because this is what got me interested in this question in the first place: back in 2013, when I was working for Senator Paul, the Snowden leaks came out. There’s so much information to parse in what Edward Snowden revealed, but what was most interesting to me was one aspect of the PRISM program. The PRISM program, if you remember, was that mass surveillance network that the NSA was running on Americans, warrantless spying, essentially. The way they were doing it was by using the backdoors of social media companies and of the burgeoning tech economy. The NSA was granted access to Google; Google gave them access to email. Facebook gave them pictures and videos and voice recordings of people. Even Dropbox was giving the government access to all the data that Americans were giving them.

To me, that was, again, the moment you’re describing right now, which is, this is an awkward fusion here. You’ve always seen the government try to compel this information from companies, but they have to go through a process. There’s a due process for doing this kind of work. Then there’s just the flinging open of the backdoor, which is what was going on here. To me, that turned on so many light bulbs for what could be. No one was really talking about that aspect. Obviously, there were a million other things to talk about, but that piqued my interest.

In the libertarian world, when you were looking at these companies that run on your data, there was always a little bit of head-scratching about what could happen, but when you saw that fusion of corporate power and government power, that should have sent everybody scrambling, because it foreshadowed what we live in now: this corporate state, the surveillance capitalism that developed out of that, now being turned toward cultural enforcement. We don’t have an answer to that.

I think in what I’ve tried to write and think about, I always come back to the fact that, again, these companies also exist not just with technology and surveillance that we’ve never seen but also at a scale that we’ve never seen. When you’re talking about speech norms being changed, you have these companies that just exert massive control over the discourse. We have never seen this before. I think in many ways we’re trying to catch up.

We’re trying to deal with the fact that Google just tracks you at every point in your life, essentially. Even if you’re not using search or email, if you have any access to any app that you need to engage in the modern world, essentially, Google is the backend of that app. What do we do with a company that can reshape who you are in the digital realm? That’s a lawless space. There’s no policy around how these companies are restrained.

When a government can access that, when you have that ideological fusion, which I think you’re seeing right now, between the far left and these companies, that’s a weapon. I think you’re starting to see these companies weaponize a lot of this information, a lot of this data, and take, as you point out, the tip of the cap from the government to say, “Oh, they don’t want you to talk about this? Well, you can’t talk about this on our platform.”

We saw this blatantly with COVID. That was probably the most blatant example, was the signaling from the White House Briefing Room to say, “Well, of course we’re telling Facebook what they can and cannot allow about what people say about COVID.” If you can’t see the spinning red siren around how we talk and interact, and the impact these companies are having on it, I don’t know what else to tell you at this point, because we’re in it. We’re living in the dystopia.

Inez Stepman:

Yeah. It’s one level even more complicated, I think, because of the existence of the administrative state and the permanent positions in government, because that is really where those signals are coming from. It’s not just coming from the president or from senators in hearings, although it is that too. The politics of this are only going one way, and there’s a shared understanding. For example, when Donald Trump was in office, it’s not like Facebook was really taking its cues from Donald Trump. They were still taking their cues from the larger administrative state, which is very, very opposed to Trump’s governing philosophy.

That makes this one level even more disturbing. Theoretically, if they were listening to the elected government, at least the American people and the position of the majority of Americans would influence how these companies behaved. What we’re seeing is the opposite. We’re seeing that there’s a minority position, but that has captured this swath of the ruling class. I would never use that phrase five years ago, but I think it’s the only one that fits, that cuts across those public and private lines. The CEO of Google and the head of any government agency have vastly more in common with each other than they used to. Frankly, the CEO of Google and that agency head and the financier of Hollywood movies, they all have so much more in common with how they see the world than was the case even 20 years ago, and certainly the case 50 or 100 or 150 years ago.

How do you even combat this kind of coordination that is based much less on either direct communication or direct application of power, and much more on, essentially, wink, wink, nod, nod, “We all see the world the same way. These people are really crazy, and we know who should not be spreading ‘disinformation.’ We’re thinking about the same people when we talk to each other,” as opposed to a broader definition that encompasses more of society?

Rachel Bovard:

Yeah. One of the elements I think you’re touching on is the ideological enforcement and adherence at the top of that ruling class. When you have the people at the top controlling the levers of power in society, they all think the same way. But here is what I think makes the tech companies unique. If the CEO of Target, let’s just say, imposes a woke agenda, meaning when you go to Target, you have whatever woke psychology pushed down your throat in all the advertisements, that’s one thing. But for the CEO of Google to think that way and to want to enforce that through its company is tremendously different because of how much data these companies have amassed on you, what they know about you, and what they then can do with it. That’s what makes, I think, tech companies a little bit different than “woke capital” writ large, because these companies really do have your entire life in their servers.

I pointed this out recently in a column I wrote for the New York Post when Google said, “Oh, by the way, we’re going to delete the data of…” This was in response to the Dobbs decision from the Supreme Court that overturned Roe v. Wade. They said, “Okay, well, if you go to an abortion clinic or any of the other five areas that we’ve deemed sensitive, very personal locations, like weight loss clinics, fertility clinics, things like this, we’ll delete your location data.” That caught my eye because it gets to this idea that these companies are tracking you wherever you go, and your location is just a tiny part of the data they collect on you. This was, for the first time, Google saying, “We are going to selectively go into your profile and remove the data that we think says something about you.”

Well, the converse of that is also true. We can go into your profile and find data that we think says something about you. And if we are weaponized in a specific way, or if we believe that we are ideologically empowered to act in a certain way, then the gloves are off with what they can share with the government, what they can share about you, and with an ideological bent. “Oh, you attended this political rally. You may not have gone to an abortion clinic, but you went to this pro-life rally. We’re going to delete the data of one, and we’re going to amplify the data of the other. We don’t like the fact that you go to this specific Catholic church on Sundays,” or whatever the left is deeming problematic that day.

If Google aligns itself with that ideology, they are in a position to rewrite who you are on a social script, how the rest of the world views you. As we’ve seen on so many occasions, it’s not just the government investigating you. It’s not just action being taken against you. It’s that these companies can rip apart your entire life. It’s a ripple effect. If a Twitter mob goes after you, suddenly companies don’t want to associate with you. Suddenly your bank doesn’t want to work with you. Your employer wants you terminated. They can change the entire way your life is seen and the way in which you interact with society. That, I think, is a tremendous power that we haven’t even started to discuss.

When I write about these things, again, to me, it comes back to scale. There’s a couple things that have to happen. I don’t think companies should be able to exist in this data universe where one company, like Google, controls all of this data about you without any parameters whatsoever. We have no federal data privacy law. We should at least debate whether we want one. In some respect, we should have some rules of the road for data and what these companies know about you.

More than that, we should have, I think, some rules about how these companies govern themselves if they’re going to exist at the scale in which they do. Even if you got rid of Facebook, even if Facebook fell, at some point another speech alternative is going to arise that has the same power that Facebook has. Again, do we want to go back to this idea that, “Well, if Elon Musk owns it, it’s fine, and any company Jeff Bezos owns is evil”? I don’t know that that’s the solution we want.

Inez Stepman:

That brings up the prospect of blackmail as well.

Rachel Bovard:

Yeah.

Inez Stepman:

That’s what I was thinking about while you were talking. I was like, “So they own everything about you. What’s to stop them from going to even, let’s say, legislators who hold policy positions that they don’t like and saying, ‘Well, this might just leak. Your search history might leak, what you’re clicking on online'”?

Then the second thing that was going… These are truly scary things. That was the first thing that went through my mind, is, “My God, this is a treasure trove for blackmail.” Then the second thing, the immediately second thing that went through my mind, was something that Rod Dreher said on this show maybe six months ago. He said, “I do think we’re rapidly approaching the time when, essentially, gatherings, even in person, that are contrary in some way to the dominant narrative, we’re going to have to leave our phones at home.”

Rachel Bovard:

Yeah.

Inez Stepman:

Do you think that’s coming?

Rachel Bovard:

I do. I hate saying this. I hate the fact that this is America in 2022 and we’re having this conversation, but it has been so apparent in the last five years how badly people in our politics, especially people on the left who are triggered by any kind of discussion they don’t like, want to control the flow of information. They want to control what information you have access to.

If you remember, when Clubhouse was peaking, the conversation app that took off during the pandemic when everybody was at home, The New York Times, I think it was, complained about the “unfettered conversations” that might be happening on the app, because the app, in their mind, wasn’t doing enough to crack down on what people were saying. Those awful independent thinkers, they might say something, misinformation might spread, because people were just talking out loud.

That is out there, and the logical conclusion of that type of thinking is that, yes, the speaker on your phone, the speaker on your smart devices, that are already listening to you anyway, gets repurposed for some political agenda. If it’s not from the government, it’s from the woke or some ideologically deranged staff at some of these companies, and they repurpose it for their own ends.

Again, I don’t think people really think this through. When you buy an Alexa and you put it in your home, it is listening to you constantly. Now, the companies are like, “Oh, we don’t use this ever. We delete it,” but there’s been example after example after example that they don’t. They use it for their own internal market research or for targeted advertising. But my point is this data exists, and there are no parameters around what these companies can and can’t do with it. As we’ve already seen on numerous occasions, when you get that signaling from the White House, when you get that signaling from any other political leader, all bets are off as to what these companies are then going to do.

Inez Stepman:

What do you tell the person that thinks we’re crazy? What do you tell the person who’s listening to this and says, “You guys, I know that there are some problems with big tech. I know that, for example, they censored the New York Post story about Hunter Biden that subsequently has been confirmed by other major news outlets”? We’ve seen some of the data itself. I think it’s hard to deny that there’s some level of problem, but what do you say to the person who just listened to what we just said and said, “Guys, what are you talking about? Blackmail? Your phone listening to you? Are you guys going tin foil hat on us?”

What do you tell somebody who doesn’t see these incidents as connected? Fundamentally, that’s what we’re talking about. We’re talking about pointing to a pattern of incidents and saying, “This is representative of something broader and deeper in the way that power and decision-making in the public sphere is being changed, and not for the better.” How do you convince the skeptic that, actually, yes, your phone blackmailing you is something that we should be worried about in the immediate future?

Rachel Bovard:

Yeah. I think part of the problem is how fast this technology has developed. You and I are, I think, maybe of the last generation, the elder millennials, who bridge the divide between the world we live in now, which is just completely digitally connected, tech is everywhere, and how we grew up, which was a very analog way of growing up. We remember VHS, and we remember growing up before iPhones were ubiquitous and all these things. I think that’s given us a lot of benefit, but it’s also happened so quickly that I think people aren’t aware of the full scope of how much we’ve turned over to these companies.

The first thing to do, I would urge people, is to sit back and reflect on how digitally connected your life is, because people just don’t think about it. It’s rote. You pick up your phone; you call an Uber. You tell Siri to tell you the weather. You log into your Gmail and you are connected to all of the Google suite. Your Apple Watch connects to your phone, which connects to your computer.

Just think about how all these things interact. That is all driven by your behavior. Where you are at any given time, 10 of your devices know. When you start to think about it in those terms, just imagine how much of your data is already out there. The minute you click Buy Now on Amazon, a whole profile is created of what you’re doing, what you’re buying, designed to target you with ads, obviously; but to do that, they’re making assumptions about what your life is like based on an algorithm. You’ve heard the famous example that Facebook or Google knows a woman is pregnant before she’s told anyone, based on her search history. All of these things go to create a tapestry about you. You’ve turned over all this information.

Now also sit back and think about the fact that, 10 years ago, would you have seen PayPal and Venmo cut off users because they don’t like what those users do offline? Would you have seen a world in which GoFundMe wouldn’t fundraise for certain causes it doesn’t like, or for people going to trial, not found guilty, just people on trial for speaking or doing or saying things that these companies don’t like? Would you have imagined an instance in which a bank would cut off a sitting president of the United States from his banking services because he’s fallen out of favor, because he’s deemed the worst president that ever existed?

When you combine these two things, that’s what gives me pause to say yes, I actually can see a world in which this happens, because it’s that fusion between everything that exists about you online versus how companies want to respond to you offline. When you combine those two things, it’s a brave new world. I don’t think that people have really woken up to this because it’s been like that bird on the back of the rhinoceros. The huge animal isn’t moving, and so the bird feels very comfortable. Then suddenly it moves, and the bird’s whole world is rocked. That’s where I think we are with technology.

This, I think, was really apparent during COVID, by the way. If you remember the technology that these companies put out immediately: within a couple of weeks of the pandemic being declared, Google said, “Oh, by the way, government, here’s all of our location data. It’s aggregated, so you can’t see individual movements, but here’s all of where people have been. So you can make assumptions. Are people following the lockdowns, or are you seeing too much movement here at the grocery store in this specific town?”

COVID made it very apparent, I think, what these companies can turn over or make public should they choose to. When you think about those two things together, it’s not a far leap to say yes, your illicit gathering to talk about, I don’t know, pro-life policy, or whatever it is that day that’s upsetting people, is the next thing on the list.

Inez Stepman:

In fact, you wrote about an example of that, or you tweeted about it, I can’t remember. Eventbrite, which is literally just a way to RSVP for in-person events, took down an event that was, I can’t remember exactly, but it was something COVID-related. It was about the vaccines or something that went against the narrative. They actually got rid of an in-person event. Now, of course, you can still meet up in person, but then factor in some of the things that you just said about your phone tracking you and it starts to get truly dystopian very, very quickly.

Rachel Bovard:

Yeah. It was an event by The American Conservative magazine. The event was about crony capitalism, about how the vaccines were developed and big pharma and who’s making money from the vaccines, but the speakers were Peter McCullough and someone else deemed problematic by the establishment for talking about vaccines. I’m blanking on his name, but he was famously on a Joe Rogan podcast episode that got banned too. And the third speaker was Senator Ron Johnson.

Inez Stepman:

It was Malone, right?

Rachel Bovard:

Malone.

Inez Stepman:

Dr. Malone.

Rachel Bovard:

Robert Malone. Correct. Yeah, Peter McCullough, Robert Malone, Senator Ron Johnson talking about the development of the vaccines, who made money, essentially, from these vaccines. Eventbrite was like, “Yeah, you can’t RSVP for this event on our site.” It’s just like-

Inez Stepman:

Think about that for a moment. A sitting senator. An event with a sitting senator, elected by American voters, is not acceptable to organize or advertise online. I know the Claremont Institute has also had that problem, that they have had the ads for their events taken down, but Eventbrite is one step more. That’s what the service does. It sends out email invites to people that you want-

Rachel Bovard:

Well, this is what I think is the crux of the issue for a lot of people on the right, because they’re like, “Oh, these companies, they’ll never go too far because they’ll stop making money. The profit motive will always push these companies to not do this.” At the end of the day, to the point that you just made, this is what Eventbrite does. This is how it makes money. It brings users to its site to RSVP for events. You’re now seeing these companies almost act against their own financial interest to stand up in an ideological way.

That, I think, is what makes this problem so different than anything we’ve seen before, is that you’ve seen these companies turn against their own users to push an ideological agenda. That is something that we on the right never thought could happen. We always thought the profit motive would trump ideology. We always thought that the profit motive would enforce a certain amount of neutrality on the public square. But for whatever reason, these companies, they’re not beholden to it.

You can make arguments for why. I make an argument that it’s because they’re so large. This doesn’t apply to all companies, probably not Eventbrite. But when you look at companies like Google and Facebook and Amazon and Apple, they’re so massive that they can act against their own users because they know their users have nowhere else to go. That’s part of the definition of being a monopoly. I think it goes back to how these companies now control the marketplace ideologically in a way that’s unique and different. There’s many, many examples of this happening.

By the way, going back to Ron Johnson, I think he may be the most censored senator in the United States Senate today. He’s held hearings, hearings that he ran in his capacity, I believe, as ranking member of the Senate Homeland Security and Governmental Affairs Committee, and sworn Senate testimony was stripped from YouTube for violating its vaccine policies.

When I pointed this out on Twitter, everyone was like, “Oh, whatever. You can go to the committee’s website. You can still access this information.” That’s not the point to me. The point to me is that YouTube, far and away, is where people access this information more than anywhere else. The amount of people that actually go to a Senate committee website to watch a hearing is minuscule compared to the people that access it on YouTube.

YouTube was like, “Oh, what’s going on in your government, what’s going on in the self-government of the United States, we’ve deemed it too dangerous for you to see. I don’t care that you elected these people. We are in charge of that information, and we are going to strip it from the site.” That’s happened to Ron Johnson, I think at least two or three times at this point.

Inez Stepman:

Yeah, and it’s happened to Trump as well, because just airing the footage of his speeches, particularly around election stuff, there have been-

Rachel Bovard:

Just airing pure footage has gotten people kicked off. Yeah.

Inez Stepman:

Just putting up the video of somebody who, probably in a matter of months, will be running for president… That’ll really put a finer point on this. If a candidate for office is being so censored by these tech companies that it’s impossible for him to get his message out as a candidate, and it were happening in any other context, we would immediately point to that and say, “Oh, this is actually a threat to democracy,” the way we so often do with other things.

Rachel Bovard:

Yeah. That was the issue with the Hunter Biden story: it was censoring information critical of the son of a candidate running for president. There were many aspects to that story, the element of the free press, their story not going out, all these things. But the one direct hinge to democracy was the fact that you can make an argument that the story not circulating protected Joe Biden on the campaign trail. By suppressing information critical of a member of his family, you were denying voters access to information about a candidate for office.

The same is true, by the way, of how political candidates use these platforms because digital ads on social media are almost critical campaign infrastructure for so many candidates at this point. This was especially true of the Trump campaign. They really leveraged social media in a way I don’t think that any other candidate had been as effective at doing, and it’s now a roadmap for how a lot of candidates run their campaigns.

You’re seeing Facebook, YouTube, all these platforms censor political campaign ads as violating their speech rules, not just for candidates but also for causes. American Principles Project ran a couple of ads in different races on the transgender issue in sports saying, “Don’t let transgender athletes into women’s sports.” Their ads were completely banned from the platforms.

Yeah, I do think it has a hinge to democracy. I do think you are linking it to the democratic process when you’re saying speech, political speech, gets cut off from these platforms. That, of course, has a ripple effect on the purity of our elections when these platforms are so involved.

Inez Stepman:

I wanted to ask about… You say part of the reason, of course, that they can ignore the pushback of their customers is their size, at least for some of these companies. The other potential reason, of course, is that their customer base is increasingly outside of the United States. Increasingly, their real clients are either government entities inside the United States or markets outside of it, particularly Chinese markets.

They have to protect access in Chinese markets at all costs, and that is a financial interest for them. I think everybody at this point has noticed that there is a huge hypocritical disconnect between the cultural positions that these companies so strongly espouse at home and their willingness to do, not just business, but to open up all their intellectual property, to sell the family jewels off, to get access to Chinese markets.

At this point, the Chinese are actually trying to do something that we should have been trying to do for a long time, which is decouple themselves a little bit. I know at least, for example, in Hollywood, they’re starting to severely limit the number of films that can have access to the Chinese market. They may start to do that with other things. Doing business in China is becoming more and more difficult.

Do you think that this is going to make it possible for the U.S. to really incentivize our companies, and I’m talking about both negative and positive incentives here, to work at home and to work within the confines, whether legal or cultural, of the American way, which, among other things, would mean respecting something like freedom of speech? Do you think that this is a possibility? One of the big problems when we talk about heavily regulating this industry, for example, is that if we are to go into a cold war with China, we do need the products and the innovation of a large part of the tech space. It is a big asset for the United States, but only if it’s actually an asset for the United States.

Rachel Bovard:

Right.

Inez Stepman:

How do we think about the carrot and the stick at really, essentially, recalling some of these great American businesses who are now international businesses more than they are American?

Rachel Bovard:

Yeah, this is a big fear of mine, and something I think we really haven’t grappled with: how much these companies have hedged their bets with China because it’s where they want to make money. From a national security perspective, I find this terrifying, because one area of this that I’ve written some about is the race for artificial intelligence. Whoever wins the race for AI rules the world. We are in a race with China to dominate it, to get there first.

The problem is that the companies that we would rely on to do all of this work, to do all of this innovating, are companies like Google, are companies like Apple to some extent, and those companies are very much embedded in China and very happy to do whatever it is they have to do to make their bed in Beijing because, again, the billions of eyeballs that exist there amount to billions of dollars for these companies.

You look at a company like Apple. It just came out, gosh, maybe like six months to a year ago, the MOU they signed five years ago with China saying, okay, if China is going to allow them to do business, these are all the things that Apple is going to agree to, and one of them was to help the Chinese government develop superior technology, which is, at this point, tangibly and materially aiding our biggest geopolitical adversary.

You look at a company like Google, which, again, has some of the best minds in the business working on artificial intelligence, and yet where do they have an AI office? They have an AI office in Beijing. They are doing AI research in partnership with the Chinese. Here at home, they’re also involved in a woke struggle session over a researcher named Timnit Gebru, whom they fired for any number of reasons. The reasons are disputed, depending on who you talk to, but it was about her being a Black woman in tech. There’s still massive fallout over these race and culture issues that are stymying Google’s research on AI.

That’s the tip of the spear for our AI research here in America: Google, which, one, has made its bed in China, and two, can’t get over its woke issues. This is how we’re going to win? This is of concern.

I do think we have to figure out how we bring these companies back, how we say, “No, look, at some point, you have to draw a line. For all your talk about free speech here in America, we know that tech products made in America or by American companies have been used by the Chinese government to suppress the Uyghurs, to suppress their own people.” The element of hypocrisy is almost too large to fathom, I think.

The biggest concern for me is that we have American tech aiding and abetting, again, China, which I believe to be our biggest geopolitical adversary. That’s something that we don’t have control over at this point. If Republicans take back the House or the Senate, I would love for one of their oversight or investigatory hearings to ask which of our American tech companies have shown their source code to China. How much does China actually know about our tech infrastructure based on what these companies have given them? That seems like something that we should know, a very baseline thing that we should know, and soon.

Inez Stepman:

Far be it from me to introduce a note of optimism into all of this, but what about the revolt of the competent? Take the incident you just pointed to about the woke struggle sessions within these AI teams. At what point do these companies pass over or fire or generally disadvantage competent, smart people who are on the leading edge of whatever tech development they’re doing, a lot of them white and Asian males, for example, who are being told that they are the problem and who aren’t getting their next promotion because the company is promoting on the basis of, essentially, diversity quotas?

At what point does the competent engineer in the middle of this, who by no means, don’t get me wrong, by no means do I think that that person necessarily agrees with you and I about culture, politics, but at some point it’s just, “I would like to do the work…” I think that does come through, just to loop it back to Elon Musk for a minute, that does come through in the way that Elon Musk talks about these things. He’s frustrated about it because it’s going to take way longer to reach Mars if he can’t promote on merit.

Rachel Bovard:

Right.

Inez Stepman:

At some point, that does put some kind of hard wall on this, right? There should be some critical mass of people who just want to do this kind of work, and it seems like there would be new companies that would spring up to take them. Maybe this market function won’t work either, and it won’t work the way that we predicted, in the same way that the consumer check on these companies has not worked the way that we predicted. I’ve got to think, at some point, there’s a critical mass of the competent, no?

Rachel Bovard:

I think that’s probably right. I do wonder if the pendulum on these types of issues is starting to swing a little bit. If you remember, Netflix, which I guess is a tech company in and of itself, had a couple of activist employees who, I think, launched a protest against a Dave Chappelle special that was being streamed on Netflix, and Netflix fired them. They were like, “We don’t need to deal with this. Bye.”

I think if you look at a handful of these companies, so much of the Sturm und Drang, if you will, is driven by a handful of employees, and they have successfully cowed so much of management for the last several years. I’m wondering if, to the point that you’re making, it’s now becoming a drag on competition, like, “We are not able to get our deliverables done, we are not able to make our contracts, because these six employees are protesting whatever the issue du jour is, so we’re not going to put up with this anymore.” That is the only thing I can think of that solves the HR crisis that’s happening in a lot of these companies, but it relies on having tough leadership, and that’s something we can’t control. It has to come from the top of these companies. At Google, as far as I can tell, and at a lot of the leading tech companies anyway, you haven’t seen that happen.

Inez Stepman:

Yeah. One of the things I’m learning, I think, that I would not have predicted from pure ideological grounds is how quickly people become used to a suboptimal or incompetent system, not just with regard to deliverables. Maybe the new company culture just becomes like the government, where nothing is on time or under budget. It seems like the cause and the consequence are attenuated enough that there’s always some other reason for it, or at least some other reason given.

I was thinking about the problem of… The way that I’ve put it to other guests on this podcast has been the problem of planes falling out of the sky. Eventually, if you hire pilots on the basis of quotas and not on merit, you will have more accidents. That being said, major airlines have only one or two accidents a year, maximum (I know smaller planes go down more frequently), and only a small fraction of those are caused by pilot error. You can easily imagine a company over the course of years, or even decades, not really making the connection, especially when everything around them, the narrative and everything, is pushing them not to make that connection, making that connection forbidden to say. You can easily imagine that this could drag on for quite some time before that inevitable consequence of hiring not on merit starts to catch up.

So many people just have gotten used to… I think COVID has been really instrumental in this. People have just gotten used to a worse life in America. That has really shocked me, honestly, coming from the background that I do, with my parents coming from the former Eastern Bloc. I did not think that the American middle class, for example, would tolerate the kind of supply chain shortages, the lack of customer service, the fact that practically every flight I take now is delayed or canceled. You can’t count on the airline industry to get you from point A to point B. It was unpleasant in 2019, but nothing like this.

Rachel Bovard:

The fact that I have to wait until October to get a new oven, yeah, it’s shocking.

Inez Stepman:

I thought there would be an immediate revolt against this stuff. Maybe I just haven’t seen it because of where I live in New York or something, but I haven’t seen that kind of huge anger. Inflation, yes, because that’s directly, “Can I pay for groceries?” But the fact that we now go to the grocery store and we’re just like, “Oh, they’re out of that,” even I’ve caught myself being like, “Oh, they’re out of steak today.” I’m like, “When did I get used to this?”

Rachel Bovard:

I think your observation about COVID enforcing that is a really good one, because had it happened all at once, had there just been a tear in the universe and suddenly we woke up dealing with the supply chain crisis, it would be a lot more dramatic. I think people’s outrage would be a lot more in the open. But because it was such a slow drip with COVID, and because it was like, “Oh, well, there’s a pandemic, and we just have to get used to dealing with this,” that just became our new way of living, and now we just put up with it. It is shocking. I don’t know. It’s allowed companies to just get away with things.

If you notice, people are still using the COVID excuse. Companies that have just nothing to do with COVID, they’re like, “Oh, well, you’re not getting this because COVID” or “Our customer service is terrible because COVID” or “You’re not going to get a response for two weeks because COVID.” It’s like, “You’re just using COVID at this point for your own incompetence. There actually isn’t a COVID reason for this to be happening.”

There’s been no equivalent yet, I think, of a French yellow vest response, if you remember a couple years ago when Macron raised gas prices and you saw the working class revolt. There’s been no similar response. I think we’ve just been putting up with it. It’s very British, actually. We’re very stoic about these things now. I never would’ve expected that from Americans.

Inez Stepman:

Yeah. I really wouldn’t have expected that from Americans just culturally. Americans invented “the customer is always right.” There’s a vastly different consumer culture in Europe than there has been in America. Yeah, that has really shocked and surprised me, that Americans have put up with these kinds of things.

To end on a traditionally depressing note, to the point you made earlier about us being the last generation to remember the bridge into the digital age, remembering before everybody had a computer in their pocket 24/7, that makes me wonder if we’re also the last generation that will remember American abundance at that level, and if we’ll be told, perhaps by tech companies, that it’s disinformation to say it once existed. I remember it; you can’t tell me it didn’t happen. I once expected that everything would be in stock in the grocery store. I remember that was the case in the United States of America.

Rachel Bovard:

Hey, because we grew up in the ’90s, which I still maintain is the best decade. I loved the ’90s. I love growing up in the ’90s.

But to your point, yes, these companies have the power to rewrite history. People may say, “Oh, well, there have always been people trying to rewrite history. How is this any different?” Well, it’s different because of the scale at which these companies exist. It’s different because Google is on the backend of everything you do, because Facebook is where over half of people get their news, because Twitter drives the national conversation. We’ve never seen companies with this kind of unprecedented power. If they wanted to rewrite history, if they wanted to stop you from speaking in the public square, they effectively have the tools to do it.

That’s the problem that we’re facing right now, and we don’t yet have a policy response. You can look at antitrust. You can look at breaking up these companies. You can look at Section 230 and trying to make them more responsible for how they police speech online. But until we come to some kind of national agreement that the power these companies have should not reside in two or three companies at a scale that can literally distort and twist and manipulate the public consciousness, we aren’t going to solve this problem.

It’s not as simple as saying, “I’m not going to use Google,” because you’re using it wherever you are. It’s not as simple as saying, “I’m not going to engage in the digital marketplace,” because how else do you live in the world? We’ve entered a new world, essentially, a digital age in which we have no rules. That’s been great for innovation, and we want to keep certain elements of that, but we have to police it going forward, because that’s what the social contract is all about. We don’t sit here and let the tech companies reshape our entire universe in their image. We set the rules. We’ve entered the democratic lawmaking phase of this debate, and we have to get it right, because our lives are being fundamentally transformed at a scale and speed and scope that I don’t think anyone is quite yet aware of.

Inez Stepman:

Yeah. What I’m hearing you say is our digital age is going to lead to one of two things. It’s going to lead to a new democratic age or it’s going to lead to tyranny. That’s really the choice we’re facing today. Rachel Bovard, thank you.

Oh, go ahead.

Rachel Bovard:

No, the means are there for both. The means are there for more and enhanced democracy, or the means are there for tyranny. I think that’s why this moment is so important.

Inez Stepman:

Well, on that note, Rachel Bovard, thank you so much for joining High Noon. You can find Rachel’s work over at CPI. You can also find her on Twitter. It’s @rachelbovard?

Rachel Bovard:

Mm-hmm (affirmative).

Inez Stepman:

Yes, it is @rachelbovard. You can find her work, again, in her tech columnist position over at The Federalist, as well as writing for American Mind, New York Post, and many other illustrious outlets. Rachel, thank you so much for coming on High Noon.

Rachel Bovard:

Thanks, Inez.

Inez Stepman:

And thank you to our listeners. High Noon with Inez Stepman is a production of the Independent Women’s Forum. As always, you can send comments and questions to [email protected]. Please help us out by hitting the subscribe button and leaving us a comment and review on all of those tech companies we were just talking about, Apple Podcasts, Acast, Google Play, YouTube, or iwf.org. Be brave, and we’ll see you next time on High Noon.