00:00
Is TikTok over, like, over over? TikTok has nine months to sell
00:04
or else it's not allowed in the United States.
00:06
But why do they have to sell?
00:07
Well, one of the reasons is privacy concerns.
00:09
And today we actually have Emily Baker-White,
00:13
a tech reporter for Forbes who has a very interesting relationship with
00:17
ByteDance, the parent company of TikTok.
00:21
Emily, hey, it's great to be here.
00:25
So we'll just jump into it.
00:28
So Emily, what was your
00:30
experience with ByteDance? So I've been reporting on TikTok and
00:34
ByteDance for several years now, and I ended up reporting a big
00:38
leak in 2022 that showed that lots of people in China
00:43
had access to US user data.
00:45
So things like our email addresses,
00:47
our private messages, our private videos,
00:49
et cetera, were accessible in China.
00:51
And that mattered because in China,
00:54
there isn't due process and the government can just come and ask
00:57
any employee to turn over all the data and they have to
01:00
do it, right? There's no way that they can meaningfully challenge
01:04
that. And so the US government was concerned that the Chinese
01:08
government could be using TikTok to capture data. I wrote this
01:12
big story. The company got really anxious about it and
01:16
a number of employees,
01:17
some of whom were in China, tried to figure out who had
01:21
leaked the information to me.
01:23
They used the TikTok app to surveil me, to try to
01:28
see where I was physically and then to try to compare that
01:31
information with information about where TikTok and ByteDance employees were physically.
01:36
And what they were trying
01:37
to do is figure out: am I in the same coffee shop
01:39
at the same time as a ByteDance employee or a TikTok
01:42
employee? Are we in the same public park?
01:45
Are we in the same library?
01:46
Right? Because that might be an indication that those people were
01:49
talking to me. And so that caused a huge
01:55
problem for the company when it came out.
01:56
I heard about it from a source and was able to
01:58
corroborate it and publish it.
02:00
And the company later acknowledged that,
02:02
yeah, this happened.
02:03
They fired the people involved, but it was a really bad
02:06
look, because they had been surveilling me in part from
02:09
China while they were assuring everyone that there was no risk that
02:12
their data was accessible in China.
02:15
And so how did I feel when the bill went into
02:18
effect? I felt really nervous and anxious,
02:23
generally speaking, because we're in uncharted territory here.
02:27
The US has never banned an app before.
02:29
And TikTok and ByteDance have said they're going to challenge the
02:32
constitutionality of the bill that would require ByteDance to sell, or that
02:36
could result in a ban on TikTok.
02:38
And so we're about to be in uncharted legal territory.
02:41
This is gonna be a big case and an unpredictable case and
02:45
it's going to be a really important one to
02:46
watch. Most definitely. So would you say,
02:50
like, do you think that's why they're banning it, or is it
02:53
definitely one of the things they're afraid of?
02:55
So there are two major concerns that the US government has expressed
02:58
about TikTok. One of them is data harvesting, and frankly,
03:01
they're probably not that worried about, like, my general location information;
03:06
that's not that sensitive,
03:07
but the general location information of people who are deployed,
03:11
right? They're worried about that.
03:12
You could also think about like Chinese dissidents,
03:14
people who have stood up to the Chinese government before, could be a
03:18
sort of additional interest of the government and thus could be an
03:21
additional risk here. The Chinese government also collects a
03:25
lot of data generally speaking.
03:26
And so it's possible that they and other governments
03:30
are sort of hoovering up all of this data, thinking, what
03:32
might they want to do with it someday?
03:34
And that was expressed as a risk too.
03:35
But the sort of really important other bucket here is the idea
03:40
that the Chinese government could control what we see on TikTok.
03:44
And I know that that's a concern for a lot of lawmakers.
03:49
There was some research put out suggesting that the messages on
03:54
TikTok skewed much more pro-Palestine than the messages on other platforms.
03:58
And some people used that to say,
04:00
oh, that means that the Chinese government is interfering in what's
04:03
on TikTok. I'm not sure that's what that means.
04:06
There are a lot of young people on TikTok who have
04:09
strong feelings about Israel and Palestine, and
04:12
I think it may just be those young people expressing their opinions
04:15
and there are different sort of constituencies on every app,
04:18
right? So some lawmakers were really using that as a sign.
04:22
I'm not sure I would use that as a sign personally.
04:25
But the concern still exists that if there are people
04:29
in China who are working for this company, just doing their jobs,
04:32
you know, not doing anything nefarious,
04:34
the government could show up
04:36
at their door one day and say,
04:37
yeah, you're going to have to make this tweak to the
04:39
algorithm. And if that happened,
04:40
we wouldn't necessarily know about it.
04:43
So, TikTok has nine months to, you know,
04:47
but why do you think
04:48
they're not going down without a fight?
04:50
It sounds like, why do you think they won't just sell TikTok?
04:54
Yeah. So there are a couple of reasons.
04:56
One reason is that the Chinese government actually gets a say in
04:58
whether ByteDance is allowed to sell.
05:00
So when Trump tried to ban TikTok several years back,
05:05
he was also trying to force a sale of the app.
05:08
And when that happened,
05:08
there were meaningful sale negotiations.
05:10
And at the last minute,
05:12
the Chinese government changed their export rules and said that recommendation
05:16
algorithms are a sensitive type of asset and that you need to
05:20
get a license before you can sell them.
05:21
And the sort of writing on the wall was,
05:23
yeah, and we're not going to give you a license.
05:25
So it's possible that ByteDance could get such a license.
05:28
It does not seem likely. The Chinese government has said that it
05:30
fully opposes ByteDance divesting of this asset, and ByteDance itself
05:34
has said it doesn't want to sell;
05:36
it wants to go to court.
05:37
It says this law is unconstitutional.
05:38
Why should we be forced to sell? And think about it:
05:42
if Google or Meta were forced to sell their core tech by
05:48
a government, they'd go to the government and say,
05:51
I don't think so, too.
05:52
So I think we've got two major things going on here.
05:56
We've got a company that doesn't want to sell its core tech
05:58
and I sort of can't blame them.
06:00
And then we've also got the fact that the Chinese government could
06:02
forcibly stop them. Well,
06:05
thank you so much for your time.
06:07
This has all been wild.
06:08
I mean, I have TikTok on my phone.
06:10
Do you still have TikTok on your phone?
06:12
I do not have TikTok on my personal phone anymore.
06:15
I do have a phone through which I can access it.
06:18
OK. So you're still on the app somehow, but not
06:20
through your personal device.
06:23
Well, thank you so much,
06:25
Emily, we really appreciate you being on the show
06:28
and telling us, you know,
06:29
exactly what happened or what's going on with TikTok.
06:34
Where can people find you and keep up with all this tech
06:37
news and updates? I'm on most of the social platforms.
06:40
I'm reluctantly still on the app that was formerly known as Twitter,
06:44
but I'm also on Bluesky.
06:45
I'm on Threads. I'm checking Mastodon.
06:47
You can find my author page on Forbes,
06:51
which lists my email address.
06:52
I'm pretty easy to find.
06:53
Hey, if you know interesting things about TikTok or ByteDance,
06:56
I want to hear from you.
07:00
You can find ways to contact me, and please do.
07:02
I'd love to hear from you.
07:03
There you go. Thank you so much for your time, and
07:07
we'll be here. Thank you so much.