NHK Special: Fabricated "Truth" - The Age of Deepfakes
Transcript
00:00:00We have confirmed that important customer information has been leaked.
00:00:11We deeply apologize for the inconvenience and concerns we have caused to you.
00:00:20We are deeply sorry.
00:00:22We are sorry.
00:00:24How many people's data were leaked?
00:00:28As far as I know, there are about 200,000.
00:00:33Is there a possibility you were hacked?
00:00:35Weren't there any countermeasures?
00:00:40We had taken thorough security measures.
00:00:44Sunshine Life, the Japanese arm of an insurance company headquartered in the United States,
00:00:48announced today that data on 230,000 customers has been leaked.
00:00:53There is a risk that the leaked data will be misused,
00:00:56and policyholders are growing anxious.
00:01:04Please tell us the details of the leaked data.
00:01:06When will the results be announced?
00:01:09We will report as soon as we have progress.
00:01:11Please open the door.
00:01:12What will happen to your customers' policies?
00:01:14President, please answer!
00:01:17In recent years,
00:01:19cyber-attacks on companies
00:01:21have led to the theft of personal information.
00:01:29And recently, the misuse of deepfakes has become rampant.
00:01:40Deepfake means using AI to create images and videos that look exactly like the real thing.
00:01:49Increasingly, however, it is being used
00:01:53to impersonate and deceive people.
00:02:00According to a survey conducted last year by a security company working to prevent fraud,
00:02:05about 3 million cases of fraudulent access were detected,
00:02:10of which some 210,000 involved deepfakes.
00:02:14That is four times the previous year's figure.
00:02:19The question is, what's going to happen when deepfake spreads?
00:02:22It's already happened.
00:02:25And this year,
00:02:27the damage caused by deepfakes is predicted to double again.
00:02:35Behind this are fake videos
00:02:39that grow ever more realistic as AI advances.
00:02:45We live in a post-real society
00:02:48where you can't tell if it's real or if it's fake.
00:02:55Last year, a global company based in the UK
00:02:59was deceived by a deepfake and lost about 4 billion yen.
00:03:05This incident shook the world.
00:03:09In this program, we dramatize what is happening now,
00:03:13based on deepfake incidents
00:03:18that have actually occurred around the world.
00:03:23This is a deepfake fraud, isn't it?
00:03:27This story is a fact-based drama of the near future
00:03:30depicting the age of deepfakes.
00:03:33DEEPFAKE
00:03:43Remy!
00:03:46Remy!
00:03:47Breakfast is ready!
00:03:52I'm going to be late on the first day.
00:03:54Remy!
00:03:56Hey!
00:04:00Breakfast!
00:04:04Good morning.
00:04:05Hurry up and eat.
00:04:06I don't want it.
00:04:07What?
00:04:19Hurry up.
00:04:20Hurry up and eat.
00:04:22Just eat a little bit.
00:04:33Is that e-mail really safe?
00:04:35Be careful of links posted in e-mails.
00:04:38Stop fraud.
00:04:40Don't be deceived.
00:04:42I'm Mirei Kusanagi.
00:04:44Meow meow.
00:04:53Good morning.
00:04:54Good morning.
00:04:55Good morning.
00:04:59You can drink coffee here.
00:05:03Good morning.
00:05:04Good morning.
00:05:07The main office is over there.
00:05:13This is the main office.
00:05:16There are about 50 staff members.
00:05:18That's impressive for a venture only three years old.
00:05:22Good morning, everyone.
00:05:24Good morning.
00:05:25If you have time, please come to the hall.
00:05:30This is Akira Sato, a new employee.
00:05:35He used to work in sales at a bank.
00:05:37Please help him out until he gets used to things.
00:05:40Sato-kun.
00:05:43I'm Akira Sato.
00:05:48I strongly identified with this company's mission of solving people's problems with the power of AI.
00:05:59That's why I changed jobs.
00:06:01I'm looking forward to working with you.
00:06:06Shinji-san.
00:06:08Please answer.
00:06:13Rimei Kusanagi.
00:06:16Fukayama.
00:06:17Stop the deepfake.
00:06:19Yes.
00:06:20I'm sorry.
00:06:24Nice to meet you.
00:06:28Ask the chief, Tsubokura.
00:06:32Tsubokura.
00:06:33Yes.
00:06:38Nice to meet you.
00:06:47Good morning.
00:06:48Good morning.
00:06:52I'll teach you how to make videos with AI.
00:06:55Please get used to it.
00:06:57Okay.
00:06:59First, find posts from people asking for help on social media.
00:07:07For example...
00:07:15This is good.
00:07:17Someone's looking for a lost pet.
00:07:20Sato-san, I'd like you to spread this post.
00:07:24Sorry, how?
00:07:27We'll use AI to turn it into a video and re-post it.
00:07:33I'll use an app called Gai.
00:07:36Gai?
00:07:37Yes. It's our original app.
00:07:42"Fetch the ball and bring it back."
00:07:49You type in instructions for how the photo should move.
00:07:54And then...
00:08:03Oh, it moved.
00:08:06It's good to have a video like this.
00:08:15I'll re-post this.
00:08:25That's amazing.
00:08:27If we get more views, we'll get more ad revenue.
00:08:31It's a bit scary, but...
00:08:34First, we want you to get the hang of making videos.
00:08:42It's a fake, right?
00:08:44That's not the point.
00:08:46The point is how to get attention.
00:08:48And if it spreads, the pet's owner will be happy.
00:08:53I see.
00:08:56Good luck.
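A minimal sketch of the workflow just demonstrated: take the still photo from a help-seeking post, give a model a short text instruction for how the photo should move, and re-post the generated clip. Gai is the drama's fictional in-house app, so every name below is hypothetical and the model call is a stub.

```python
# Sketch of the "Gai" workflow from the scene above. Gai is fictional, so
# animate_photo() is a stub standing in for a real image-to-video model,
# and repost() stands in for a social-media API call.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    photo: str  # path or URL of the attached still image

def animate_photo(photo: str, instruction: str) -> str:
    """Hypothetical image-to-video call; returns the path of the rendered clip."""
    print(f"animating {photo!r} with instruction: {instruction!r}")
    return photo.rsplit(".", 1)[0] + "_animated.mp4"

def repost(original: Post, video: str) -> None:
    """Stand-in for re-posting the generated clip with credit to the original."""
    print(f"re-posting {video} (original post by {original.author})")

post = Post(author="@lost_pet_owner", text="Please help find my dog", photo="dog.jpg")
clip = animate_photo(post.photo, "fetch the ball and bring it back")
repost(post, clip)
```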
00:09:00Good work.
00:09:13Are you okay?
00:09:16Yes.
00:09:18It's easy once you get the hang of it.
00:09:22I didn't know you could do this.
00:09:26He's an expert in AI.
00:09:29His work on that last campaign was amazing.
00:09:32Oh, the parent-child reunion?
00:09:35Yes.
00:09:36It was amazing. Do you want to see it?
00:09:41Go ahead.
00:09:53What do you think?
00:10:03Is this AI, too?
00:10:06The mother and the background are all AI.
00:10:09I put a link to the fundraiser in this video.
00:10:12The response was amazing.
00:10:14I think we got more than 100 million views.
00:10:16That many?
00:10:18The supporters were very happy.
00:10:21Mr. Tsubokura, you make these things so easily.
00:10:33I think it's this one.
00:10:37The dog from earlier?
00:10:40Oh, I found it.
00:10:43That was fast.
00:10:44I'm glad.
00:10:46It's like this.
00:10:48Bye.
00:10:59I'm home.
00:11:01Welcome home.
00:11:05Oh, Yoko.
00:11:07I'm sorry to bother you.
00:11:09I came to see the new house.
00:11:12Welcome. Is this curry?
00:11:14It smells good.
00:11:16It's my sister's personal belongings.
00:11:18I see.
00:11:20Thank you for coming.
00:11:22You're welcome.
00:11:31I was surprised that you got a new job.
00:11:35How was it?
00:11:40It's a company where AI helps people.
00:11:46They use deepfakes.
00:11:49Deepfake?
00:11:51It's using AI to create fake images and videos.
00:11:56You know a lot.
00:11:58We learned about it in class today.
00:12:01I didn't know they taught that.
00:12:04It looks good.
00:12:06I wonder if it's good.
00:12:09I'm going to eat in my room.
00:12:19She's been like this since Manami died.
00:12:25Lunch?
00:12:29I'm going to work from tomorrow.
00:12:34Did something happen at work?
00:12:37No.
00:12:41Tell me if something happens.
00:12:44I'm sorry. I can't tell you now.
00:12:49You can't tell me?
00:12:52It's okay.
00:12:58Okay.
00:13:08I'm home.
00:13:14Manami?
00:13:24Manami?
00:13:40Manami!
00:13:41Manami!
00:13:43Manami!
00:13:48Manami!
00:13:56Dad.
00:13:59This is Mom's company, isn't it?
00:14:10Yoko!
00:14:13Yoko!
00:14:15I'm sorry.
00:14:17I saw the news.
00:14:20Was Manami involved in the customer-information leak?
00:14:26That hasn't been made public yet.
00:14:29What?
00:14:31I don't know when and how the information was leaked.
00:14:35Really?
00:14:37I can't think of any other reason.
00:14:41It's been four months since then.
00:14:44I still don't know why the customer information was leaked.
00:14:51Diplex?
00:14:53Yes.
00:14:56Manami came here twice in the three days before she died.
00:15:06I'm sure something happened to her.
00:15:09Is that why she got a new job?
00:15:13Manami's name wasn't in the customer information.
00:15:17I'll keep looking into it.
00:15:20Yoko, let me know if you find anything.
00:15:25Is this Manami's personal computer?
00:15:28Maybe there's a clue.
00:15:30No.
00:15:33It's locked with face recognition.
00:15:35It won't unlock unless you face it straight on.
00:15:39What about using a photo?
00:15:41I tried it.
00:15:43It didn't work.
00:15:44I see.
00:15:52Is this your new department?
00:15:54Yes. My training period is over.
00:15:57I need to talk to you about something.
00:15:59Okay.
00:16:08Good morning.
00:16:11I was worried about you.
00:16:12I'm sorry.
00:16:14I'm sorry, Mom.
00:16:17I had a cold and was in the hospital.
00:16:22I know.
00:16:24But I'm fine now, so don't worry.
00:16:27Mr. Tsubokura.
00:16:29Her voice.
00:16:31I changed it to a Kansai dialect voice.
00:16:34And her face.
00:16:37Look at this.
00:16:43What?
00:16:45Deepfake?
00:16:47This man is her son.
00:16:51And this is her granddaughter.
00:16:53So the granddaughter is pretending to be her son?
00:17:00Her son died in a car accident two weeks ago.
00:17:05But her granddaughter couldn't tell her.
00:17:08Why?
00:17:09Why?
00:17:11She has a heart disease.
00:17:15If she learned her son had died,
00:17:18the shock might be too much for her.
00:17:20But if she can see her son's face,
00:17:24she'll be relieved.
00:17:26Come in.
00:17:30What did you have for lunch?
00:17:34What about you?
00:17:35How about you?
00:17:37I'm in the hospital.
00:17:39It's comfortable.
00:17:42But I'm lonely.
00:17:46Can't you come here?
00:17:49I don't think I'll live that much longer.
00:17:54Think of it as my last wish.
00:17:57Come here.
00:18:02What's wrong, Yosuke?
00:18:04Are you crying?
00:18:07What's this?
00:18:11I have an important job.
00:18:14Wait a little longer.
00:18:17Okay.
00:18:19I'm sorry for being selfish.
00:18:22I'll take you to Suzuka.
00:18:29I'm looking forward to it.
00:18:35Take care.
00:18:42Kimie, it's time to go.
00:18:48Are you pulling my hair?
00:18:51No.
00:19:03Excuse me.
00:19:06Are you okay?
00:19:11Your grandmother was happy.
00:19:13I'm glad.
00:19:15But I can't lie forever.
00:19:21Come to me anytime.
00:19:26Let's go.
00:19:33Excuse me.
00:19:40How was it?
00:19:44I understand the situation.
00:19:46But you lied to your grandmother.
00:19:49Are you still on about that?
00:19:52Deepfakes can help people.
00:19:55You won't last here thinking like that.
00:20:03The story of a granddaughter consoling her grandmother, Kimie, with a deepfake of her dead father is based on events that actually happened in China.
00:20:18I'm Kimie.
00:20:20I live in Beijing.
00:20:27Now, in online shops in China, services that revive dead relatives with AI are becoming popular.
00:20:40There are companies all over the world that use this deepfake technology for business.
00:20:45At this company in the UK, AI learns real people's faces and voices to create avatars on a computer.
00:21:16The avatars are mainly used for video introductions and internet ads.
00:21:25You type in the words you want the avatar to speak.
00:21:30The text can be converted into 130 languages.
00:21:36Hi, I'm AI avatar of Alex Wojka, and I'm so excited to be here with you today.
00:21:46I was created using generative AI in the studio a few months ago.
00:21:54That's pretty good Japanese.
00:21:56Very good Japanese.
00:21:59I've practiced a lot.
00:22:04Technology that lets the avatar speak in real time is expected to be completed this year.
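The pipeline just described, where a script typed as text is spoken by an avatar learned from a real person in any of about 130 languages, boils down to a single render request. The program does not name the company's API, so the endpoint and field names below are hypothetical, shown only to illustrate the shape of such a request.

```python
# Hypothetical avatar-render request matching the segment's description.
# No real vendor API is implied; this only shows what a client would send.

import json

request_payload = {
    "avatar_id": "studio-capture-001",   # avatar learned from a real person
    "script": "Hi, I'm so excited to be here with you today.",
    "target_language": "ja",             # text is translated, then spoken
    "output": {"format": "mp4", "resolution": "1080p"},
}

print(json.dumps(request_payload, indent=2))
```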
00:22:12There is no legal regulation against this technology in Japan.
00:22:18The tremendous evolution of AI is beginning to shake our sense of ethics.
00:22:30A prompt is the instruction or question you give to an AI.
00:22:38The clearer and more concrete the prompt, the more likely the AI is to return the right answer, or to generate what you have in mind.
00:22:49You can even have the AI write prompts for you.
00:22:52But first, it is better to think through the wording yourself.
00:22:58The AI responds to exactly what you say.
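The advice above can be made concrete with a small sketch. The model call is a stub standing in for any generative AI; the substance is the difference between a vague prompt and one that pins down the details.

```python
# Toy illustration of prompt clarity. send_to_model() is a stub, not a real API.

def send_to_model(prompt: str) -> str:
    """Stand-in for a call to any text- or video-generation model."""
    return f"[model output for: {prompt}]"

vague = "Make a video of a dog."

# The concrete prompt fixes subject, action, setting, duration and framing,
# so the output is far more likely to match what you imagined.
concrete = (
    "Make a 5-second video of a small brown dog fetching a red ball "
    "in a sunlit park, camera at eye level."
)

for prompt in (vague, concrete):
    print(send_to_model(prompt))
```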
00:23:04What's wrong, Sato?
00:23:08It's nothing.
00:23:12Are you sure?
00:23:14Let's practice typing prompts.
00:23:40Hey, listen.
00:23:44Hey.
00:23:50Did you say something?
00:23:52I didn't say anything.
00:24:01What's wrong with her?
00:25:37I'm home.
00:25:39Konomi.
00:25:43Happy birthday!
00:25:46You scared me!
00:25:47What?
00:25:49You scared me even more.
00:25:51What?
00:25:53I made a cake!
00:25:57Wow!
00:26:04Dad, how is it? Is it good?
00:26:06It's the best!
00:26:09I'm glad.
00:26:12Mom, your eyes are heart-shaped.
00:26:15I'm so happy.
00:26:21I'm so proud of you.
00:26:46How have you been?
00:26:52Akira.
00:26:58Why are you silent?
00:27:01Can you hear me?
00:27:06Akira.
00:27:10Mom.
00:27:11Mom.
00:27:24Remi, how have you been?
00:27:28Mom.
00:27:32Mom, why did you die?
00:27:42Remi.
00:27:45Are you okay?
00:27:46Cheer up.
00:27:49Everyone makes mistakes.
00:27:53Look at me.
00:27:54You always make mistakes.
00:27:58Remi, don't laugh.
00:28:05You're nothing like Mom.
00:28:07What?
00:28:10Your mouth is moving strangely.
00:28:15What?
00:28:23When something displeases him, my husband raises his hand at me.
00:28:36When he drinks, he can't control himself.
00:28:43Do you have any evidence?
00:28:46He tried to hide it, but I found it.
00:28:54I'm going to divorce soon.
00:28:57Can I get a divorce certificate?
00:29:01Do you have any recent videos of him?
00:29:05Any social media?
00:29:08There's a video of him drinking with his friends.
00:29:14You're stupid.
00:29:18Putting your face out on social media is suicidal.
00:29:27I'm sorry.
00:29:36What's going on?
00:29:41Don't cry.
00:29:45I'm sorry.
00:29:46You don't look like you're sad.
00:29:52Really?
00:29:54I'm not good at expressing my feelings.
00:30:09This is a video that became a hot topic on SNS two years ago.
00:30:18Then, in just one year, AI was able to create a video that looked exactly like the real thing.
00:30:31AI development began with text generation, and AI is now used in fields such as medicine and education.
00:30:40Video generation is expected to be the most active area from now on.
00:30:53This company in the United States independently develops video-generating AI.
00:31:00Their goal is high-quality video like a Hollywood movie, made on a low budget.
00:31:10They want to generate videos of humans really well, since that's such a big part of creating cinematic content.
00:31:17It's also really great at character emotions, so really being able to describe shots of people and really capture their emotion.
00:31:26Here they are trying to create a video of a woman crying.
00:31:30Challenges remain, such as the AI being unable to make her shed tears.
00:31:37But they identify improvements and keep updating the model.
00:31:42There's always more things to improve to get that additional realism.
00:31:46So, in a few years, we're going to be able to get to a feature-length film.
00:31:51So, being able to generate a full story over the course of, you know, two years.
00:31:56Generative AI is about to change our future dramatically.
00:32:26It was a long time ago.
00:32:30But it brings back memories.
00:32:33Remi-chan, you've gotten a lot better.
00:32:38Yeah.
00:32:40I think the change of environment had her unsettled.
00:32:45Maybe that's why she seemed a little lost.
00:32:51Remi-chan!
00:32:53Let me see.
00:32:54It's okay.
00:32:56Here.
00:33:09Um...
00:33:14Um...
00:33:15My sister, Yoko.
00:33:19Ah, Yoko!
00:33:21You've grown up.
00:33:25You look just like your sister.
00:33:30The app keeps getting updated. Now she can even hold a conversation.
00:33:35It's like...
00:33:38...she's come back to life.
00:33:41The more realistic it gets, the more...
00:33:45You're the only one who feels that way.
00:33:54Onii-san.
00:33:57Could we use her?
00:34:06Please face forward and align your face with the frame.
00:34:13Strange, isn't it, an AI taking orders from a machine.
00:34:17Unable to identify.
00:34:18Don't laugh.
00:34:19Do it properly.
00:34:21Please face forward again.
00:34:25Please face diagonally to the right.
00:34:27This way, this way.
00:34:30Identified.
00:34:31Ah!
00:34:34It's a work email.
00:34:36Did you take a copy?
00:34:47Someone told my sister to send customer data.
00:34:50From Sam, right?
00:34:54I don't know.
00:34:58It's written here, right?
00:35:00Send customer data to the head office in New York.
00:35:05There's more evidence.
00:35:07Sam sent this email to my sister from New York.
00:35:10And told her to attend the meeting.
00:35:13My wife recorded the meeting.
00:35:20Sorry to keep you waiting.
00:35:21Ah, I'm sorry, Sato-kun.
00:35:23Is it okay to send customer data for cancer insurance?
00:35:27Yes.
00:35:28The head office will analyze the characteristics of the patients.
00:35:30Can we see Japanese customer data from here?
00:35:35It can't be accessed from outside the country.
00:35:37I see.
00:35:38Then, can you send it to the head office?
00:35:42But to send customer data, we need authorization.
00:35:45I'll do that.
00:35:47Please prepare the customer data.
00:35:49Okay.
00:35:50I understand.
00:35:51Then, I'll send you the link to the shared folder that the head office uses.
00:35:54Put it in there.
00:35:56Okay.
00:35:57I'll do it right away.
00:36:12So, after golf?
00:36:14There will be a dinner with Mr. Sasaki from 5 p.m.
00:36:17Shabu-shabu, right?
00:36:19It's our usual place.
00:36:20You like it, don't you, Sasaki?
00:36:22Chief!
00:36:25Why are you here?
00:36:27You're supposed to be in America.
00:36:29Ah, my business trip got delayed.
00:36:32I'm going to play golf.
00:36:34What?
00:36:35What?
00:36:38But you told me to send the customer data, didn't you?
00:36:44Who?
00:36:45I don't know.
00:36:50But...
00:36:52In the shared folder...
00:36:54Shared folder?
00:36:56What?
00:36:57What's wrong?
00:37:01It can't be accessed from outside the country.
00:37:03Hey, you!
00:37:05Why am I in this video?
00:37:07I know nothing about this meeting.
00:37:10This is...
00:37:12Deepfake fraud, isn't it?
00:37:15What?
00:37:16Deepfake?
00:37:19Someone impersonated me to deceive you.
00:37:35I'll send you the link to the shared folder.
00:37:39Put it in there.
00:37:41Yes, I'll get to it right away.
00:37:45So...
00:37:46What exactly was in the customer data that was sent?
00:37:50It includes basic information such as names and insurance numbers...
00:37:54...as well as bank account details and medical histories.
00:38:00And the estimated damage once this becomes public?
00:38:03Well...
00:38:04Judging from past cases of data theft...
00:38:08...compensation payments to customers alone...
00:38:14...could come to this much.
00:38:19One billion yen?
00:38:22It's not just that.
00:38:24Counting the harm done to customers...
00:38:27...and claims for damages...
00:38:29...the total could be even greater.
00:38:34Prepare for the press conference.
00:38:36But...
00:38:37How can we prove that this was a deepfake?
00:38:41If we publicize it without knowing anything...
00:38:43...we might be attacked.
00:38:45Then what should we do?
00:38:48I'm sorry.
00:38:50Then what should we do?
00:38:57Um...
00:38:59Give me some time.
00:39:01Let me find out who did this and why.
00:39:05Please!
00:39:06What can I do for you?
00:39:08Please!
00:39:10Please.
00:39:13Two weeks later...
00:39:15...I received a call from Sato.
00:39:20I checked her computer...
00:39:23...but there was nothing related to the criminal.
00:39:27So in the end...
00:39:29...only the leak itself was made public.
00:39:33Are you going to just let it go?
00:39:36Then...
00:39:40...should I be honest...
00:39:44...and say that your wife was behind it?
00:39:48What should we do?
00:39:50If we publicize everything...
00:39:54...my sister won't be able to avoid being bashed.
00:39:58Even so, we can't pretend it didn't happen.
00:40:01Then...
00:40:02...what should we do?
00:40:09I know why Manami came to Diplex.
00:40:14What do you mean?
00:40:15Manami was convinced the deepfake had been made at Diplex.
00:40:20That's your imagination, isn't it?
00:40:24There's no proof.
00:40:31Two weeks later...
00:40:35She's being bullied at school?
00:40:37Yes.
00:40:40I consulted with the school...
00:40:42...but they didn't listen to me.
00:40:55Who's bullying you?
00:40:58This girl...
00:41:00...and this girl...
00:41:02...and this girl.
00:41:05Which of them use social media?
00:41:10This girl.
00:41:15What's your name?
00:41:17Hinako Kawada.
00:41:19Hinako Kawada
00:41:24So...
00:41:25...what do you want?
00:41:28Do you want proof?
00:41:31Or...
00:41:32...do you want revenge?
00:41:49Thank you very much.
00:41:56If you want revenge...
00:41:58...spreading it on the internet does the most damage.
00:42:11I was kidding.
00:42:12You shouldn't joke.
00:42:15Sato-san.
00:42:16Yes?
00:42:18Your daughter was in the bullying group, wasn't she?
00:42:24She's waiting for her phone.
00:42:28Why don't you talk to her?
00:42:35Can I ask you something?
00:42:38I want to ask you about Gai.
00:42:41Yes.
00:42:43Ordinary video-editing apps...
00:42:47...can't generate nudity or sexual content.
00:42:52Why can Gai?
00:42:56It's not the AI that decides what can be made.
00:43:00It's humans who decide the ethics and program them in.
00:43:04In other words, this app has no ethics built in?
00:43:10Well...
00:43:13...that's how it is.
00:43:21Now...
00:43:22...all over the world, including Korea, Japan, and the United States...
00:43:27...deepfake videos of naked women are spreading.
00:43:32In many cases, the victims are minors.
00:43:35Most of the victims are teenagers.
00:43:41Many of these deepfakes are made on highly anonymous messaging apps.
00:43:50Some advertise that this tool can create sexual content that is prohibited by other apps.
00:44:00Pete Nicoletti is a technology leader at an international cybersecurity company.
00:44:09Recently, deepfake transactions have increased in a special internet space called the dark web.
00:44:30So this company is out there selling his services to anybody that wants to buy them.
00:44:38The dark web can't be accessed without a special browser.
00:44:43It's hard to identify the person who accessed it.
00:44:48Because fake videos are not prohibited on the dark web...
00:44:53...fake videos of famous people and videos meant to deceive are also created.
00:45:18Last year, the Hong Kong branch of a British company...
00:45:22...became the target of a fraud using deepfakes.
00:45:27A staff member was called to a fake remote meeting...
00:45:31...and instructed to send money by someone impersonating a senior executive.
00:45:38The staff member believed this and sent a large amount of money.
00:45:42The amount was about 4 billion Japanese yen.
00:45:51Deepfake technology has the potential to help people...
00:45:55...but used for crime, it can destroy lives.
00:46:11And we're also having to deal with artificial intelligence created deepfakes.
00:46:41Is that so?
00:46:43Why are you ignoring the recipe?
00:46:47It's not fun to follow the recipe.
00:46:50I'm home.
00:46:51Welcome home.
00:46:52What are you doing?
00:46:54I'm making a cake with my mom.
00:46:56You're making it?
00:47:02A little more.
00:47:03A little more?
00:47:07It's coming.
00:47:08Stop it.
00:47:09What are you doing?
00:47:12I don't want you to depend on that.
00:47:14I don't get it.
00:47:15It's an AI-made mom, not a real mom.
00:47:20Does it matter? If I think of her as Mom...
00:47:22She is not your mom!
00:47:26Remi, sit here.
00:47:30Hurry.
00:47:35Sit down.
00:47:40You know Aoi Amamiya, right?
00:47:43Your classmate.
00:47:45She came to my office with her mom today.
00:47:49She's being bullied at school.
00:47:52Can you do something about it?
00:47:55Remi, is it true that you're bullying her?
00:47:58No, I'm not bullying anyone.
00:48:00But she said she was being bullied.
00:48:02That's a lie.
00:48:05Because I'm the one being bullied.
00:48:08It's a lie.
00:48:09A lie?
00:48:22What is this?
00:48:24They make a lot of these at school.
00:48:29Why didn't you tell your dad about this?
00:48:32Because he never listens to me.
00:48:36That's why I told Mom.
00:48:38It's not fair that you're only seeing it now.
00:48:41My mom would have believed me!
00:48:52So it's the other way around?
00:48:54Amamiya-san is the one doing the bullying.
00:48:58What's the proof?
00:49:02She came to my office today.
00:49:05It seems this kind of bullying is going around.
00:49:09I'm angry.
00:49:12But you should stay out of this case.
00:49:17It'll cause trouble if the perpetrator's family gets involved.
00:49:22Yes.
00:49:24These kids are scary.
00:49:28They lie with a calm face.
00:49:31I'm sorry. I'll do my best.
00:49:34I'll do my best.
00:49:47Why are you being bullied?
00:49:55You made this, didn't you?
00:49:57I'll give it back to you.
00:50:01Stop it!
00:50:02Stop it!
00:50:04Stop it!
00:50:09Look!
00:50:27Hello?
00:50:29Yes, I'm always here for you.
00:50:37Yes, a video?
00:50:42Stop it.
00:50:43I'm waiting for you.
00:50:45Stop it.
00:50:47Stop it.
00:50:49Stop it!
00:50:50Stop it!
00:50:51Stop it!
00:50:52Stop it!
00:50:55It suits you.
00:50:56It suits you.
00:50:57Stop it.
00:50:58Stop it.
00:51:11What do you mean?
00:51:12What?
00:51:14You made this video, didn't you?
00:51:20This?
00:51:21Yes.
00:51:22I got a call from Amamiya's mother.
00:51:25She said a video had been uploaded as proof of the bullying...
00:51:27...but that her daughter never bullied anyone.
00:51:31You saw that zombie video, didn't you?
00:51:33But it seems that the liar is your daughter.
00:51:37What?
00:51:38This video was made with Gai.
00:51:42Invisible metadata is embedded in every video made with Gai.
00:51:47Run it through this analysis software and you can tell at once who made it.
00:51:51Analysis software?
00:51:53Only some people in the company have it.
00:51:57Recently, many people have asked me to find out if it's a deepfake.
00:52:05Isn't this the serial number of Sato-san's computer?
00:52:09No way.
00:52:10Are you saying it was made on my computer?
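Tsubokura's trace works because every clip rendered with Gai carries invisible metadata naming the machine that made it. Gai and its analysis software are fictional, so as a stand-in this sketch embeds and reads a provenance tag in a PNG text chunk with Pillow; a real provenance system would use robust watermarking or signed manifests that survive re-encoding.

```python
# Toy provenance tag: write an invisible creator ID at render time, read it
# back at analysis time. Uses a PNG text chunk via Pillow as a stand-in for
# the fictional Gai metadata; the serial number is hypothetical.

from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Embed: the renderer stamps the file with the machine's serial number.
frame = Image.new("RGB", (64, 64), "gray")   # stand-in for a rendered frame
tag = PngInfo()
tag.add_text("creator_serial", "SATO-PC-0042")
frame.save("clip_frame.png", pnginfo=tag)

# Analyze: the "analysis software" simply reads the tag back.
found = Image.open("clip_frame.png").text.get("creator_serial")
print(f"made on machine: {found}")
```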
00:52:23I'm home.
00:52:32I used Dad's computer.
00:52:37I'm sorry for using it without asking.
00:52:44But it's true that I'm being bullied.
00:52:51Why did you make such a zombie-like video?
00:52:55Because I couldn't show you the videos they actually made.
00:53:06The videos actually used to bully Remi-chan...
00:53:15...were fake videos of her naked.
00:53:36Remi-chan's naked!
00:53:41That's not me!
00:53:43What? Isn't this you?
00:53:47No matter how hard I try, I can't replace Manami.
00:54:01No.
00:54:02This is no time to be saying that.
00:54:05I'm all she has now.
00:54:11You look cool today.
00:54:17Are you trying to comfort me?
00:54:20I found out.
00:54:21Did you find out?
00:54:24At times like that, you should be more serious and put more pressure on me.
00:54:30Okay, I learned.
00:54:36I hope Manami is alive.
00:54:40She's alive.
00:54:45Look, she's right here.
00:54:49She's right here.
00:54:56But once you go through the analysis software, you'll know she's an A.I.
00:55:02Analysis?
00:55:03Yes.
00:55:04Analysis software?
00:55:05Once you go through it, you'll know she's an A.I.
00:55:07I'm scared.
00:55:10I'm scared.
00:55:19What's wrong?
00:55:22Nothing.
00:55:23Oh, you're too...
00:55:54What?
00:56:12Mr. Sato.
00:56:14That's Mr. Tsubokura's computer, isn't it?
00:56:16Yes.
00:56:18He asked me to load some data.
00:56:21That's a lot of data.
00:56:51Oh, I'm sorry.
00:56:53Mr. Sato was at Mr. Tsubokura's computer. He said he'd been asked to do something.
00:56:57Really?
00:56:58Yes.
00:57:10Excuse me.
00:57:11Yes?
00:57:12Can I ask you something?
00:57:13What is it?
00:57:14Can I ask you something?
00:57:19Later.
00:57:20Later.
00:57:43What are you doing on my computer?
00:57:53It was you, wasn't it?
00:57:55What?
00:57:57You tricked my wife.
00:58:04Your wife?
00:58:07Ah, I see.
00:58:08So she was your wife, Mr. Sato.
00:58:14Excuse me.
00:58:25How is she?
00:58:36She committed suicide.
00:58:40Suicide?
00:58:42I didn't know that.
00:58:47Why did you trick her?
00:58:50I didn't mean to.
00:58:53I just happened to get hold of the company's employee list.
00:58:57The information of the big company that was hacked is often circulated on the Internet.
00:59:02Eiji Katahira.
00:59:04Eiji.
00:59:11It's easy to make a deepfake.
00:59:22Well.
00:59:24Can you send it to me?
00:59:27But I can't send customer data.
00:59:30I'll do it.
00:59:31Please prepare the customer data.
00:59:34Okay.
00:59:35I'll send you the link of the shared folder that the company uses.
00:59:38Please put it in there.
00:59:39Yes.
00:59:40I'll do it right away.
00:59:45She came here, didn't she?
00:59:48Yes.
00:59:50I was surprised, too.
00:59:52Your wife suspected this company.
00:59:54What?
00:59:56She wanted me to help her.
01:00:02Can you tell whether these executives are deepfakes?
01:00:09I can check it out.
01:00:12Really?
01:00:13Yes.
01:00:16I'll also check the whereabouts of the customer data.
01:00:23Please.
01:00:24Yes.
01:00:26I'm sorry to say this.
01:00:30I analyzed the video of the remote meeting.
01:00:35It was made so well that I couldn't tell it was a fake.
01:00:41I see.
01:00:45I was able to trace the customer data to the dark web.
01:00:50Dark web?
01:00:53Dark web?
01:00:55Yes.
01:00:59There is a site that specializes in personal information on the dark web.
01:01:04Here are traces of the customer data being bought and sold.
01:01:23Can I get it back?
01:01:26I don't care how much it costs.
01:01:30Please get the customer data back.
01:01:34Please.
01:01:36I'm sorry.
01:01:38Once it's on the dark web, it can never be taken back.
01:01:43Please.
01:01:54Mr. Sato.
01:01:59Do you know what the personal information held by an insurance company goes for?
01:02:07$30 per person.
01:02:09That's the market price on the dark web.
01:02:11If it's for 230,000 people, it's about 1 billion yen in Japanese yen.
01:02:16If you have that money, you can save more people.
01:02:20What's wrong with that?
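Tsubokura's figures check out under one assumption: $30 per record times 230,000 records is $6.9 million, and the broadcast does not state an exchange rate, but at an assumed 145 yen to the dollar that is almost exactly 1 billion yen.

```python
# Checking the quoted dark-web arithmetic. The exchange rate is an assumption;
# the per-record price and record count are the figures given in the dialogue.

price_per_record_usd = 30
records = 230_000
yen_per_usd = 145  # assumed rate, not stated in the program

total_usd = price_per_record_usd * records   # 6,900,000
total_yen = total_usd * yen_per_usd          # ~1.0 billion yen
print(f"${total_usd:,} is about {total_yen / 1e9:.2f} billion yen")
```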
01:02:23What are you doing?
01:02:25If you had returned the customer data, she wouldn't have died.
01:02:34I really don't know who took the data.
01:02:42People want fakes.
01:02:46Fakes can save the weak.
01:02:49You've seen that yourself, haven't you?
01:02:55I'm looking forward to it.
01:02:58Can you do that?
01:03:01Do you remember this place?
01:03:08If we told the truth, no one would be saved.
01:03:12So it's fine to save people with lies, is that it?
01:03:18Even if it's a lie...
01:03:20What's the point of a life saved by a lie?
01:03:26I'm sorry.
01:03:28I'm sorry.
01:03:30What is truth, anyway?
01:03:33We live in an age when people can remake it.
01:03:37I'm handing you over to the police.
01:03:43It's over.
01:03:46You can't deceive me.
01:03:55Suspect Tatsuya Tsubokura has just been brought out.
01:04:04Following his arrest, police are investigating whether suspect Tsubokura committed other offenses.
01:04:15According to investigators, the suspect insists that he wanted to punish a large corporation that exploits ordinary people.
01:04:25The number of frauds using deepfakes is expected to exceed 5,000 this year, with damage the highest on record.
01:04:32If the stolen personal information is passed on to other criminals, there is a risk of further deepfake crimes.
01:04:46How should we respond to this age of fakes, in which anyone can become a victim?
01:05:02No, you can't.
01:05:05You have to use artificial intelligence to detect artificial intelligence threats.
01:05:12Technology to see through deepfakes is now being developed all over the world.
01:05:22This venture company in the United States is known worldwide for deepfake detection.
01:05:30So we see the real and fake.
01:05:34So we see a green box around the real person, red box around the fake person.
01:05:42Video and images are read into the analysis software and analyzed in detail.
01:05:49Content created by AI is said to carry telltale characteristics in its colors and shapes.
01:05:55A separate analysis AI detects these to judge whether something is fake.
01:06:02But deepfakes are evolving fast.
01:06:05Fakes that analysis cannot detect are likely to appear before long.
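The detector described here is itself an AI trained on statistical fingerprints that generators leave in color and texture. The company's actual method is not disclosed, so the sketch below is a toy stand-in only: it scores one classic fingerprint, the balance of high-frequency energy in an image's spectrum, and maps the score onto the green-box/red-box labels shown on screen.

```python
# Toy "deepfake detector": generated images often show unusual high-frequency
# statistics, so score that one cue and label the image. Real detectors are
# trained classifiers; this is only an illustration of the idea.

import numpy as np

def high_freq_energy(img: np.ndarray) -> float:
    """Fraction of spectral energy outside the central low-frequency band."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = spectrum.shape
    low = spectrum[h//2 - h//8 : h//2 + h//8, w//2 - w//8 : w//2 + w//8].sum()
    return 1.0 - low / spectrum.sum()

def label(img: np.ndarray, threshold: float = 0.5) -> str:
    """Map the score to the on-screen green-box / red-box decision."""
    return "fake (red box)" if high_freq_energy(img) > threshold else "real (green box)"

rng = np.random.default_rng(0)
smooth = rng.normal(size=(128, 128)).cumsum(axis=0).cumsum(axis=1)  # photo-like
noisy = rng.normal(size=(128, 128))                                 # artifact-heavy
print(label(smooth), label(noisy))
```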
01:06:24The regulations are needed to help protect us from a threat that we just can't solve ourselves.
01:06:37The demand for deepfake regulations is increasing all over the world.
01:06:41In Korea, in September of last year,
01:06:45a bill was passed prohibiting even the viewing and possession of sexual content created with deepfakes.
01:06:56The EU also plans to impose strict regulations on deepfake content in the future.
01:07:02And now, Japan is also moving toward legislation.
01:07:32The world where we actually believed things before, now they're changing.
01:08:03I'm worried about you.
01:08:09Thank you so much, Yoko.
01:08:15I'm counting on you.
01:08:19Let's go, Emi.
01:08:21See you.
01:08:22See you.
01:08:30Hey.
01:08:32Today is your birthday, right?
01:08:35Yes, it is.
01:08:38I'll make you a cake when I get back.
01:09:10Hey.
01:09:13What's wrong?
01:09:15Are you depressed again?
01:09:17I was going to say goodbye today.
01:09:22What?
01:08:24I want to face her properly.
01:08:29I'm not a good parent like you were.
01:08:32So I want to face her with everything I have.
01:09:42What do you mean?
01:09:43We both have to learn to live without Manami. Me, and you too.
01:10:05I'm here.
01:10:08You believed in me, didn't you?
01:10:11Why would you say such a thing?
01:10:20Manami.
01:10:27You're dead.
01:10:40I'm sorry.
01:10:44I'm sorry.
01:10:47I'm really sorry.
01:10:59Goodbye, Manami.
01:11:10Are you sure?
01:11:40Mom.
01:11:46Mom, hi.
01:11:49What's up?
01:11:51It's obvious.
01:11:53Emi, that's...
01:11:59It's a secret from dad.
01:12:10To be continued.
01:12:13Hey, everyone.
01:12:15I have a favor to ask you.
01:12:19Is that e-mail really safe?
01:12:21We need to verify your identity to use the service.
01:12:26We need your personal information.
01:12:29Be careful of links posted in e-mails.
01:12:32Stop fraud.
01:12:34Don't be deceived.
01:12:36I'm Mirei Kusanagi.
01:12:40I love you.
01:12:42I love you.
01:12:45I love you.
01:12:47I love you.
01:13:11NHK special.
01:13:14What?
01:13:16How to live the next 100 years, learned from people who keep on striving.
01:13:21Broadcast on the 22nd at 10 p.m.
01:13:28Fujii Kaze's 5th anniversary.
01:13:32Looking back over those five years through NHK's footage.
01:13:41VTuber special.
01:13:44I'm the host, Hoshimachi Suisei.
01:13:47Broadcast at 11 p.m. on March 22.
01:13:50NHK's spring recommendation.
01:13:53Broadcast at 7:30 p.m. on April 5.
01:13:57The new series, Ise Jingu.
01:14:00Broadcast at 7:57 p.m. on April 2.
01:14:08Be sure to watch the show for more!
01:14:15Broadcast at 9 a.m. on April 5.
01:14:18It's a program that allows kids and adults to enjoy news.
01:14:22This spring, NHK.
01:14:26Humanience.
01:14:28Water is the theme.
01:14:30Water is a special substance.
01:14:32It's the source of life.
01:14:34I didn't even think water could move.
01:14:38See you next time!
