NHK Special (March 18, 2025) Created "Truth": The Age of Deepfakes

Category: 📺 TV

Transcript
00:00:00We have confirmed that important customer information has been leaked.
00:00:11We deeply apologize for the inconvenience and concerns we have caused to you.
00:00:20We are very sorry.
00:00:22We are very sorry.
00:00:24How many people's data were leaked?
00:00:28As far as we know, there are about 200,000.
00:00:33What is the probability of being hacked?
00:00:35Hasn't there been a countermeasure before?
00:00:40We have completely abandoned security measures.
00:00:44Sunshine Life, the Japanese arm of a foreign insurance company headquartered in the United States,
00:00:48announced today that the data of 230,000 customers has been leaked.
00:00:53There is a risk that the leaked data will be misused,
00:00:56and concern is spreading among policyholders.
00:01:04Please tell us the details of the leaked data.
00:01:06When will the results be announced?
00:01:09We will report as soon as we have progress.
00:01:11Please open the door.
00:01:12What about compensation for your customers?
00:01:14President, please answer!
00:01:17In recent years,
00:01:19companies have been hit by cyber-attacks
00:01:21in which personal information is stolen.
00:01:29Deepfakes are now starting to be used in these cybercrimes.
00:01:40Deepfake means using AI to create images and videos that look exactly like the real thing.
00:01:49However, these days,
00:01:51it is often used to deceive or intimidate others.
00:02:00According to a survey conducted last year by a security company that works to prevent fraud,
00:02:05fraudulent access attempts against companies increased by about 3 million,
00:02:10while cases involving deepfakes increased by about 210,000,
00:02:14four times as many as the year before.
00:02:19The question is, what's going to happen when deepfakes spread?
00:02:22It's already happened.
00:02:25And this year,
00:02:27it is predicted that the damage caused by deepfakes will increase even more.
00:02:35The background of this is the existence of fake videos,
00:02:39which are becoming more and more realistic as AI advances.
00:02:45We live in a post-real society
00:02:48where you can't tell if it's real or if it's fake.
00:02:55Last year,
00:02:56a global company based in the UK
00:02:59was deceived by a deepfake and lost about 4 billion yen.
00:03:05It shook the world.
00:03:11This time,
00:03:12we investigated deepfake incidents that have actually happened around the world,
00:03:17along with their background,
00:03:20and dramatized them.
00:03:24Isn't this a deepfake fraud?
00:03:29This story is a fact-based drama of the near future, depicting the age of deepfakes.
00:03:43Remi!
00:03:46Remi!
00:03:47Breakfast is ready!
00:03:52You're going to be late on the first day.
00:03:54Remi!
00:03:56Hey!
00:04:00Breakfast!
00:04:06Good morning.
00:04:07Hurry up and eat.
00:04:09I don't want it.
00:04:10What?
00:04:12I don't want it.
00:04:22Hurry up.
00:04:23Hurry up and eat.
00:04:25Just eat a little bit.
00:04:33Is that e-mail really safe?
00:04:36Be careful of links included in e-mails.
00:04:39Deepfake!
00:04:40Fraud!
00:04:41We won't be fooled!
00:04:43I'm Mirei Kusanagi!
00:04:45Meow meow!
00:04:54Good morning.
00:04:55Good morning.
00:05:00Here's your coffee.
00:05:03Good morning.
00:05:04Good morning.
00:05:05Good morning.
00:05:08The main office is over there.
00:05:14This is the main office.
00:05:16There are about 50 staff members.
00:05:19That's impressive for a venture that's only three years old.
00:05:23Good morning, everyone.
00:05:25If you have time, please come to the hall.
00:05:30This is Akira Sato, who joins us today.
00:05:35He used to work in sales at a bank.
00:05:37Please help him out until he gets used to things.
00:05:40Sato-kun.
00:05:43I'm Akira Sato.
00:05:48I strongly identified with this company's vision of solving people's problems with the power of AI,
00:05:59which is why I changed jobs.
00:06:01I look forward to working with you.
00:06:06Mr. Shinji.
00:06:08Please answer the phone.
00:06:13Mirei Kusanagi?
00:06:16Fukayama.
00:06:17Stop using Deepfake.
00:06:19Okay.
00:06:20I'm sorry.
00:06:24I'm looking forward to working with you.
00:06:28Ask the chief, Tsubokura.
00:06:32Tsubokura.
00:06:33Yes.
00:06:38Nice to meet you.
00:06:47Good morning.
00:06:48Good morning.
00:06:49Good morning.
00:06:52I'll teach you how to make videos with AI.
00:06:54Get used to it.
00:06:56Okay.
00:06:58First of all,
00:07:01look for posts from people asking for help on social media.
00:07:06For example...
00:07:15This is good.
00:07:16Someone is looking for their lost pet.
00:07:18Okay.
00:07:19Sato-san, I'd like you to help spread this post.
00:07:24Sorry, how?
00:07:27We use AI to turn it into a video and repost it.
00:07:33I'll use an app called Gai.
00:07:36Gai?
00:07:37Yes, it's our original app.
00:07:42"It rolls the ball and brings it back."
00:07:49You tell the AI how you want the photo to move.
00:07:54And then...
00:08:03Oh, it moved.
00:08:06It's better to have an emotional video like this.
00:08:15I'll re-post this.
00:08:24That's amazing.
00:08:26If we get more views, we'll get more advertising revenue.
00:08:30It's a little scary, but...
00:08:33Our goal is to get people to learn how to make videos.
00:08:40But it's a fake, right?
00:08:43That's not the point.
00:08:45The point is how to get attention.
00:08:47And if you spread it a lot, the owner will be happy.
00:08:52I see.
00:08:55Good luck.
00:09:12Are you okay?
00:09:15Yes.
00:09:17It's easy once you get the hang of it.
00:09:22I didn't know you could do this.
00:09:26He's an AI expert.
00:09:29He did amazing work on the last project, too.
00:09:32Oh, the parent-and-child reunion one.
00:09:35Yes.
00:09:36He's amazing.
00:09:37Do you want to see it?
00:09:41Here you go.
00:09:49I'll show you.
00:10:04Is he an AI, too?
00:10:05No.
00:10:06His mother and background are all AI.
00:10:09I put a link to the donation site in this video.
00:10:12The response was amazing.
00:10:13I think we got more than 100 million views.
00:10:16Really?
00:10:17The people at the donation site were very happy, too.
00:10:19Yes.
00:10:21Mr. Tsubokura, it's easy to make this.
00:10:33Isn't it this child?
00:10:37The dog from earlier?
00:10:40Oh, I found it.
00:10:43That's fast.
00:10:44I'm glad.
00:10:46It's like this.
00:10:59I'm home.
00:11:00Welcome home.
00:11:05Oh, Yoko.
00:11:06I'm sorry to bother you.
00:11:09I'm here to scout the new house.
00:11:11Welcome.
00:11:12What's this? Curry?
00:11:13It smells good.
00:11:14That's my sister's belongings from the office.
00:11:20Thank you for coming.
00:11:21You're welcome.
00:11:31I was surprised that you were going to get a new job.
00:11:35How was it?
00:11:36So, how was it?
00:11:42I heard that you help people with AI.
00:11:46You're using deepfake, aren't you?
00:11:49Deepfake?
00:11:51It means making fake images and videos with AI.
00:11:56You know a lot.
00:11:58I learned it in class today.
00:12:01I didn't know you learned it.
00:12:03It looks delicious.
00:12:06I wonder if it's good.
00:12:09I'm going to eat in my room.
00:12:18She's been like that ever since Manami died.
00:12:24Lunch?
00:12:28I'm going to take a break from work tomorrow.
00:12:33Did something happen at work?
00:12:36No.
00:12:40Tell me if something happens.
00:12:46I'm sorry. I can't tell you now.
00:12:50You can't?
00:12:53Anyway, it's okay.
00:12:59Okay.
00:13:07I'm home.
00:13:13Manami?
00:13:23Manami?
00:13:36Manami?
00:13:42Manami?
00:13:45Manami?
00:13:49Manami?
00:13:57Dad.
00:14:00This is Mom's company, isn't it?
00:14:07Yoko!
00:14:10Yoko!
00:14:12Yoko!
00:14:13Brother.
00:14:14I'm sorry.
00:14:16I saw the news.
00:14:19Is Manami involved in the customer data leak?
00:14:26I haven't heard anything like that.
00:14:30I don't even know when or how it happened.
00:14:34Really?
00:14:36I can't think of any other reason.
00:14:38But...
00:14:40It's been four months since then.
00:14:43I still don't know why the customer information was leaked.
00:14:50DeepX?
00:14:56Manami went to DeepX twice, three days before she died.
00:15:06I'm sure something happened to her.
00:15:09Is that why you changed jobs?
00:15:13The customer information didn't have Manami's name on it.
00:15:18I'll keep looking into it.
00:15:20Tell me if anything happens to Yoko.
00:15:23Okay.
00:15:26Is this your sister's personal computer?
00:15:28Yeah.
00:15:29Maybe there's a clue.
00:15:31Well...
00:15:33It has facial recognition.
00:15:36It needs her face, looking straight at the camera.
00:15:39What about using a photo?
00:15:41I tried it.
00:15:43It didn't work.
00:15:51Is this your new department?
00:15:53Yes.
00:15:54I'm done with my internship.
00:15:56I'm going to talk to you about my problems.
00:15:58Okay.
00:16:00Bye.
00:16:08You must be tired.
00:16:09I'm fine.
00:16:10I was worried about you.
00:16:13Mom, I'm sorry.
00:16:16I was sick and was hospitalized.
00:16:21I know.
00:16:23But I'm fine now.
00:16:25Don't worry about me.
00:16:26Ms. Sukuro.
00:16:28That's her voice.
00:16:30I changed it to the voice of a Kansai dialect man.
00:16:34And her face.
00:16:35What?
00:16:36Look at this.
00:16:46Deepfake?
00:16:48This man is the grandmother's son.
00:16:52And she is his daughter, our client.
00:16:54So the granddaughter is pretending to be her own father?
00:17:01The son died in a car accident two weeks ago.
00:17:06But the granddaughter couldn't bring herself to tell her.
00:17:10Why?
00:17:11The grandmother has a heart condition.
00:17:15The shock of hearing that her son is dead could be too much for her.
00:17:21If she can see her son's happy face, she'll be relieved.
00:17:26Come in.
00:17:36I'm in the hospital.
00:17:38It's comfortable.
00:17:42But I'm lonely.
00:17:45Can't you come here?
00:17:49I don't think I have much longer to live.
00:17:54Think of it as my last wish.
00:17:57Come here.
00:18:02What's wrong, Yosuke?
00:18:04Are you crying?
00:18:07What's this?
00:18:10Take it.
00:18:12I have an important job.
00:18:15So, please wait a little longer.
00:18:17Okay.
00:18:19I'm sorry for being selfish.
00:18:23I'll let you go to Suzuka.
00:18:30I'm looking forward to it.
00:18:35Take care.
00:18:40Kimie, it's time to go.
00:18:47Are you drunk?
00:19:06Are you okay?
00:19:08I'm fine.
00:19:11Your grandmother was happy.
00:19:13I'm glad.
00:19:15But I can't lie forever.
00:19:21Come to me anytime.
00:19:26Let's go.
00:19:33Excuse me.
00:19:39How was it?
00:19:43I understand the situation.
00:19:45But you lied to your grandmother.
00:19:49Are you still saying that?
00:19:51Deepfake is useful for people.
00:19:54Otherwise, it's hard to work here.
00:20:08A story like Kimie's, in which a dead father is revived with AI to reassure her grandmother, actually happened in China.
00:20:18I'm Jihan.
00:20:20I'm in Beijing.
00:20:27Now, in online shops in China,
00:20:30services to revive dead relatives with AI are becoming popular.
00:20:39There are companies all over the world that use deepfake technology for business.
00:20:49At this company in the UK,
00:20:52AI learns from footage of real people
00:20:55to create avatars on a computer.
00:21:02We have over 200 of them on the platform.
00:21:05An office worker, if you need a factory worker.
00:21:08And then we offer personal avatars.
00:21:10Other people can make personal avatars of themselves.
00:21:15The avatars are mainly used for business introduction videos and internet ads.
00:21:24You type in the words you want the avatar to speak.
00:21:29The text can be converted into 130 different languages.
00:21:36Hi, I'm AI avatar of Alex Wojka,
00:21:39and I'm so excited to be here with you today.
00:21:42Hello, I'm Alex Wojka's AI avatar.
00:21:46I was created using generative AI a few months ago in the studio.
00:21:54That's pretty good Japanese.
00:21:56Very good, Japanese.
00:21:59I've practiced a lot.
00:22:02The technology to make avatars speak in real-time will be completed this year.
00:22:10Japan does not have any legal regulations against this technology.
00:22:17The tremendous evolution of AI is starting to shake our sense of ethics.
00:22:23So, prompts are used to give instructions or ask questions to the AI.
00:22:31The clearer and more specific the prompt is,
00:22:35the more the AI will tell you the right answer,
00:22:38or generate what you want it to generate.
00:22:42The prompt itself can also be generated by the AI,
00:22:53but it is better to think through the wording yourself,
00:22:58so that the AI produces what you actually want.
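As a rough illustration of the point made in this training scene (not something shown in the program), here is a minimal Python sketch contrasting a vague prompt with a specific one; the send_prompt() function and both prompt strings are hypothetical stand-ins, not part of the Gai app.

```python
# Minimal illustration: a vaguer prompt constrains the model less than a
# specific one. send_prompt() and both strings are hypothetical stand-ins.

def send_prompt(prompt: str) -> None:
    """Stand-in for whatever call a video-generation app would make."""
    print(f"[prompt sent] {prompt}")

# Vague: leaves the subject, motion, and mood entirely up to the model.
vague = "Make a video of a dog."

# Specific: names the subject, action, setting, and tone, so the result
# is far more likely to match what the user actually wanted.
specific = (
    "Make a 10-second video of a small brown dog rolling a ball back "
    "to its owner in a sunny park, filmed from a low angle, cheerful mood."
)

for p in (vague, specific):
    send_prompt(p)
```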
00:23:05What's wrong, Sato?
00:23:09Nothing.
00:23:12Are you sure?
00:23:14So, let's practice making prompts.
00:23:40Hey,
00:23:44Hey,
00:23:49How long are you going to read this?
00:23:51I didn't say anything because you were reading it.
00:24:01What's wrong with her?
00:24:13What's wrong?
00:25:34What's wrong?
00:25:37Good morning, everyone.
00:25:39Konomi.
00:25:41Wow!
00:25:43Happy birthday!
00:25:45Surprise!
00:25:47I'm surprised!
00:25:49I'm even more surprised.
00:25:51What is it?
00:25:53I made
00:25:55a cake!
00:25:57Wow!
00:25:59Wait a minute.
00:26:01Is that so?
00:26:03Dad, how is it?
00:26:05It's the best!
00:26:07It's the best!
00:26:09I'm glad.
00:26:11Mom,
00:26:13your eyes are heart-shaped.
00:26:15Oh!
00:26:17I'm so happy.
00:26:21I'm embarrassed.
00:26:35It's been a while.
00:26:47How are you?
00:26:53Wake up.
00:26:59Why are you silent?
00:27:01Can you hear me?
00:27:05Wake up.
00:27:09Mom.
00:27:23Remi.
00:27:25How have you been?
00:27:27Mom.
00:27:31Mom.
00:27:33Why did you die?
00:27:41Remi.
00:27:45Are you all right?
00:27:47Cheer up.
00:27:49Don't cry.
00:27:53Look at me.
00:27:55You always look like a girl.
00:27:57Remi,
00:27:59don't laugh.
00:28:03Don't I look like
00:28:05my mom?
00:28:09Your mouth is moving.
00:28:15What?
00:28:23When something doesn't go his way,
00:28:25my husband lashes out at me right away.
00:28:33Especially when he has been drinking,
00:28:37he can't control himself.
00:28:41Do you have any evidence?
00:28:43I tried to hide it,
00:28:45but I found it.
00:28:51I want to divorce him as soon as possible.
00:28:55Could you create the evidence for me?
00:28:59Do you have any
00:29:01recent videos
00:29:03of your husband, Ryuji?
00:29:05On his social media,
00:29:09there is a video of him drinking
00:29:11with his friends.
00:29:13What an idiot.
00:29:17What?
00:29:19Putting your face out on social media these days
00:29:21is practically suicidal.
00:29:31What?
00:29:35What do you mean?
00:29:45I'm sorry.
00:29:47I can't say
00:29:49you look sad.
00:29:51Is that so?
00:29:55You don't express your feelings well.
00:30:01I see.
00:30:09This video became a hot topic on SNS
00:30:15in 2023 as an example of footage made by AI.
00:30:21Then, in just one year,
00:30:23AI became able to generate video
00:30:27that looks like the real thing.
00:30:31Generative AI was originally developed
00:30:35for text,
00:30:37and it is now used in various fields
00:30:41such as medicine and education.
00:30:43Going forward, video is expected to be
00:30:47one of the fields where its use advances the most.
00:30:53This company in the United States
00:30:57has independently developed technology
00:30:59that generates high-quality video,
00:31:01like a Hollywood movie,
00:31:03on a low budget.
00:31:05It's really trained to generate videos of humans really well, since that's such a big part of creating cinematic content.
00:31:15It's also really great with character emotions, being able to describe shots of people and really capture their emotion.
00:31:35There are challenges
00:31:37such as not having tears.
00:31:41However,
00:31:43these improvements
00:31:45are being updated.
00:31:48There's always more things to improve
00:31:50to get that additional realism.
00:31:52So, in a few years,
00:31:54we're going to be able to get to
00:31:56a feature-length film,
00:31:58so being able to generate a full story
00:32:00over the course of two hours.
00:32:03AI is trying to
00:32:05change our future
00:32:07in a big way.
00:33:37I feel like I've come back to life.
00:33:41The more realistic it gets, the more...
00:33:45Am I the only one who feels that way?
00:33:54Brother.
00:33:57Can you use that?
00:34:00What?
00:34:18Don't laugh.
00:34:19Do it properly.
00:34:30You have been identified.
00:34:34It's a work email.
00:34:36Did you copy it?
00:34:47The one who told my sister to send the customer data
00:34:50was the executive director, wasn't it?
00:34:54I don't know.
00:34:57It's written here, isn't it?
00:35:00Send customer data to the head office in New York.
00:35:05There is other evidence.
00:35:07The executive director sent this e-mail to my sister from New York,
00:35:10and instructed her to attend the remote meeting.
00:35:13My wife recorded the meeting.
00:35:19I'm sorry to keep you waiting.
00:35:21I'm sorry, Sato-kun.
00:35:23Is it okay to send customer data for cancer insurance?
00:35:27Yes.
00:35:28The head office analyzes the characteristics of the patients.
00:35:31Can't you view the Japanese customer data from over there?
00:35:35It can't be accessed from outside the country.
00:35:37Oh, I see.
00:35:39I'll have it sent to the head office.
00:35:42But to send customer data, you need a certificate.
00:35:45I'll do that.
00:35:47Please prepare the customer data.
00:35:49I understand.
00:35:50I'll send you the link to the shared folder the head office uses.
00:35:53Put it in there.
00:35:55Yes, I'll do it right away.
00:36:12After golf?
00:36:14There is a dinner with Mr. Sasaki from 5 p.m.
00:36:17Shabu-shabu, right?
00:36:18Yes, it's our usual place.
00:36:19You like it, don't you, Mr. Sasaki?
00:36:24Why are you here?
00:36:27You're supposed to be in America right now.
00:36:29Oh, I got delayed from work.
00:36:32I'm going to play golf.
00:36:34What?
00:36:35What?
00:36:38Didn't you ask me to send the customer data?
00:36:44Who?
00:36:45I don't know.
00:36:46I don't know.
00:36:50But...
00:36:52It's in the shared folder.
00:36:54Shared folder?
00:36:58What's wrong?
00:37:02You can't access it across the country.
00:37:04Excuse me.
00:37:06Why am I in this?
00:37:08I know nothing about this meeting.
00:37:11Isn't this a deepfake scam?
00:37:16Deepfake?
00:37:19It's a scam.
00:37:36I'll send you the link to the shared folder.
00:37:39Put it in there.
00:37:41Yes, I'll do it right away.
00:37:47What exactly was in the customer data that was sent?
00:37:51It contains basic information such as name and insurance number,
00:37:56as well as information about medical records.
00:38:02If this becomes public, what will it cost us?
00:38:06Based on past data-breach cases,
00:38:10the loss from going public would be...
00:38:14about 100 million yen.
00:38:19100 million yen?
00:38:22That's not all.
00:38:24The cost of responding to customers
00:38:27and dealing with claims
00:38:29would push it even higher.
00:38:34Prepare for the press conference.
00:38:36But...
00:38:37how can we prove this is a deepfake?
00:38:41If we publicize it without knowing anything,
00:38:43we might be attacked.
00:38:45Then what should we do?
00:38:48I'm sorry.
00:38:50Then what should we do?
00:38:57Please give me some time.
00:39:01Please let me investigate
00:39:03who did this and why.
00:39:06What can you do?
00:39:08Please!
00:39:10Please.
00:39:13Two weeks later,
00:39:15we received word that Ms. Sato had died at home.
00:39:20We checked her company computer,
00:39:23but found nothing that pointed to the culprit.
00:39:27That's why we made public only the leak itself.
00:39:33Are you going to just let it go?
00:39:36Then...
00:39:40should I be honest and say
00:39:43your wife was the one who did it?
00:39:48What should we do?
00:39:50If we publicize everything,
00:39:54my sister will be attacked.
00:39:58We can't do what we didn't do.
00:40:01Then...
00:40:02what should we do?
00:40:09I know why Manami came to DeepX.
00:40:14What do you mean?
00:40:15Manami must have been convinced that DeepX made that deepfake.
00:40:20That's your imagination, isn't it?
00:40:24There's no proof.
00:40:32Two weeks later,
00:40:35She's being bullied at school?
00:40:37Yes.
00:40:40I consulted with the school,
00:40:42but they didn't take me seriously.
00:40:55Who bullied you?
00:40:58This girl,
00:41:00this girl,
00:41:02and this girl.
00:41:05Is there anyone who uses SNS?
00:41:10This girl.
00:41:15What's her name?
00:41:17Hinako Kawada.
00:41:19Hinako Kawada
00:41:24So,
00:41:25what do you want?
00:41:28Do you want to make a video of the evidence?
00:41:31Or do you want to get back at her?
00:41:56If you want to get back at her,
00:41:58it's the worst thing you can do.
00:42:11I was just kidding.
00:42:13You shouldn't do that.
00:42:16Sato-san.
00:42:17Yes?
00:42:18Your daughter was in the bullying group, right?
00:42:24She's waiting for you on her phone.
00:42:28Why don't you talk to her?
00:42:35Can I ask you something?
00:42:38I want to ask you about Gai.
00:42:41Yes.
00:42:43Gai can't generate videos
00:42:46of women stripped naked, right?
00:42:50Why can't it?
00:42:56It's not the AI that decides what gets generated.
00:43:01It's humans who decide those rules.
00:43:05So,
00:43:06an app without those rules built in could make them?
00:43:12That's right.
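A minimal sketch of what "humans decide the rules" can mean in practice, assuming a simple keyword check run before any generation call; the keyword list, generate_video(), and handle_request() are hypothetical illustrations, not the actual rules of the Gai app.

```python
# Sketch of "humans decide the rules, not the AI": a hand-written check
# rejects certain prompts before any generation call is made. The keyword
# list and generate_video() are hypothetical, not the real Gai app.

BLOCKED_KEYWORDS = {"naked", "nude", "undress"}  # illustrative only

def is_allowed(prompt: str) -> bool:
    """Return False if the prompt contains any blocked keyword."""
    words = set(prompt.lower().split())
    return not (words & BLOCKED_KEYWORDS)

def generate_video(prompt: str) -> str:
    """Stand-in for the actual model call."""
    return f"video generated for: {prompt}"

def handle_request(prompt: str) -> str:
    # The human-defined rule runs first; blocked prompts never reach the model.
    if not is_allowed(prompt):
        return "request rejected by app policy"
    return generate_video(prompt)

print(handle_request("a dog rolling a ball back to its owner"))
print(handle_request("a naked person on a beach"))
```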
00:43:21Now,
00:43:22in Korea, Japan, and the U.S.,
00:43:26there is a string of deepfake sexual-abuse cases
00:43:29in which images of women are doctored into nudes.
00:43:32The victims are often minors,
00:43:35and the perpetrators are mostly teenagers.
00:43:41Many of these deepfakes
00:43:43are traded over well-known messenger apps.
00:43:50Some sellers even advertise
00:43:54tools that can create the sexual content
00:43:57that mainstream apps prohibit.
00:44:03Pete Nicoletti,
00:44:05a technology leader at an international cybersecurity company.
00:44:11Recently,
00:44:12deepfake transactions have increased
00:44:16in a special internet space called the dark web.
00:44:21And these are the kinds of websites that we find
00:44:24where you can have deepfakes actually created
00:44:27for 30 seconds for 100 bucks.
00:44:29So, this company is out there
00:44:32selling his services to anybody that wants to buy them.
00:44:38The dark web
00:44:39is hard to access unless you use a special browser.
00:44:43It's hard to identify the person who accessed it.
00:44:48Because the dark web lies beyond the reach of regulation,
00:44:53fake videos of famous people and videos made to deceive
00:44:57are also created and traded there.
00:45:00What these hackers are able to do
00:45:02is with just a minute or two of your voice,
00:45:05I can clone your voice.
00:45:08With a video or a picture of you,
00:45:10I can clone a picture of you
00:45:13and put you in any background that I want.
00:45:18Last year,
00:45:19the Hong Kong office of a company headquartered in the UK
00:45:23was targeted by a fraud using deepfakes.
00:45:27A staff member was summoned to a fake remote meeting
00:45:31and instructed to transfer money
00:45:34by someone impersonating an executive.
00:45:38The staff member believed this and sent a large amount of money.
00:45:42The amount was about 4 billion yen in Japanese yen.
00:45:51Deepfake technology can be useful to people,
00:45:55but if it is used for crime,
00:45:57it can even lead people to destruction.
00:46:04Here we are living in a post-real society
00:46:08where we're living with real people
00:46:10and we're also having to deal with
00:46:13artificial-intelligence-created deepfakes.
00:46:36Wait a minute.
00:46:37Didn't you fail because you put in too much last time?
00:46:41Did I?
00:46:43Why are you ignoring the recipe?
00:46:47It's not fun to follow the recipe.
00:46:50I'm home.
00:46:51Welcome back.
00:46:52What are you doing?
00:46:54I'm making a cake with my mom.
00:47:02A little more.
00:47:03A little more?
00:47:07It's almost done.
00:47:08Let's stop.
00:47:10What are you doing?
00:47:13I don't want you to depend on that.
00:47:15I don't get it.
00:47:16It's not the real mom that AI made.
00:47:20It doesn't matter.
00:47:21If I think of her as Mom, then she's Mom.
00:47:23You're not a mom!
00:47:27Remi, sit here.
00:47:31Hurry.
00:47:35Sit down.
00:47:38Sit down.
00:47:41You know Aoi Amamiya, right?
00:47:43Your classmate.
00:47:46She came to my office with her mom today.
00:47:50She's being bullied at school.
00:47:53Can you help her?
00:47:55Is it true that you've been bullying her, Remi?
00:47:58No.
00:47:59I'm not bullying anyone.
00:48:00But she said she was being bullied.
00:48:03That's a lie.
00:48:05I'm the one being bullied.
00:48:08That's a lie.
00:48:22What's this?
00:48:23They make a lot of these at school.
00:48:29Why don't you talk to your dad about this?
00:48:32But he didn't want to listen to me.
00:48:36That's why I told my mom.
00:48:38It's not fair to tell her now.
00:48:41If it was my mom, she would have believed me.
00:48:52So it's the other way around?
00:48:54It seems Amamiya is the one doing the bullying.
00:48:58Any proof?
00:49:02She came to my office with her mom today.
00:49:06She said she was being bullied.
00:49:09I see.
00:49:12But you should stay out of this case.
00:49:17It's not good for you to get involved.
00:49:22Okay.
00:49:24These kids are scary.
00:49:28They make a lot of these at school.
00:49:47Why are you being bullied?
00:49:54You made this, didn't you?
00:49:56You should pay me back.
00:50:01Stop it!
00:50:02Stop it!
00:50:04Stop it!
00:50:05Stop it!
00:50:09Look!
00:50:27Hello?
00:50:30Yes, I'm always here for you.
00:50:37Yes, a video?
00:50:42Stop it.
00:50:43I'm waiting for you.
00:50:45I don't want to.
00:50:47Stop it.
00:50:49Bully! Bully!
00:50:51Bully! Bully!
00:50:55What should I do?
00:50:56It suits you.
00:50:57What should I do?
00:50:58It suits you.
00:51:10What do you mean?
00:51:11What?
00:51:14You made this video, didn't you?
00:51:19This?
00:51:20Yes.
00:51:21You did, didn't you?
00:51:22I got a call from Amamiya's mom.
00:51:24She said this video had been posted.
00:51:26I told you.
00:51:28My daughter never bullies anyone.
00:51:30You saw the zombie video, didn't you?
00:51:32But it looks like your daughter is the one lying.
00:51:37What?
00:51:38This video was made by Gai.
00:51:42The video was made by Gai.
00:51:44There are hidden metadata in the video.
00:51:47If you use this analysis software, you can find out who made it.
00:51:51Analysis software?
00:51:53Only a few people in the company have it.
00:51:57Recently, many people have asked me to find out if it's a deepfake.
00:52:05Isn't this Sato-san's serial number?
00:52:18No way.
00:52:19You didn't make it together, did you?
00:52:22No.
00:52:38I'm sorry for messing with your computer.
00:52:44But...
00:52:47It's true that I'm being bullied.
00:52:50Why did you make such a zombie-like video?
00:52:54Because I can't show you the video that she made.
00:53:05The actual video that was used to bully Remi...
00:53:14is a video of Remi naked.
00:53:19Remi...
00:53:41I'm not naked.
00:53:44What? Are you naked?
00:53:47No matter how hard I try...
00:53:54I can't replace Manami.
00:54:01No.
00:54:02This is no time to be saying that.
00:54:05You're the only one she has.
00:54:07I don't know.
00:54:12Today...
00:54:14You look cool.
00:54:18Are you trying to comfort me?
00:54:20I found out.
00:54:21You found out?
00:54:25At a time like this, you should put more effort into it.
00:54:33Okay, I learned.
00:54:38If only Manami were still alive.
00:54:43She's alive.
00:54:45What?
00:54:47Look.
00:54:49She's right here.
00:54:56But...
00:54:58If you use the analysis software, you'll know she's an A.I.
00:55:02Analysis?
00:55:04Analysis software?
00:55:05If you use it, you'll know she's an A.I.
00:55:10I'm scared.
00:55:21What's wrong?
00:55:35Manami?
00:56:06What?
00:56:12Mr. Sato?
00:56:14Is that Mr. Tsubokura?
00:56:16Yes.
00:56:18He asked me to put in the data.
00:56:21That's a lot of data.
00:56:55Sato-san asked me to load data onto Tsubokura-san's computer.
00:57:00Really?
00:57:12Excuse me.
00:57:14Can I ask you something?
00:57:19Later.
00:57:20Later.
00:57:35Later.
00:57:43What are you doing on my computer?
00:57:53It was you.
00:57:55What?
00:57:58You tricked my wife.
00:57:59I'm sorry.
00:58:04I'm sorry.
00:58:07I see.
00:58:08He's Mr. Sato.
00:58:14Excuse me.
00:58:25Is he doing well?
00:58:29Yes.
00:58:36My wife committed suicide.
00:58:40Suicide?
00:58:43I didn't know.
00:58:48Why did you trick her?
00:58:51I didn't mean to.
00:58:54I happened to get hold of one of its employees' details.
00:58:57That kind of company information is traded all over the internet.
00:59:02Eiji Katahira.
00:59:07With information like that out there, it's easy to create a deepfake of someone.
00:59:21I see.
00:59:23Can you send it to me?
00:59:25Yes.
00:59:27But I need the employee ID to send the customer data.
00:59:30I'll do that.
00:59:32Please prepare the customer data.
00:59:34Okay.
00:59:36I'll send you the link of the shared folder that the company uses.
00:59:39Put it in there.
00:59:40Okay.
00:59:41I'll do it right away.
00:59:46She came here, didn't she?
00:59:49Yes.
00:59:50I was surprised, too.
00:59:52Your wife suspected this company.
00:59:56She asked for our help.
01:00:03Can you tell whether this executive is a deepfake?
01:00:09I can look into it.
01:00:12Really?
01:00:13Yes.
01:00:16I'll also look into the lost customer data.
01:00:23Please.
01:00:24Okay.
01:00:30I'm sorry to say this.
01:00:34I analyzed the video of the remote meeting.
01:00:38It was too real.
01:00:40I couldn't tell if it was a fake.
01:00:44I see.
01:00:48I was able to check the customer data on the dark web.
01:00:53The dark web?
01:00:55Yes.
01:00:59The dark web has a website that specializes in personal information.
01:01:04Look.
01:01:06This is the trace of the customer data.
01:01:23Can you get it back?
01:01:26I don't care how much it costs.
01:01:30Please get the customer data back.
01:01:34Please.
01:01:36I'm sorry.
01:01:38Once it's out on the dark web, it can never be recovered.
01:01:43Please.
01:01:54Mr. Sato.
01:01:59Do you know how much personal information insurance companies have?
01:02:07$30 per person.
01:02:09That's the market price on the dark web.
01:02:11If it's for 230,000 people, it's about 1 billion yen in Japan.
01:02:16If you have that money, you can save more people.
01:02:20What's wrong with that?
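The rough arithmetic behind the figure Tsubokura quotes: about $30 per record on the dark web, multiplied by 230,000 leaked records. The exchange rate of roughly 145 yen to the dollar is an assumption for illustration, not a figure given in the program.

```python
# Rough arithmetic behind the quoted figure: about $30 per record on the
# dark web, times 230,000 leaked records. The 145 yen/dollar rate is an
# assumption for illustration, not a figure from the program.

price_per_record_usd = 30
records = 230_000
usd_to_jpy = 145  # assumed exchange rate

total_usd = price_per_record_usd * records   # 6,900,000 dollars
total_jpy = total_usd * usd_to_jpy           # roughly 1.0 billion yen

print(f"{total_usd:,} USD is roughly {total_jpy:,} JPY")
```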
01:02:23What are you doing?
01:02:25If you had returned the customer data, my wife wouldn't have died.
01:02:34I know nothing about who took the data.
01:02:42People are asking for fakes.
01:02:46Fakes can save the weak.
01:02:49You've seen that yourself, haven't you?
01:02:55I'm looking forward to it.
01:02:58You can do that?
01:03:01Do you remember this place?
01:03:03Do you remember this place?
01:03:08To be honest, no one will help you.
01:03:12You can answer that you want help because it's a lie.
01:03:18It's a lie.
01:03:20What's the point of a life saved by a lie?
01:03:26It's a shame.
01:03:28It's a shame.
01:03:30What's the truth?
01:03:33It's a time when people change.
01:03:37I'm turning you over to the police.
01:03:43It's over.
01:03:46You can't deceive me.
01:03:47You can't deceive me.
01:03:55Suspect Tatsuya Tsubokura has just appeared.
01:04:04The police are investigating whether there is any evidence against the suspect.
01:04:15According to investigators, the suspect has stated that he wanted to punish big corporations that exploit ordinary people.
01:04:24The number of frauds using deepfakes is expected to exceed 5,000 this year, with damages the largest on record.
01:04:32If the personal information the suspect obtained has passed to other criminals, there is a risk of further deepfake crimes.
01:04:46How should we respond to this age of Deepfake where everyone becomes a victim?
01:05:02No, you can't.
01:05:05You have to use Artificial Intelligence to detect Artificial Intelligence threats.
01:05:11The development of the technology to detect Deepfake is now being carried out all over the world.
01:05:21This venture company in the United States is one of the companies that is world-famous for detecting Deepfake.
01:05:41Videos and images are loaded into analysis software and examined in detail.
01:05:49Content created by AI is said to show characteristic patterns in color and shape.
01:05:55An analysis AI detects these to judge whether the material is fake.
01:06:01But the evolution of Deepfake is fast.
01:06:06It is said that it is highly likely that things that cannot be detected by analysis will appear in the future.
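A toy sketch of the kind of analysis described here, assuming simple color statistics and a generic classifier; this is illustrative only, not the detection company's actual method, and the image file names are hypothetical placeholders.

```python
# Toy sketch of detection based on color statistics: extract per-channel
# histograms and train a generic classifier on labelled real/fake images.
# Illustrative only; the file names are hypothetical and you would need
# your own labelled examples (1 = AI-generated, 0 = real) to run it.

import numpy as np
from PIL import Image
from sklearn.linear_model import LogisticRegression

def color_histogram(path: str, bins: int = 16) -> np.ndarray:
    """Normalised per-channel color histogram used as a feature vector."""
    img = np.asarray(Image.open(path).convert("RGB"))
    feats = []
    for channel in range(3):
        hist, _ = np.histogram(img[..., channel], bins=bins, range=(0, 255))
        feats.append(hist / hist.sum())
    return np.concatenate(feats)

training_files = [
    ("real_photo_1.jpg", 0), ("real_photo_2.jpg", 0),
    ("generated_1.png", 1), ("generated_2.png", 1),
]

X = np.array([color_histogram(path) for path, _ in training_files])
y = np.array([label for _, label in training_files])

clf = LogisticRegression().fit(X, y)

# Probability that a new image is AI-generated, according to this toy model.
print(clf.predict_proba([color_histogram("suspect_frame.png")])[0, 1])
```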
01:06:30Regulate it! Regulate it!
01:06:37The demand for the regulation of Deepfake is increasing all over the world.
01:06:44In South Korea, in September last year,
01:06:47a law was passed banning even the viewing and possession of sexual content created with deepfakes.
01:06:56The EU also plans to strictly regulate deepfake content.
01:07:07And in Japan, moves toward legislation are now underway.
01:07:25Now with most of the world online, there's no confidence in anything we see.
01:07:29So I think that the advantage actually is for us.
01:07:32We live in a world where we actually believed things before, and now they're changing.
01:07:56Are you going to get arrested?
01:07:59No, it's okay. I'm just going to be interrogated for a few days.
01:08:03Don't worry.
01:08:09Thank you so much, Yoko.
01:08:15I'm counting on you.
01:08:17Yes.
01:09:19Let's go, Remi.
01:08:21See you.
01:08:25See you.
01:08:30Hey.
01:08:32Today's your birthday, right?
01:08:35Yes, that's right.
01:08:38I'll make you a cake when I get back.
01:08:55I'll make you a cake when I get back.
01:09:10Hey.
01:09:13What's wrong?
01:09:15Are you depressed again?
01:09:17I was going to say goodbye to you today.
01:09:22Huh?
01:09:24I want to face Remi properly.
01:09:29I can't be a parent the way you were,
01:09:32but I want to face her with everything I have.
01:09:42What do you mean?
01:09:43Unless I learn to live without Manami, neither Remi nor I can move forward.
01:10:05I'm right here.
01:10:08Remi believed me.
01:10:10Why are you saying that?
01:10:19Manami.
01:10:26You're dead.
01:10:41I'm sorry.
01:10:44I'm sorry.
01:10:48I'm really sorry.
01:11:00Goodbye, Manami.
01:11:10I'm really sorry.
01:11:14I'm really sorry.
01:11:40I'm sorry.
01:11:46Mom, hi.
01:11:49Why are you saying that?
01:11:51It's obvious.
01:11:53Remi, that's...
01:11:59It's a secret from dad.
01:12:10To be continued.
01:12:13Hey, everyone.
01:12:15I have a favor to ask you.
01:12:18Is that e-mail really okay?
01:12:21"Identity verification is required to keep using this service."
01:12:25"We need to confirm your personal information."
01:12:28Be careful of links included in e-mails like these.
01:12:31Stop fraud.
01:12:33We won't be fooled!
01:12:35I'm Mirei Kusanagi.
01:12:37Meow meow.
01:12:40Meow meow.
01:12:44Meow meow.
01:12:46Meow meow.
01:12:48Meow meow.
01:13:11NHK special, Mikan no Baton.
01:13:15What?
01:13:17How will the next 100 years be lived from the appearance of people who continue to struggle?
01:13:21March 22nd, 10 p.m.
01:13:28Fujii Kaze, 5th anniversary.
01:13:32Looking back at the 5 years of NHK's record.
01:13:41VTuber special, 1 on 1 presents.
01:13:46Your host, Hoshimachi Suisei.
01:13:50Broadcast on March 22nd, 11pm.
01:13:56NHK, spring recommendation.
01:13:59Broadcast on April 5th, 7.30pm.
01:14:03The new series, Isejin-gu, Iseji.
01:14:07Broadcast on April 2nd, 7.57pm.
01:14:14Broadcast on April 5th, 9am.
01:14:21NHK, this spring.
01:14:26Humanience, the theme is water.
01:14:29In fact, water is a special substance.
01:14:31That's what keeps us alive.
01:14:33I didn't even think it was moving with water.
01:14:41100 years, Ikkimi TV.
01:14:44100 years of bento boxes seen in NHK's archive.
01:14:49There are the feelings of Japanese people.
01:14:52I have a lot of memories in bento boxes.
