
Deepfake porn crisis batters South Korean schools

JapanToday

Sotokanda S Bldg. 4F

5-2-1 Sotokanda

Chiyoda-ku

Tokyo 101-0021

Japan

Tel: +81 3 5829 5900

Fax: +81 3 5829 5919

Email: editor@japantoday.com

©2024

GPlusMedia Inc.

After South Korean authorities uncovered a sprawling network of AI deepfake porn Telegram chatrooms targeting schools and universities, teenage activist Bang Seo-yoon began collecting testimony of abuse from victims.

Many of the cases she documented followed the same pattern: schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chatrooms, specifically to humiliate female classmates — or even teachers.

Super-wired South Korea, with the world’s fastest average internet speeds, has long battled sexual cyber violence, but experts say a toxic combination of Telegram, AI tech and lax laws has supercharged the issue — and it is tearing through the country’s schools.

“It’s not just the harm caused by the deepfake itself, but the spread of those videos among acquaintances that is even more humiliating and painful,” Bang, 18, told AFP.

She has received thousands of reports from devastated victims since authorities in August found the first such Telegram chatrooms, typically set up within a school or university to prey on female students and staff. Most perpetrators are teens, police say.

Deepfake prevalence is increasing exponentially worldwide, industry data shows, up 500 percent year-on-year in 2023, cybersecurity startup Security Hero estimates, with 99 percent of victims women — typically famous singers and actresses.

But while celebrities have powerful backers to protect them — the K-pop agency behind girl band NewJeans recently took legal action against deepfake porn — many ordinary victims are struggling to get justice, activists say.

Prosecution rates are woeful: between 2021 and July this year, 793 deepfake crimes were reported but only 16 people were arrested and prosecuted, according to police data obtained by a lawmaker.

After news of the chatrooms spread, complaints surged, with 118 cases reported in just five days in late August, and seven people arrested amid a police crackdown.
But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions, as South Korean courts rarely issue arrest warrants for minors.

The chatrooms, several of which AFP attempted to join before being removed by moderators, have lewd names such as “the lonely masturbator” and rules requiring members to post photos of women they wish to see “punished”.

Victims find themselves “sexually insulted and mocked by their classmates in online spaces”, Kang Myeong-suk, head of victim support at the Women’s Human Rights Institute of Korea, told AFP.

“But the perpetrators often face no consequences,” she said, adding that victims now “live in fear of where their manipulated images might be distributed by those around them”.

“Some online comments say the victims should ‘get over it’ as these deepfake images are not even real,” Kang said. “But just because manipulated images aren’t real doesn’t mean the pain the victims endure is any less genuine.”

While overall crime rates in South Korea are generally low, the country has long suffered from an epidemic of spy-cam crimes, which led to major protests in 2018 inspired by the global #MeToo movement, eventually forcing lawmakers to strengthen laws.

Even so, “the penalties issued are often trivial, like fines or probation, which are disproportionate to the gravity of the offenses”, Professor Yoon Kim Ji-young told AFP.

There have also been Telegram porn scandals before, most notably in 2020, when a group blackmailing women and girls into making sexual content for paid chatrooms was uncovered. The ringleader was jailed.

But things have not improved. President Yoon Suk Yeol’s dismissive views on feminism — which he has blamed for the country’s low birthrate — have signalled to men that it is “okay to be hostile or discriminatory towards women”, Yoon Kim said.

South Korean police blame low prosecution rates on Telegram, which is famed for its reluctance to cooperate with authorities.
Its founder was recently arrested in France for failing to curb illegal content on the app.

But one victim of a 2021 deepfake porn incident told AFP that this was no excuse — many victims manage to identify their attackers themselves simply by determined sleuthing.

The victim, who requested anonymity, said it had been a “huge trauma” to bring her assailant to justice after she was attacked in 2021 with a barrage of Telegram messages containing deepfake images showing her being sexually assaulted.

Her attacker was a fellow student at the prestigious Seoul National University, whom she had rarely interacted with but had always thought was “gentle”.

“It was hard to accept,” she said, adding that police required her to collect all the evidence herself, and she then had to lobby hard for a trial, which is now ongoing.

“The world I thought I knew completely collapsed,” she said in a letter she plans to submit to the court on September 26. “No one should be treated as an object or used as a means to compensate for the inferiority complexes of individuals like the defendant, simply because they are women.”

Reader comments

The recent arrest of Pavel Durov shows that South Korea isn’t the only country struggling with Telegram-related issues. The app really needs to step up its act so that things like this don’t happen. Freedom of expression is one thing. Child pornography is another.

> specifically to humiliate female classmates — or even teachers.

Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. Men/boys have always wanted to see women/girls. It is how we are built/created. Steve: “I’m a bloke. You bent over, I looked. Shoot me.” And Steve: “It is the four pillars of the male heterosexual psyche. We like: naked women, stockings, lesbians, and Sean Connery best as James Bond, because that is what being a boy is.”
And Steve: “...that does not stop me wanting to see several thousand more naked bottoms before I die, because that’s what being a bloke is. When man invented fire, he didn’t say, ‘Hey, let’s cook.’ He said, ‘Great, now we can see naked bottoms in the dark.’ As soon as Caxton invented the printing press, we were using it to make pictures of, hey, naked bottoms! We have turned the Internet into an enormous international database of naked bottoms. So you see, the story of male achievement through the ages, feeble though it may have been, has been the story of our struggle to get a better look at your bottoms.” None of this should surprise anyone.

What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made. Not too cool.

> What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made.

What is “sexualizing” about five fully clothed young women?

> Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples.

There are countless ways for people to see nudes; if this were only about that, there would be no need to deepfake anything. These cases are about faking nudes of specific people with the purpose of humiliating them as well: the article clearly describes how the spaces are said to be used to “punish” the victims, rather than for private enjoyment; the criminals are sharing the fakes for everybody to see. The famous quote attributed to Michael Cunningham still applies: “Everything is about sex except sex. Sex is about power.”

> What is “sexualizing” about five fully clothed young women?

Hmmm... I suppose you think that only “nudes” are the definition of “sexualization”. Also, it is obvious you are oblivious to who these young women are.
They are a rather famous K-pop group called NewJeans, and their “image” is that of the “girl next door”. You really don’t understand the meaning and intent of the word “sexualization” in that context if you think it only refers to a nude or semi-nude person. I hope you learned something here.

> schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chatrooms

You can see people easily identifiable in random crowd shots at sites like this. South Korea is the canary in the coal mine with its hyper-digitization, but this is a worldwide situation. A digital bill of rights is needed. In this late-stage-capitalism society, the only way people will be protected is if it is prohibitively expensive to use their personal data, like images and writing, and if people are well rewarded when it is used.

This is pretty scary, and surely the same problems will crop up everywhere.

> But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors.

Maybe it is time to rethink that policy about teenage perps.

In his statement after release from custody in France, Pavel Durov wrote:

> All of that does not mean Telegram is perfect. Even the fact that authorities could be confused by where to send requests is something that we should improve. But the claims in some media that Telegram is some sort of anarchic paradise are absolutely untrue. We take down millions of harmful posts and channels every day. We publish daily transparency reports. We have direct hotlines with NGOs to process urgent moderation requests faster.

> However, we hear voices saying that it’s not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform. That’s why I made it my personal goal to ensure we significantly improve things in this regard.
> We’ve already started that process internally, and I will share more details on our progress with you very soon. I hope that the events of August will result in making Telegram — and the social networking industry as a whole — safer and stronger.

There should be a way for Korean authorities to contact Telegram to have the content taken down. That would be the logical way to tackle the problem: have a hotline set up so these channels can be taken down as soon as they’re discovered. I’m sure the government of Korea would be able to negotiate such a hotline, and set up a special police section to monitor Telegram for these obscene channels.

> What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made.

The picture is not irrelevant or meant to sexualize the issue. If you read the caption on it, you’ll know they’re the K-pop group NewJeans, who were victims of deepfake porn and took legal action against it.

South Korea is not the only country with a porn problem.

> Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. Men/boys have always wanted to see women/girls. It is how we are built/created.

Speak for yourself.

Hope this deepfake stuff doesn’t take hold in Japanese schools. Sadly, with Japanese children being far more naive than most, I would hazard a guess that this type of thing would be more believable there.

> As soon as Caxton invented the printing press...

And I have to correct this: William Caxton did not invent the printing press. I thought everyone knew it was Johannes Gutenberg, many years earlier.

I get that it isn’t socially acceptable. That isn’t being disputed. Countries are still trying to decide whether AI-generated images are porn or not. Are AI-generated images of child pornography illegal or not? Hopefully, that will be determined to be true.
Also, converting a harmless image/video into porn without a written release by the subject should be illegal. There’s little need to say any of that; the only people who will disagree are the people creating the images/videos. The laws need to catch up with technology, while not being so restrictive that the freedom to take photos of people and things in public is lost. The crime happens when the images/videos are transformed, whether they are shared or not.

> And I have to correct this: William Caxton did not invent ...

Those were all quotes from a comedy TV show called “Coupling” that was popular in the early 2000s in the UK. It wasn’t a statement of fact; rather, it was a verbatim rant from a TV character. Sorry that you didn’t catch the reference. It seemed apropos to me.

Good deepfakes have been around for about a decade. It is only in the last three years that commercialization and very easy-to-use websites have existed that let unsophisticated users convert normal images, with a text description, into socially unacceptable images. There are cell phone apps to upload and transform images; all that is required is a checkbox saying you have permission from the owner of the photo to use it. You don’t even need a login on many of those sites for initial trials that make low-resolution images, and low resolution is fine for phone viewing or websites. Posting them on the web, even in encrypted groups, has been possible for over 40 years. Remember Usenet? Again, it is just the technology that has changed. Usenet was a key part of the internet for many decades; it had great and terrible uses, just like all technology. Our laws need to be updated to reflect what society demands. That’s the point.

https://www.washingtonpost.com/technology/interactive/2024/ai-bias-beautiful-women-ugly-images/

GaijinPot Blog

GaijinPot Blog

GaijinPot Travel

Savvy Tokyo

Savvy Tokyo

GaijinPot Travel

GaijinPot Events

Savvy Tokyo

GaijinPot Blog

GaijinPot Blog

GaijinPot Blog

GaijinPot Blog

Categorias
! Без рубрики

Deepfake porn crisis batters South Korean schools

JapanToday

Sotokanda S Bldg. 4F

5-2-1 Sotokanda

Chiyoda-ku

Tokyo 101-0021

Japan

Tel: +81 3 5829 5900

Fax: +81 3 5829 5919

Email: editor@japantoday.com

©2024

GPlusMedia Inc.

After South Korean authorities uncovered a sprawling network of AI deepfake porn Telegram chatrooms targeting schools and universities, teenage activist Bang Seo-yoon began collecting testimony of abuse from victims.Gay porno Many of the cases she documented followed the same pattern: schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chat rooms, specifically to humiliate female classmates — or even teachers. Super-wired South Korea, with the world’s fastest average internet speeds, has long battled sexual cyber violence, but experts say a toxic combination of Telegram, AI tech, and lax laws has supercharged the issue — and it is tearing through the country’s schools. “It’s not just the harm caused by the deepfake itself, but the spread of those videos among acquaintances that is even more humiliating and painful,” Bang, 18, told AFP. She has received thousands of reports from devastated victims since authorities in August found the first such Telegram chatrooms, typically set up within a school or university to prey on female students and staff. Most perpetrators are teens, police say. Deepfake prevalence is increasing exponentially globally, industry data shows, up 500 percent on year in 2023, cybersecurity startup Security Hero estimates, with 99 percent of victims women — typically famous singers and actresses. But while celebrities have powerful backers to protect them — the K-pop agency behind girlband NewJeans recently took legal action against deepfake porn — many ordinary victims are struggling to get justice, activists say. Prosecution rates are woeful: between 2021 and July this year, 793 deepfake crimes were reported but only 16 people were arrested and prosecuted, according to police data obtained by a lawmaker. After news of the chat rooms spread, complaints surged, with 118 cases reported in just five days in late August, and seven people arrested amid a police crackdown. 
But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors. The chatrooms, multiple of which AFP attempted to join before being removed by moderators, have lewd names such as “the lonely masturbator” and rules requiring members to post photos of women they wish to see “punished”. Victims find themselves “sexually insulted and mocked by their classmates in online spaces”, Kang Myeong-suk, head of victim support at the Women’s Human Rights Institute of Korea told AFP. “But the perpetrators often face no consequences,” she said, adding that victims now “live in fear of where their manipulated images might be distributed by those around them”. “Some online comments say the victims should ‘get over it’ as these deepfake images are not even real,” Kang said. “But just because manipulated images aren’t real doesn’t mean the pain the victims endure is any less genuine.” While overall crime rates in South Korea are generally low, the country has long suffered from an epidemic of spy-cam crimes, which led to major protests in 2018 inspired by the global #MeToo movement, eventually forcing lawmakers to strengthen laws. Even so “the penalties issued are often trivial, like fines or probation, which are disproportionate to the gravity of the offenses”, professor Yoon Kim Ji-young told AFP. There have also been Telegram porn scandals before, most notably in 2020 when a group blackmailing women and girls to make sexual content for paid chatrooms was uncovered. The ringleader was jailed. But things have not improved. President Yoon Suk Yeol’s dismissive views on feminism — which he has blamed for the country’s low birthrate — have signalled to men it is “okay to be hostile or discriminatory towards women”, Yoon Kim said. South Korean police blame low prosecution rates on Telegram, which is famed for its reluctance to cooperate with authorities. 
Its founder was recently arrested in France for failing to curb illegal content on the app. But one victim of a 2021 deepfake porn incident told AFP that this was no excuse — many victims manage to identify their attackers themselves simply by determined sleuthing. The victim, who requested anonymity, said it had been a “huge trauma” to bring her assailant to justice after she was attacked in 2021 with a barrage of Telegram messages containing deepfake images showing her being sexually assaulted. Her attacker was a fellow student at the prestigious Seoul National University, who she had rarely interacted with but always thought was “gentle”. “It was hard to accept,” she said, adding police required her to collect all the evidence herself, then she had to lobby hard for a trial, which is now ongoing. “The world I thought I knew completely collapsed,” she said in a letter she plans to submit to the court on September 26. “No one should be treated as an object or used as a means to compensate for the inferiority complexes of individuals like the defendant, simply because they are women.” The recent arrest of Pavel Durov shows that S Korea isn’t the only country struggling with Telegram-related issues. The app really needs to step up its act so that things like this don’t happen. Freedom of expression is one thing. Child pornography is another. specifically to humiliate female classmates — or even teachers. Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. Men/boys have always wanted to see women/girls. It is how we are built/created. Steve : I’m a bloke. You bent over, I looked. Shoot me. and Steve : It is the four pillars of the male heterosexual psyche. We like: naked women, stockings, lesbians, and Sean Connery best as James Bond, because that is what being a boy is. 
and Steve : that does not stop me wanting to see several thousand more naked bottoms before I die, because that’s what being a bloke is. When man invented fire, he didn’t say, “Hey, let’s cook.” He said, “Great, now we can see naked bottoms in the dark.” As soon as Caxton invented the printing press, we were using it to make pictures of, hey, naked bottoms! We have turned the Internet into an enormous international database of naked bottoms. So you see, the story of male achievement through the ages, feeble though it may have been, has been the story of our struggle to get a better look at your bottoms. None of this should surprise anyone. What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made. Not too cool . What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made. What is “sexualizing” about five fully clothed young women? Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. There are countless ways for people to see nudes, if this was only about this there would be no need to deepfake it. These cases are about faking nudes of specific people with the purpose of humiliating them as well, the article clearly describes how the spaces are said to be used to “punish” the victims, instead of private enjoyment that nobody could prove the criminals are sharing the fakes for everybody to see. The famous quote of Michael Cunningham still applies “Everything Is about sex except sex. Sex is about power” What is “sexualizing” about five fully clothed young women? Hmmm….. I suppose you think that only “nudes” are the definition of “sexualization”. Also it is obvious you are oblivious to who these young women are. 
They are a rather famous K-Pop group called “New Jeans” and their “image” is that of the “girl next door”. You really dont understand the meaning and intent of the word “sexualization” in that context if you think it only refers to a nude or semi nude person. I hope you learned something here. schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chat rooms You can see people easily identifiable in random crowd shots at sites like this. SK is the canary in the coal mine with its hyper digitization but this is a worldwide situation. A digital Bill of rights is needed In this late stage capitalism society the only way people will be protected is if it is prohibitively expensive to use their personal data like images and writing. And people are well rewarded if it is used. This is pretty scary, and surely the same problems will crop up everywhere. But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors. Maybe time to rethink that policy about teenage perps. In his statement after release from custody in France, Pavel Durov wrote: All of that does not mean Telegram is perfect. Even the fact that authorities could be confused by where to send requests is something that we should improve. But the claims in some media that Telegram is some sort of anarchic paradise are absolutely untrue. We take down millions of harmful posts and channels every day. We publish daily transparency reports. We have direct hotlines with NGOs to process urgent moderation requests faster. > However, we hear voices saying that it’s not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform. That’s why I made it my personal goal to ensure we significantly improve things in this regard. 
We’ve already started that process internally, and I will share more details on our progress with you very soon. > I hope that the events of August will result in making Telegram — and the social networking industry as a whole — safer and stronger. There should be a way for Korean authorities to contact Telegram to have the content taken down. That would be the logical way to tackle the problem. Have a hotline set up for these channels to be taken down as soon as they’re discovered. I’m the government of Korea would be able to negotiate such a hotline, and set up a special police section to monitor Telegram for these obscene channels. What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made. The picture is not irrelevant or meant to sexualize the issue. If you read the caption on it, you’ll know they’re the K-pop group NewJeans that were victims of deepfake porn and took legal action against it. South Korea not the only country with a porn problem. Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples.  Men/boys have always wanted to see women/girls. It is how we are built/created. Speak for yourself. Hope this deepfake stuff doesn’t take hold in Japanese schools. Sadly, with J-children being far more naive than most, I would hazard to guess that this type of thing would be more believe-able. As soon as Caxton invented the printing press………. And I have to correct this: William Caxton did not invent the printing press. Thought everyone knew it was Johannes Gutenberg, many years earlier. I get that it isn’t socially acceptable. That isn’t being disputed. Countries are still trying to decide if AI generated images are porn or not. Are AI generated images of child pornography illegal or not? Hopefully, that will be determined as true. 
Also, converting a harmless image/video into porn without a written release by the subject should also be illegal. There’s little need to say any of that. The only people who will disagree are the people creating the images/videos. The laws need to catch up with technology, wild not being so restrictive that freedoms to take photos in public of people and things in public are allowed too. The crime happens when the images/videos are transformed, whether they are shared or not. And I have to correct this: William Caxton did not invent … Those were all quotes from a comedy TV show called “Coupling” that was popular in the early 2000s in the UK. It wasn’t any statement of fact. Rather it was a verbatim rant from a TV character. Sorry that you didn’t catch the reference. It seemed apropos to me. Good deepfakes have been around about a decade. It is only in the last 3 yrs that commercialization and very easy to use websites have existed to convert normal images with a description into socially unacceptable images by unsophisticated users. There are apps for cell phones to upload and transform images. Just a checkbox that says you have permission from the owner of the photo to use it is required. You don’t even need a login on many of those sites for initial trials that make low resolution images. Low resolution is fine for phone viewing or websites. Posting them on the web, even with encrypted groups, has been possible for over 40 yrs. Remember usenet? Again, it is just the technology that has changed. Usenet was a key part of the internet for many decades. It had great and terrible uses, just like all technology. Our laws need to be updated to reflect what society demands. That’s the point. https://www.washingtonpost.com/technology/interactive/2024/ai-bias-beautiful-women-ugly-images/ Use your Facebook account to login or register with JapanToday. By doing so, you will also receive an email inviting you to receive our news alerts. 
A treasure trove of adventures awaits in the Green Season, where the northern summer lasts longer and where there’s an activity for everyone in the family. Sponsored by Hilton Niseko Village A mix of what’s trending on our other sites

GaijinPot Blog

GaijinPot Blog

GaijinPot Travel

Savvy Tokyo

Savvy Tokyo

GaijinPot Travel

GaijinPot Events

Savvy Tokyo

GaijinPot Blog

GaijinPot Blog

GaijinPot Blog

GaijinPot Blog

Categorias
! Без рубрики

Deepfake porn crisis batters South Korean schools

JapanToday

Sotokanda S Bldg. 4F

5-2-1 Sotokanda

Chiyoda-ku

Tokyo 101-0021

Japan

Tel: +81 3 5829 5900

Fax: +81 3 5829 5919

Email: editor@japantoday.com

©2024

GPlusMedia Inc.

After South Korean authorities uncovered a sprawling network of AI deepfake porn Telegram chatrooms targeting schools and universities, teenage activist Bang Seo-yoon began collecting testimony of abuse from victims.Gay porno Many of the cases she documented followed the same pattern: schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chat rooms, specifically to humiliate female classmates — or even teachers. Super-wired South Korea, with the world’s fastest average internet speeds, has long battled sexual cyber violence, but experts say a toxic combination of Telegram, AI tech, and lax laws has supercharged the issue — and it is tearing through the country’s schools. “It’s not just the harm caused by the deepfake itself, but the spread of those videos among acquaintances that is even more humiliating and painful,” Bang, 18, told AFP. She has received thousands of reports from devastated victims since authorities in August found the first such Telegram chatrooms, typically set up within a school or university to prey on female students and staff. Most perpetrators are teens, police say. Deepfake prevalence is increasing exponentially globally, industry data shows, up 500 percent on year in 2023, cybersecurity startup Security Hero estimates, with 99 percent of victims women — typically famous singers and actresses. But while celebrities have powerful backers to protect them — the K-pop agency behind girlband NewJeans recently took legal action against deepfake porn — many ordinary victims are struggling to get justice, activists say. Prosecution rates are woeful: between 2021 and July this year, 793 deepfake crimes were reported but only 16 people were arrested and prosecuted, according to police data obtained by a lawmaker. After news of the chat rooms spread, complaints surged, with 118 cases reported in just five days in late August, and seven people arrested amid a police crackdown. 
But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors. The chatrooms, multiple of which AFP attempted to join before being removed by moderators, have lewd names such as “the lonely masturbator” and rules requiring members to post photos of women they wish to see “punished”. Victims find themselves “sexually insulted and mocked by their classmates in online spaces”, Kang Myeong-suk, head of victim support at the Women’s Human Rights Institute of Korea told AFP. “But the perpetrators often face no consequences,” she said, adding that victims now “live in fear of where their manipulated images might be distributed by those around them”. “Some online comments say the victims should ‘get over it’ as these deepfake images are not even real,” Kang said. “But just because manipulated images aren’t real doesn’t mean the pain the victims endure is any less genuine.” While overall crime rates in South Korea are generally low, the country has long suffered from an epidemic of spy-cam crimes, which led to major protests in 2018 inspired by the global #MeToo movement, eventually forcing lawmakers to strengthen laws. Even so “the penalties issued are often trivial, like fines or probation, which are disproportionate to the gravity of the offenses”, professor Yoon Kim Ji-young told AFP. There have also been Telegram porn scandals before, most notably in 2020 when a group blackmailing women and girls to make sexual content for paid chatrooms was uncovered. The ringleader was jailed. But things have not improved. President Yoon Suk Yeol’s dismissive views on feminism — which he has blamed for the country’s low birthrate — have signalled to men it is “okay to be hostile or discriminatory towards women”, Yoon Kim said. South Korean police blame low prosecution rates on Telegram, which is famed for its reluctance to cooperate with authorities. 
Its founder was recently arrested in France for failing to curb illegal content on the app. But one victim of a 2021 deepfake porn incident told AFP that this was no excuse — many victims manage to identify their attackers themselves through determined sleuthing.

The victim, who requested anonymity, said it had been a “huge trauma” to bring her assailant to justice after she was attacked in 2021 with a barrage of Telegram messages containing deepfake images showing her being sexually assaulted.

Her attacker was a fellow student at the prestigious Seoul National University, whom she had rarely interacted with but had always thought was “gentle”.

“It was hard to accept,” she said, adding that police required her to collect all the evidence herself, and that she then had to lobby hard for a trial, which is now ongoing.

“The world I thought I knew completely collapsed,” she said in a letter she plans to submit to the court on September 26. “No one should be treated as an object or used as a means to compensate for the inferiority complexes of individuals like the defendant, simply because they are women.”

The recent arrest of Pavel Durov shows that South Korea isn’t the only country struggling with Telegram-related issues. The app really needs to step up its act so that things like this don’t happen. Freedom of expression is one thing. Child pornography is another.

> specifically to humiliate female classmates — or even teachers.

Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. Men/boys have always wanted to see women/girls. It is how we are built/created.

Steve: I’m a bloke. You bent over, I looked. Shoot me.

and

Steve: It is the four pillars of the male heterosexual psyche. We like: naked women, stockings, lesbians, and Sean Connery best as James Bond, because that is what being a boy is.
and

Steve: That does not stop me wanting to see several thousand more naked bottoms before I die, because that’s what being a bloke is. When man invented fire, he didn’t say, “Hey, let’s cook.” He said, “Great, now we can see naked bottoms in the dark.” As soon as Caxton invented the printing press, we were using it to make pictures of, hey, naked bottoms! We have turned the Internet into an enormous international database of naked bottoms. So you see, the story of male achievement through the ages, feeble though it may have been, has been the story of our struggle to get a better look at your bottoms.

None of this should surprise anyone.

What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made. Not too cool.

> What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made.

What is “sexualizing” about five fully clothed young women?

> Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples.

There are countless ways for people to see nudes; if this were only about that, there would be no need to deepfake anything. These cases are about faking nudes of specific people with the purpose of humiliating them as well: the article clearly describes how the spaces are said to be used to “punish” the victims. This is not private enjoyment that nobody could prove; the criminals are sharing the fakes for everybody to see. The famous quote of Michael Cunningham still applies: “Everything is about sex except sex. Sex is about power.”

> What is “sexualizing” about five fully clothed young women?

Hmmm… I suppose you think that only “nudes” are the definition of “sexualization”. Also, it is obvious you are oblivious to who these young women are.
They are a rather famous K-pop group called NewJeans, and their “image” is that of the “girl next door”. You really don’t understand the meaning and intent of the word “sexualization” in that context if you think it only refers to a nude or semi-nude person. I hope you learned something here.

> schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chat rooms

You can see people easily identifiable in random crowd shots at sites like this. SK is the canary in the coal mine with its hyper-digitization, but this is a worldwide situation. A digital bill of rights is needed. In this late-stage capitalist society, the only way people will be protected is if it is prohibitively expensive to use their personal data, like images and writing, and if people are well rewarded when it is used.

This is pretty scary, and surely the same problems will crop up everywhere.

> But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors.

Maybe time to rethink that policy about teenage perps.

In his statement after release from custody in France, Pavel Durov wrote:

> All of that does not mean Telegram is perfect. Even the fact that authorities could be confused by where to send requests is something that we should improve. But the claims in some media that Telegram is some sort of anarchic paradise are absolutely untrue. We take down millions of harmful posts and channels every day. We publish daily transparency reports. We have direct hotlines with NGOs to process urgent moderation requests faster.

> However, we hear voices saying that it’s not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform. That’s why I made it my personal goal to ensure we significantly improve things in this regard.
> We’ve already started that process internally, and I will share more details on our progress with you very soon. I hope that the events of August will result in making Telegram — and the social networking industry as a whole — safer and stronger.

There should be a way for Korean authorities to contact Telegram to have the content taken down. That would be the logical way to tackle the problem. Have a hotline set up for these channels to be taken down as soon as they’re discovered. I’m sure the government of Korea would be able to negotiate such a hotline, and set up a special police section to monitor Telegram for these obscene channels.

> What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made.

The picture is not irrelevant or meant to sexualize the issue. If you read the caption on it, you’ll know they’re the K-pop group NewJeans, who were victims of deepfake porn and took legal action against it.

South Korea is not the only country with a porn problem.

> Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. Men/boys have always wanted to see women/girls. It is how we are built/created.

Speak for yourself.

Hope this deepfake stuff doesn’t take hold in Japanese schools. Sadly, with J-children being far more naive than most, I would hazard to guess that this type of thing would be more believable.

> As soon as Caxton invented the printing press………

And I have to correct this: William Caxton did not invent the printing press. Thought everyone knew it was Johannes Gutenberg, many years earlier.

I get that it isn’t socially acceptable. That isn’t being disputed. Countries are still trying to decide whether AI-generated images are porn or not. Are AI-generated images of child pornography illegal or not? Hopefully, that will be determined as true.
Also, converting a harmless image/video into porn without a written release by the subject should be illegal. There’s little need to say any of that. The only people who will disagree are the people creating the images/videos. The laws need to catch up with technology, while not being so restrictive that the freedom to take photos of people and things in public is lost. The crime happens when the images/videos are transformed, whether they are shared or not.

> And I have to correct this: William Caxton did not invent …

Those were all quotes from a comedy TV show called “Coupling” that was popular in the early 2000s in the UK. It wasn’t any statement of fact; rather, it was a verbatim rant from a TV character. Sorry that you didn’t catch the reference. It seemed apropos to me.

Good deepfakes have been around for about a decade. It is only in the last three years that commercialization and very easy-to-use websites have let unsophisticated users convert normal images, guided by a text description, into socially unacceptable images. There are apps for cell phones to upload and transform images. Just a checkbox saying you have permission from the owner of the photo is required. You don’t even need a login on many of those sites for initial trials that produce low-resolution images, and low resolution is fine for phone viewing or websites.

Posting them on the web, even within encrypted groups, has been possible for over 40 years. Remember Usenet? Again, it is just the technology that has changed. Usenet was a key part of the internet for decades. It had great and terrible uses, just like all technology. Our laws need to be updated to reflect what society demands. That’s the point.

https://www.washingtonpost.com/technology/interactive/2024/ai-bias-beautiful-women-ugly-images/
Deepfake porn crisis batters South Korean schools

JapanToday

Sotokanda S Bldg. 4F

5-2-1 Sotokanda

Chiyoda-ku

Tokyo 101-0021

Japan

Tel: +81 3 5829 5900

Fax: +81 3 5829 5919

Email: editor@japantoday.com

©2024

GPlusMedia Inc.

Also, converting a harmless image/video into porn without a written release by the subject should also be illegal. There’s little need to say any of that. The only people who will disagree are the people creating the images/videos. The laws need to catch up with technology, wild not being so restrictive that freedoms to take photos in public of people and things in public are allowed too. The crime happens when the images/videos are transformed, whether they are shared or not. And I have to correct this: William Caxton did not invent … Those were all quotes from a comedy TV show called “Coupling” that was popular in the early 2000s in the UK. It wasn’t any statement of fact. Rather it was a verbatim rant from a TV character. Sorry that you didn’t catch the reference. It seemed apropos to me. Good deepfakes have been around about a decade. It is only in the last 3 yrs that commercialization and very easy to use websites have existed to convert normal images with a description into socially unacceptable images by unsophisticated users. There are apps for cell phones to upload and transform images. Just a checkbox that says you have permission from the owner of the photo to use it is required. You don’t even need a login on many of those sites for initial trials that make low resolution images. Low resolution is fine for phone viewing or websites. Posting them on the web, even with encrypted groups, has been possible for over 40 yrs. Remember usenet? Again, it is just the technology that has changed. Usenet was a key part of the internet for many decades. It had great and terrible uses, just like all technology. Our laws need to be updated to reflect what society demands. That’s the point. https://www.washingtonpost.com/technology/interactive/2024/ai-bias-beautiful-women-ugly-images/ Use your Facebook account to login or register with JapanToday. By doing so, you will also receive an email inviting you to receive our news alerts. 
But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions, as South Korean courts rarely issue arrest warrants for minors. The chatrooms, several of which AFP attempted to join before being removed by moderators, have lewd names such as “the lonely masturbator” and rules requiring members to post photos of women they wish to see “punished”. Victims find themselves “sexually insulted and mocked by their classmates in online spaces”, Kang Myeong-suk, head of victim support at the Women’s Human Rights Institute of Korea, told AFP. “But the perpetrators often face no consequences,” she said, adding that victims now “live in fear of where their manipulated images might be distributed by those around them”. “Some online comments say the victims should ‘get over it’ as these deepfake images are not even real,” Kang said. “But just because manipulated images aren’t real doesn’t mean the pain the victims endure is any less genuine.” While overall crime rates in South Korea are generally low, the country has long suffered from an epidemic of spy-cam crimes, which led to major protests in 2018 inspired by the global #MeToo movement, eventually forcing lawmakers to strengthen laws. Even so, “the penalties issued are often trivial, like fines or probation, which are disproportionate to the gravity of the offenses”, professor Yoon Kim Ji-young told AFP. There have also been Telegram porn scandals before, most notably in 2020, when a group blackmailing women and girls into making sexual content for paid chatrooms was uncovered. The ringleader was jailed, but things have not improved. President Yoon Suk Yeol’s dismissive views on feminism — which he has blamed for the country’s low birthrate — have signalled to men that it is “okay to be hostile or discriminatory towards women”, Yoon Kim said. South Korean police blame low prosecution rates on Telegram, which is famed for its reluctance to cooperate with authorities.

GaijinPot Blog

GaijinPot Blog

GaijinPot Travel

Savvy Tokyo

Savvy Tokyo

GaijinPot Travel

GaijinPot Events

Savvy Tokyo

GaijinPot Blog

GaijinPot Blog

GaijinPot Blog

GaijinPot Blog

But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors.

The chatrooms, several of which AFP attempted to join before being removed by moderators, have lewd names such as “the lonely masturbator” and rules requiring members to post photos of women they wish to see “punished”.

Victims find themselves “sexually insulted and mocked by their classmates in online spaces”, Kang Myeong-suk, head of victim support at the Women’s Human Rights Institute of Korea, told AFP. “But the perpetrators often face no consequences,” she said, adding that victims now “live in fear of where their manipulated images might be distributed by those around them”.

“Some online comments say the victims should ‘get over it’ as these deepfake images are not even real,” Kang said. “But just because manipulated images aren’t real doesn’t mean the pain the victims endure is any less genuine.”

While overall crime rates in South Korea are generally low, the country has long suffered from an epidemic of spy-cam crimes, which led to major protests in 2018 inspired by the global #MeToo movement, eventually forcing lawmakers to strengthen laws. Even so, “the penalties issued are often trivial, like fines or probation, which are disproportionate to the gravity of the offenses”, professor Yoon Kim Ji-young told AFP.

There have also been Telegram porn scandals before, most notably in 2020, when a group blackmailing women and girls into making sexual content for paid chatrooms was uncovered. The ringleader was jailed, but things have not improved. President Yoon Suk Yeol’s dismissive views on feminism — which he has blamed for the country’s low birthrate — have signalled to men that it is “okay to be hostile or discriminatory towards women”, Yoon Kim said.

South Korean police blame low prosecution rates on Telegram, which is famed for its reluctance to cooperate with authorities. Its founder was recently arrested in France for failing to curb illegal content on the app.

But one victim of a 2021 deepfake porn incident told AFP that this was no excuse — many victims manage to identify their attackers themselves simply by determined sleuthing. The victim, who requested anonymity, said it had been a “huge trauma” to bring her assailant to justice after she was attacked in 2021 with a barrage of Telegram messages containing deepfake images showing her being sexually assaulted.

Her attacker was a fellow student at the prestigious Seoul National University whom she had rarely interacted with but had always thought was “gentle”. “It was hard to accept,” she said, adding that police required her to collect all the evidence herself, after which she had to lobby hard for a trial, which is now ongoing.

“The world I thought I knew completely collapsed,” she said in a letter she plans to submit to the court on September 26. “No one should be treated as an object or used as a means to compensate for the inferiority complexes of individuals like the defendant, simply because they are women.”

The recent arrest of Pavel Durov shows that S Korea isn’t the only country struggling with Telegram-related issues. The app really needs to step up its act so that things like this don’t happen. Freedom of expression is one thing. Child pornography is another.

> specifically to humiliate female classmates — or even teachers.

Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. Men/boys have always wanted to see women/girls. It is how we are built/created.

Steve: I’m a bloke. You bent over, I looked. Shoot me.

and

Steve: It is the four pillars of the male heterosexual psyche. We like: naked women, stockings, lesbians, and Sean Connery best as James Bond, because that is what being a boy is.

and

Steve: That does not stop me wanting to see several thousand more naked bottoms before I die, because that’s what being a bloke is. When man invented fire, he didn’t say, “Hey, let’s cook.” He said, “Great, now we can see naked bottoms in the dark.” As soon as Caxton invented the printing press, we were using it to make pictures of, hey, naked bottoms! We have turned the Internet into an enormous international database of naked bottoms. So you see, the story of male achievement through the ages, feeble though it may have been, has been the story of our struggle to get a better look at your bottoms.

None of this should surprise anyone.

What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made. Not too cool.

> What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made.

What is “sexualizing” about five fully clothed young women?

> Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples.

There are countless ways for people to see nudes; if this were only about that, there would be no need to deepfake anything. These cases are about faking nudes of specific people with the purpose of humiliating them: the article clearly describes how the spaces are used to “punish” the victims. Far from keeping the fakes for private enjoyment that nobody could prove, the criminals are sharing them for everybody to see. The famous quote of Michael Cunningham still applies: “Everything is about sex except sex. Sex is about power.”

> What is “sexualizing” about five fully clothed young women?

Hmmm… I suppose you think that only “nudes” fit the definition of “sexualization”. Also, it is obvious you are oblivious to who these young women are. 
They are a rather famous K-pop group called “NewJeans”, and their “image” is that of the “girl next door”. You really don’t understand the meaning and intent of the word “sexualization” in that context if you think it only refers to a nude or semi-nude person. I hope you learned something here.

> schoolboys steal innocuous selfies from private Instagram accounts and create explicit images to share in the chat rooms

You can see people easily identifiable in random crowd shots at sites like this. SK is the canary in the coal mine with its hyper-digitization, but this is a worldwide situation. A digital bill of rights is needed. In this late-stage-capitalism society, the only way people will be protected is if it is prohibitively expensive to use their personal data, like images and writing, and if people are well rewarded when it is used.

This is pretty scary, and surely the same problems will crop up everywhere.

> But six out of seven alleged perpetrators were teenagers, police say, which complicates prosecutions as South Korean courts rarely issue arrest warrants for minors.

Maybe time to rethink that policy about teenage perps.

In his statement after release from custody in France, Pavel Durov wrote:

> All of that does not mean Telegram is perfect. Even the fact that authorities could be confused by where to send requests is something that we should improve. But the claims in some media that Telegram is some sort of anarchic paradise are absolutely untrue. We take down millions of harmful posts and channels every day. We publish daily transparency reports. We have direct hotlines with NGOs to process urgent moderation requests faster.

> However, we hear voices saying that it’s not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform. That’s why I made it my personal goal to ensure we significantly improve things in this regard. We’ve already started that process internally, and I will share more details on our progress with you very soon.

> I hope that the events of August will result in making Telegram — and the social networking industry as a whole — safer and stronger.

There should be a way for Korean authorities to contact Telegram to have the content taken down. That would be the logical way to tackle the problem. Have a hotline set up for these channels to be taken down as soon as they’re discovered. I’m sure the government of Korea would be able to negotiate such a hotline, and set up a special police section to monitor Telegram for these obscene channels.

> What is the point of the picture of the young women at the bottom? Further sexualizing an already difficult social issue belittles the point attempting to be made.

The picture is not irrelevant or meant to sexualize the issue. If you read the caption on it, you’ll know they’re the K-pop group NewJeans, who were victims of deepfake porn and took legal action against it.

South Korea is not the only country with a porn problem.

> Doubtful. They just want to see girl-bits, like most boys/men do. Approximated or real, doesn’t matter. Just look at the upskirt photo problem in Japan for examples. Men/boys have always wanted to see women/girls. It is how we are built/created.

Speak for yourself.

Hope this deepfake stuff doesn’t take hold in Japanese schools. Sadly, with J-children being far more naive than most, I would hazard to guess that this type of thing would be more believable.

> As soon as Caxton invented the printing press……….

And I have to correct this: William Caxton did not invent the printing press. I thought everyone knew it was Johannes Gutenberg, many years earlier.

I get that it isn’t socially acceptable. That isn’t being disputed. Countries are still trying to decide whether AI-generated images are porn or not. Are AI-generated images of child pornography illegal or not? Hopefully the answer will turn out to be yes. Also, converting a harmless image/video into porn without a written release by the subject should be illegal. There’s little need to say any of that; the only people who will disagree are the people creating the images/videos. The laws need to catch up with technology, while not being so restrictive that the freedom to take photos of people and things in public is lost. The crime happens when the images/videos are transformed, whether they are shared or not.

> And I have to correct this: William Caxton did not invent …

Those were all quotes from a comedy TV show called “Coupling” that was popular in the early 2000s in the UK. It wasn’t a statement of fact; rather, it was a verbatim rant from a TV character. Sorry that you didn’t catch the reference. It seemed apropos to me.

Good deepfakes have been around for about a decade. It is only in the last three years that commercialization and very easy-to-use websites have made it possible for unsophisticated users to turn a normal image plus a description into socially unacceptable images. There are apps for cell phones to upload and transform images. All that is required is a checkbox saying you have permission from the owner of the photo to use it. You don’t even need a login on many of those sites for initial trials that produce low-resolution images, and low resolution is fine for phone viewing or websites.

Posting them on the net, even within encrypted groups, has been possible for over 40 years. Remember Usenet? Again, it is just the technology that has changed. Usenet was a key part of the internet for decades. It had great and terrible uses, just like all technology. Our laws need to be updated to reflect what society demands. That’s the point.

https://www.washingtonpost.com/technology/interactive/2024/ai-bias-beautiful-women-ugly-images/ 