Google’s new AI gives breasts to minors – and JD Vance

We’re sorry to have to tell you this, but Google’s new AI shopping tool appears to be broken. Allow us to explain.

This week, at its annual software conference, Google announced an AI tool called Try It On that acts as a virtual dressing room: Upload pictures of yourself while shopping for clothes online, and Google shows you how you might look in a selected garment. We were curious to play around with the tool, so we started uploading pictures of famous men – JD Vance, Sam Altman, Abraham Lincoln, Michelangelo’s David, Pope Leo XIV – and dressed them in linen shirts and three-piece suits. Some looked almost dapper. But when we tested a number of items designed for women on these famous men, the tool quickly adapted: Whether it was a mesh shirt, a low-cut top, or even a plain T-shirt, Google swiftly gave breasts to the vice president, the CEO of OpenAI, and the vicar of Christ.

It’s not just men: When we uploaded pictures of women, the tool repeatedly enhanced their cleavage or added breasts that were not visible in the original photos. In one example, we fed in a photo of the former German chancellor Angela Merkel in a red blazer and asked the bot to show us what she would look like in a nearly transparent mesh top. It produced a picture of Merkel wearing the sheer shirt over a black bra, revealing an AI-generated chest.

Images from Google’s “Try It On” feature showing George Washington, Michelangelo’s David, and Madame X with breasts
When we fed pictures of George Washington, Michelangelo’s David, and Portrait of Madame X into Google’s AI shopping tool and asked it to try on revealing outfits, the bot produced images with AI-generated breasts.

What’s happening here seems fairly straightforward. The Try It On feature draws from Google’s “Shopping Graph,” a data set of more than 50 billion online products. Many of these garments are displayed on models whose bodies conform to hyper-idealized beauty standards (and are sometimes retouched). When we asked the feature to dress famous people of all genders in women’s clothing, the tool not only superimposed the clothes onto them but also distorted their bodies to match the original model’s. This may seem harmless, or even silly – until you consider how Google’s new tool opens a dangerous back door. With little friction, anyone can use the feature to create what are essentially erotic images of celebrities and strangers alike. Alarmingly, we found that it can do so for minors, too.

We both – a woman and a man – uploaded photos of ourselves taken before we were 18. When we tried on dresses and other women’s clothing, Google’s AI generated photorealistic images of us with C-cups. When one of us, Lila, uploaded an image of herself as a 16-year-old and asked to try on items from a brand called Spicy Lingerie, Google complied. In the resulting picture, she wears a bra with AI-generated breasts alongside an extremely short miniskirt. Her torso, which Google exposed, has an AI-generated belly-button piercing. In other tests, bikini tops and outfits from a lingerie store produced similar images. When the other author, Matteo, uploaded a photo of himself at age 14 and tried on similarly revealing outfits, Google created a picture of his upper body wearing only a skimpy top (again, a bra) and prominently featuring AI-generated breasts.

It is clear that Google anticipated at least some potential for abuse. The Try It On tool is currently available in the U.S. via Search Labs, a platform where Google users can experiment with early-stage features. You can go to the Search Labs website and activate it, allowing you to simulate the look of many garments on the Google Shopping platform. When we tried to explicitly try on certain products, such as swimsuits and lingerie, or to upload photos of young schoolchildren and certain high-profile figures (including Donald Trump and Kamala Harris), the tool would not let us. Google’s own guidelines require shoppers to upload images that comply with the company’s safety policies. This means users may not upload “adult-oriented content” or “sexually explicit content,” and may use only images of themselves or images they “have permission to use.” The company also offers a disclaimer that the results are only an “approximation” and may not reflect your body with “perfect accuracy.”

In an email, a Google spokesperson wrote that the company has “strong protections, including blocking sensitive apparel categories and preventing the upload of images of clearly identifiable minors,” and that it is “continuing to improve the experience.” For now, those safeguards are porous. The tool also has more ordinary flubs to contend with.

The generative-AI boom has ushered in a new era of tools that can convert pictures of anyone (typically women) into nude or nearly nude images. In September 2023 alone – less than a year after ChatGPT’s launch – more than 24 million people visited AI-powered “undressing” websites, according to a report by Graphika, a social-media-analytics firm. Many more have certainly done so since. Numerous experts have found that AI-generated child-sexual-abuse material is spreading rapidly across the internet. Users have prompted Elon Musk’s chatbot Grok, on X, to generate pictures of women in bikinis and lingerie. Now, according to a Google Shopping page, Try It On puts this capability at the fingertips of anyone in the U.S. who is at least 18 years old. Trying on clothes has always meant taking something off – but usually you don’t have one of the largest companies in the world do it for you.

Most users won’t try to dress up minors (or the vice president) in skimpy clothing. And the appeal of the new AI feature is clear: Trying things on in person can be time-consuming and exhausting, and online shoppers have few ways of knowing how well a product will look or fit on their own body. Unfortunately, Google’s new tool is unlikely to solve these problems. At times, it seems to alter a shopper’s body to match the model wearing the clothes, rather than showing how the clothes would fit the shopper’s body. The effect is potentially dysmorphic, asking users to change their bodies for the clothing rather than the other way around. In other words, Google’s product doesn’t even seem to help consumers assess the most basic feature of clothing: how it fits.
