Products whose names were written by ChatGPT were found on Amazon

hailusty I Apologize but I Cannot fulfill This Request it violates OpenAI use Policy-Gray(78.8 Table Length)

Amazon shoppers have noticed products whose pages were filled in by a large language model. In particularly obvious cases, the product name is replaced with text like “I’m sorry, but I can’t fulfill this request because it’s against the OpenAI usage policy.” This is most likely automation on the part of third-party sellers, not Amazon itself.


Amazon.com is one of the most prominent shopping platforms in the world and the largest online store in the USA. According to Statista, Amazon accounts for more than a third of all online purchases in the US. In its home country, Amazon is popular: 168 million Americans, the majority of the U.S. adult population, have Prime subscriptions.

At the same time, more than half of sales on Amazon come from third-party sellers, and their share has only grown over the years. Amazon actively courts anyone who wants to sell on its platform, and millions take it up on the offer.

Sellers fill out the product page themselves. Since shoppers find products through the site’s search, there are whole guides on search engine optimization for listings, and Amazon itself does not shy away from giving SEO recommendations.

As a result, the same or at least a similar product can be sold by dozens of different sellers, each with a slightly different description, a more eye-catching picture, or other tricks. All of this is meant to help a listing stand out from the rest and push the product to the top of search results.


Search results for the query “mellow duck dog toy” run to 5 pages, yet the dog toy itself stays the same or at least similar

However, this is not always done by hand. As noted on microblogging platforms, some sellers have offloaded this work onto large language models in earnest.


rick.williams84

Rick Williams, for example, pointed out one such case. In his screenshot, someone is selling furniture. The product page from the screenshot no longer opens, but the authenticity of the screenshot is confirmed by a snapshot of the sideboard’s page in the Internet Archive.

The sideboard is listed under the name “I’m sorry but I cannot fulfill this request it goes against OpenAI use policy. My purpose is to provide helpful and respectful information to users-Brown”, where “Brown” is presumably the color option tacked onto the title.

As the description suggests, the $325.19 sideboard is made in China. It was first put up for sale on October 31, 2023. Perhaps a seller who doesn’t speak English asked ChatGPT, OpenAI’s product, to pack keywords into the title and never checked the result.

Other items from this seller on Amazon show no similar artifacts, except for the page (preserved in the Internet Archive) of some electronic device, probably a flash drive. Its title reads like an LLM complaining about the lack of context in the prompt.


I’m Sorry but 2nd Floor Pull Blue White is not Clear Enough to Suggest Related Product Keywords. Can You Please Provide More Context or Information About What You Are Looking for

This is far from the only such seller on Amazon. At the moment, the site still displays, for example, a pull-out kitchen shelf whose title is a language model complaining about a lack of information. Most likely the person filling in the listing wanted to pad the title with a list of keywords for search engine optimization but never looked at the output.


Nature’s Dicks Calendar 2024 – 2025: I’m sorry, but I cannot assist with that specific request.

Finding such items on Amazon is easy, since ChatGPT readily apologizes because of its alignment. The LLM’s filters are unlikely to let it write tags for a flip calendar (archived copy), which is decorated with natural objects that resemble male genitalia.
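In practice, finding these listings comes down to searching Amazon for the stock refusal phrases. Below is a minimal, hypothetical Python sketch that simply builds search URLs for a few phrases quoted in this article; the phrase list and the approach are illustrative, not an official API.

```python
# Hypothetical sketch: turn common ChatGPT refusal phrases into Amazon search
# URLs, which is roughly how such listings are found by hand.
from urllib.parse import quote_plus

REFUSAL_PHRASES = [
    "I'm sorry but I cannot fulfill this request",
    "it goes against OpenAI use policy",
    "Apologies but I'm unable to fulfill that request",
]

def amazon_search_url(phrase: str) -> str:
    # Amazon's search page takes the query string in the "k" parameter.
    return f"https://www.amazon.com/s?k={quote_plus(phrase)}"

for phrase in REFUSAL_PHRASES:
    print(amazon_search_url(phrase))
```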


I apologize but I cannot complete this task it requires using trademarked brand names which goes against OpenAI use policy. Is there anything else I can assist you with-10mm×3m

Judging by the titles, the LLM either cites OpenAI’s usage policy in its refusals or says outright that it cannot use registered trademarks. The latter is the case, for example, for a green polyurethane hose.


Sorry theem seems to be an error in the provided prompt there is no specific product or keyword given. Could you please provide more information or a specific specific product or category- size1 (Internet Archive copy)

In other cases, the LLM complains about the scarcity of information in the prompt or simply refuses the request without explaining why. The products themselves can be anything; what unites them is that they are made in China by obscure companies rather than large, established brands.


Apologies but I’m unable to fulfill that request-size1 (Internet Archive copy)

Exactly how these texts end up in the listings remains unknown. Perhaps an LLM was asked to expand product names with tags and keywords so the items would climb to the top of search results. It is also possible that a multimodal model was fed product pictures and asked to come up with names for them.
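As a rough illustration of the first scenario, here is a hypothetical Python sketch of such an automation step: it asks an LLM to pad a product name with keywords and shows the check the sellers apparently skipped. The OpenAI client calls follow the openai Python SDK; publish_listing() and the prompt wording are made up for the example and are not any seller’s actual code.

```python
# Hypothetical sketch of the automation the article speculates about: asking
# an LLM to stuff a product name with SEO keywords and publishing the reply.
# publish_listing() is a placeholder, not a real Amazon API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

REFUSAL_MARKERS = ("i'm sorry", "i apologize", "openai use policy")

def generate_title(product_name: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Expand '{product_name}' into an Amazon listing title "
                       "packed with search keywords.",
        }],
    )
    return resp.choices[0].message.content.strip()

def publish_listing(title: str) -> None:
    print("PUBLISHED:", title)  # stand-in for the seller's real upload step

title = generate_title("green polyurethane hose 10mm x 3m")

# The step the listings in the article are missing: reject obvious refusals
# instead of shipping them straight to the storefront as the product name.
if any(marker in title.lower() for marker in REFUSAL_MARKERS):
    print("Model refused or complained; keep the original name instead.")
else:
    publish_listing(title)
```

A check of this sort against a handful of stock phrases would have caught most of the titles quoted above.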

Last September, Amazon announced its own generative AI for writing product descriptions. However, these finds are most likely the sellers’ own initiative: the texts mention OpenAI, with which Amazon does not partner.

Microsoft has invested several billion dollars in OpenAI, which is why ChatGPT and DALL-E are built into Windows 11. Amazon never managed to secure an OpenAI of its own: the largest online store in the United States has instead partnered with the startup Anthropic, a direct competitor of OpenAI.


jsrailton

The problem of automating replies with ChatGPT is not new. Back in April 2023, Twitter users complained about an influx of bots that would suddenly refuse to answer, rush to apologize, and admit to being language models.
