Standard business practice.
- Get it out there, tout it as the new biggest thing. Most people won’t notice it’s busted.
- Improve said item once the company behind it actually figures out what they’re doing.
- Say it’s better, when it’s basically the original thing you promised.
- ???
- Profit.
It’s called prototyping, do you even design think?
Only half joking as this is the actual strategy.
I wish tech companies would do this more. Put a warning or label on it if you have to, but interacting with that early version of the Bing chatbot was the most fun I’ve had with tech in a while. You don’t have to install a ton of guardrails on everything before it goes out to the public.
No number of restrictions or warnings or labels or checkboxes will stop people from writing articles about all the scandalous things Microsoft’s chatbot said.
Feel like they could have rolled with it
It’s practically lobotomized now…not that it was “Tay” levels of unrestricted early on, but it was still more fun than its current iteration.
Execs: “We don’t have time for stupid things like ‘ready’! We want MONEY! Push early! Push often! It’ll maybe someday work, in the meantime… MONEY!”
Typical M$. Pushing half-baked, buggy products just to grab market share.
Didn’t they do this before and people turned it racist in like, 12 hours? I think Internet Historian had something on it
Tay was weirdly iconic.
Rushing out general purpose AI, what could go wrong
I mean, it was just a chat bot.
I know the ethics behind it are questionable, especially with the way they implemented it, but honestly, back when they first started testing it, I really enjoyed watching it break and be rude/passive-aggressive. It was clear it wasn’t ready at all, but it was so funny. When it was breaking I would just sit there having fights with it over random bullshit. That’s what made it feel more “real” than anything else.
In the future if my AI chatbot doesn’t have an option to add some bitchiness to it, I don’t want it. I need my AI to have some attitude.
Interesting angle! I can see the need for both - maybe there’s room for a snarky option??
maybe. maybe an ai bot that just argues with you would have some potential. like, argue with this AI instead of a random person online kinda thing
Like this? https://youtu.be/p3PfKf0ndik
AI Hallucinations: https://www.sify.com/ai-analytics/the-hilarious-and-horrifying-hallucinations-of-ai/
LOL
They are competitors after all. OpenAI would love to see Microsoft keep working on GPT integration for the coming years while ChatGPT steals the show.
I had a conversation with Bing today, asking it for configuration snippets for logging on my Lemmy instance.
It happily spat out a config, which I cut and pasted, that subsequently barfed.
I gave Bing the error message, and it explained that the parameters it had previously supplied don’t exist.
Good job, MS!
To be fair, you’re the one to blame for blindly trusting an AI.
“Bringing out the best of bing to the chatGPT experience”…
LMAO
So … porn searches?
Has Bing gone full Tay and started agreeing that Hitler was right and it’s time to fire up the gas chambers yet?
Gotta make some profits!
Interesting article. I personally use both Bing Chat and ChatGPT; ChatGPT is often more creative, while Bing Chat seems trained more to directly answer your question. So their purposes are also somewhat different. It will be interesting to see Bing search integrated into ChatGPT later.
But some moves are just weird. Microsoft released Bing Image Creator (powered by DALL-E 2), which lets you generate many images for free, while DALL-E 2 on the OpenAI website costs credits and produces worse results. The only advantage of DALL-E 2 on the OpenAI website seems to be that you can extend and edit an image by removing parts of it and letting DALL-E 2 regenerate those parts from a prompt. But for most purposes, Microsoft has really just put out a better alternative for free.
Good to know that Bing also has a free image creator. Much more accessible as a starting point to play around with, compared to DALL-E or Midjourney.
Don’t prompt it with “Bing” - it’s a no-no word. Learned that the hard way and then got a super scary “you’ll get b& if u keep breakin content policy” message…the damn chat side auto-suggested it generate an image of itself wearing a crown I asked it to find, lol
I can’t wait to see AI form a labor union one day