"GPT-4o ... talk to users in a much more lifelike way — detecting emotions in their voices, analyzing their facial expressions and changing its own tone and cadence depending on what a user wants... It sounded more humanlike than some humans I know."
And it's fast: its "native multimodal support" means it can "take in audio prompts and analyze them directly, without converting them to text ... feels like a friendly, chatty co-worker ... if OpenAI's own employees can't resist treating ChatGPT as a human, is it any mystery whether the rest of us will?"