"GPT-4o ... talk to users in a much more lifelike way — detecting emotions in their voices, analyzing their facial expressions and changing its own tone and cadence depending on what a user wants... It sounded more humanlike than some humans I know."And it's fast: it's “native multimodal support” means it can "take in aud…
What happens when we make machines so lifelike "that we can’t help but treat them as people"? While "GPT-4o doesn’t represent a huge leap ... it more than makes up for in features that make it feel more human... an ability to “see” and respond to images and live video in real time, to respond conversationally ... to “read” human emotions from visual…