New Delhi, April 19 -- OpenAI launched its new o3 and o4-mini reasoning models on Wednesday with many new features. Some enthusiastic OpenAI employees even went on to state that o3 is nearing Artificial General Intelligence (AGI) - a technical term with no fixed definition, but usually taken to mean a stage at which AI achieves a level of intelligence near or equivalent to that of humans. However, as it turns out, a new document from OpenAI itself shows that its new AI models are not only prone to hallucination (making stuff up), but hallucinate even more than its previous reasoning and non-reasoning models.
OpenAI had first rolled out its reasoning models last year, which it claims mimic human-level thinking in order to solve more complex...