MEET GT Sophy

WHY?

AI opponents that can beat human players are nothing new in video games of all kinds. Over the last couple of years, however, Sony AI has been training an AI agent named GT Sophy to play the popular racing game Gran Turismo Sport.

The AI first hit the race track in July in the time trial portion of the game, where no other cars are present. Three months later, GT Sophy was able to beat top human players in high-speed virtual races. AI has been successful in games like chess for many years, but this experiment with GT Sophy shows that it can also handle the more complex, split-second decisions a physics-driven racing environment demands. By using reinforcement learning, GT Sophy taught itself the rules of racing through trial and error while also taking the actions of other racers into account.
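To give a rough sense of what "reinforcement learning" means here, the sketch below is a deliberately tiny, hypothetical example: a tabular Q-learning agent on an invented one-dimensional track with a single corner. Every name, number, and reward in it is made up for illustration; GT Sophy's real training uses deep reinforcement learning inside the full game and shares none of this code, only the trial-and-error idea.

import random

# Toy illustration only: NOT Sony AI's actual system. GT Sophy was trained with
# deep reinforcement learning inside the full game; this sketch shrinks the same
# trial-and-error idea down to tabular Q-learning on a made-up 1-D "track".

TRACK_LENGTH = 10          # positions 0..9; passing 9 means finishing the lap
CORNER = 6                 # entering positions 6+ too fast "crashes" the car
ACTIONS = (0, 1, 2)        # 0 = brake, 1 = coast, 2 = accelerate
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1

q_table = {}               # (position, speed) -> list of action values

def q_values(state):
    return q_table.setdefault(state, [0.0, 0.0, 0.0])

def step(state, action):
    """Advance the toy car one tick; return (next_state, reward, done)."""
    pos, speed = state
    speed = max(0, min(3, speed + action - 1))    # apply brake/coast/accelerate
    pos += speed
    if pos < TRACK_LENGTH and pos >= CORNER and speed > 1:
        return (pos, speed), -10.0, True          # took the corner too fast
    if pos >= TRACK_LENGTH:
        return (pos, speed), 10.0, True           # reached the finish line
    return (pos, speed), -0.1, False              # small time penalty every tick

for episode in range(5000):
    state = (0, 0)
    for _ in range(50):                           # cap episode length
        # Epsilon-greedy: mostly exploit the table, sometimes explore at random.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q_values(state)[a])
        next_state, reward, done = step(state, action)
        # Q-learning update: nudge the value toward reward + discounted future value.
        target = reward + (0.0 if done else GAMMA * max(q_values(next_state)))
        q_values(state)[action] += ALPHA * (target - q_values(state)[action])
        state = next_state
        if done:
            break

# Replay one lap greedily with the learned values.
state, lap = (0, 0), []
for _ in range(50):
    action = max(ACTIONS, key=lambda a: q_values(state)[a])
    lap.append(("brake", "coast", "accelerate")[action])
    state, reward, done = step(state, action)
    if done:
        break
print(" -> ".join(lap), "| final reward:", reward)

Even in this toy setting, the reward structure nudges the agent toward the same basic lesson a racing AI has to learn at full scale: build speed early, then slow down before the corner, because crashing costs far more than losing a little time.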

With advancements like this, you should be on the lookout for AI of this caliber in your favorite video games soon. Basically, get ready for a challenge!

According to Takuma Miyazono, a world champion esports racer:

"I completely forgot that I was playing against an AI. It was really fun. I want to race with the agent more in the future."