Surprisingly, video games have also become effective teaching tools for artificial intelligence.
Just like human beings, AI systems need to learn how to do things before they become good at them. They must be taught how to drive cars or cook pizza before they can perform these tasks.
Here's how scientists are using popular video games to improve AI.
Although Tesla, Lyft and BMW all have self-driving cars, they aren't perfect, which means more work needs to be done to improve the AI behind them. To help, a team of researchers at Princeton University is enhancing AI using the video game Grand Theft Auto V.
For the uninitiated, Grand Theft Auto features a virtual world in which people drive cars and follow (or break) traffic rules. They can also commit crimes galore. But that isn't what interests the scientists.
Instead, the Princeton researchers tweaked GTA V to let a program play it. That way, it could learn to read traffic signals, observe the weather and detect pedestrians effectively. If the approach works, the autonomous cars of the future will be far more capable and safer than they are today.
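To get a feel for why labelled game footage is useful, here's a deliberately tiny sketch (my own illustration, not the Princeton team's code): a simple perceptron learns to separate "pedestrian" from "no pedestrian" feature vectors that stand in for frames rendered by a game engine, where the labels come for free because the game knows what it drew.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "features extracted from game frames": two well-separated
# clusters stand in for frames with and without a pedestrian.
n = 200
no_ped = rng.normal(-1.0, 0.5, size=(n, 2))   # label 0
has_ped = rng.normal(1.0, 0.5, size=(n, 2))   # label 1
X = np.vstack([no_ped, has_ped])
y = np.array([0] * n + [1] * n)

# Train a perceptron: nudge the weights whenever a frame is misclassified.
w, b, lr = np.zeros(2), 0.0, 0.1
for epoch in range(50):
    for i in rng.permutation(len(X)):
        pred = 1 if X[i] @ w + b > 0 else 0
        err = y[i] - pred
        w += lr * err * X[i]
        b += lr * err

accuracy = np.mean(((X @ w + b) > 0).astype(int) == y)
```

The point isn't the classifier, which is trivial; it's that a game engine can churn out endless perfectly labelled examples like these, which is exactly what data-hungry perception models need.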
Imagine being operated on by a team of robotic surgeons, physicians and nurses. Some bots could be sterilizing equipment on the side while others discuss how best to remove your appendix.
It sounds scary. But thanks to a group of scientists in China using StarCraft II to train AI agents to play as a team, that future could arrive sooner rather than later. Of course, the researchers have good intentions in training bots to cooperate.
But based on the current progress, it won’t be long before there are warehouses operated purely by AI robots. There’s one problem, though.
The technology that teaches bots to work as a team isn't yet advanced enough to teach one bot to be the supervisor. And some scientists doubt StarCraft can ever help robots learn to cooperate.
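The core idea of agents converging on a shared way of acting can be shown with a toy example far simpler than StarCraft (an illustration of mine, not the Chinese team's method): two independent learners each keep their own value estimates, and by reinforcing whatever happened to pay off, they quickly settle on the same action, a minimal "team convention".

```python
# Two independent learners in a coordination game: they earn a reward
# of 1 only when they pick the same action, 0 otherwise.
q1 = [0.0, 0.0]   # agent 1's value estimate for each of its two actions
q2 = [0.0, 0.0]   # agent 2's value estimate
alpha = 0.5       # learning rate

for episode in range(20):
    # Each agent greedily picks its highest-valued action
    # (ties break toward action 0).
    a1 = q1.index(max(q1))
    a2 = q2.index(max(q2))
    reward = 1.0 if a1 == a2 else 0.0
    # Each agent updates only its own estimate from the shared reward.
    q1[a1] += alpha * (reward - q1[a1])
    q2[a2] += alpha * (reward - q2[a2])
```

This sketch is deliberately deterministic; real multi-agent training adds exploration, huge state spaces and credit-assignment headaches, which is exactly why StarCraft II is a useful testbed.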
Let's face it: there isn't a video game these days that AI can't be trained to play well enough to defeat humans. But until 2018, few bots could reliably beat an experienced Texas Hold'em player.
The new bot, named Pluribus, isn't just effective at defeating a single poker player. It can take on two, three, four or five players at once. What's more, Pluribus was tested against humans over 10,000 hands.
And guess who won? The bot. Surprisingly, Pluribus became so good at poker by playing against copies of itself. The researchers who developed it don't intend to use their 'genius' to defeat pros at poker, however.
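The engine behind that self-play training is a family of techniques built on regret minimization. Here's a minimal sketch of the simplest member, regret matching, applied to rock-paper-scissors rather than poker (my own toy, not Pluribus's actual code): two copies of the same learner play each other, each shifting probability toward the moves it regrets not having played, and their average strategies drift toward the game's equilibrium, an even one-third mix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Payoff to the row player: rock=0, paper=1, scissors=2.
PAYOFF = [[0, -1, 1],
          [1, 0, -1],
          [-1, 1, 0]]

def strategy(regret):
    """Play each action in proportion to its positive regret."""
    pos = np.maximum(regret, 0)
    total = pos.sum()
    return pos / total if total > 0 else np.full(3, 1 / 3)

regrets = [np.zeros(3), np.zeros(3)]
strategy_sums = [np.zeros(3), np.zeros(3)]

for _ in range(20000):
    strats = [strategy(r) for r in regrets]
    actions = [rng.choice(3, p=s) for s in strats]
    for p in range(2):
        opp = actions[1 - p]
        strategy_sums[p] += strats[p]
        # Regret = what the alternative would have earned minus what we got.
        for alt in range(3):
            regrets[p][alt] += PAYOFF[alt][opp] - PAYOFF[actions[p]][opp]

# The *average* strategy over all rounds approaches the equilibrium.
avg = [s / s.sum() for s in strategy_sums]
```

Poker adds hidden cards, betting rounds and an astronomical game tree, so Pluribus needed far heavier machinery, but the "learn from regret through self-play" loop is the same in spirit.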
As such, you can play poker, blackjack and other casino games without worrying that people are using bots to defeat you. CasinoTopsOnline can help you find an excellent gambling website that welcomes newcomers with bonuses. More importantly, you want a platform that provides all your favourite games.
A smart robot not only does what it's programmed to do but also learns from its surroundings. Unfortunately, a robot's ability to learn is limited by how good its programming is.
To counter that weakness, scientists at the University of California are using Minecraft to enhance AI’s learning skills. For the uninitiated, Minecraft is a game in which you can simulate virtually everything done in the real world.
You can build a house, bake cakes, work out or create a car. With the game as a backdrop, researchers hope to teach AI how to solve simple and complex problems without human intervention.
So far, AI has been successful at most tasks humans do in Minecraft. But since real-world problems differ from those in video games, it will be interesting to see how AI bots handle real-world challenges.
One of the best things about video games is that you can tweak them to improve your experience. You can also make quick changes, switch difficulty levels or change gaming environments.
Unfortunately, developing a virtual gaming world in which players can customize multiple aspects takes months or years. In many cases, it also requires AI’s involvement.
Fortunately, scientists are training AI to learn what gamers love. In return, it can help improve virtual worlds, refine mechanics and scale gameplay to unprecedented levels.
Of course, AI won't learn how to do these things out of nowhere. That's why researchers regularly train bots to play or develop games. Once the bots understand the basics, they can use that knowledge to improve more aspects of video games.
In a robotic world, how would bots know to avoid blocks of wood? How would they walk down stairs or react when bumped by a human? These are realistic situations robots must be taught to handle.
But instead of having a human teach a single bot each of these things, scientists are relying on video games. After all, no one has the time and patience to teach robots every simple task one by one.
Microsoft's Project Malmo is a research platform built to advance artificial intelligence. It's centred on Minecraft and aims to help AI learn to make sense of complex problems. It also strives to improve human interactions with AI.
But how does Minecraft relate to teaching people how to interact with AI? The game has players simulate real-world situations through their in-game senses, and that's proper groundwork for advancing artificial intelligence.
What's more, Minecraft offers unlimited opportunities to perform simple and complex tasks. And just as AI can learn to interpret a virtual world, it could be trained to understand real-world issues. Of course, that's a long-term project. But it's a realistic one that could impact the world tremendously.