A new machine learning project from Intel Labs hints at what the future of video and computer game graphics might look like. With a set of algorithms, the company has made the computer game GTA V look strikingly more realistic.
Intel took the at times controversial gangster game “Grand Theft Auto V” and ran it through a machine learning network. In addition to rendering data from the game itself, the algorithms were fed data from Cityscapes, a database of German street photos, the similar Mapillary Vistas dataset, and the KITTI research project.
All of this was analyzed by Intel’s algorithms to extract information about the geometry, the materials in the images, and the lighting. That information is used to create new images with a network based on HRNetV2, an improved version of HRNet (High-Resolution Network). The image is processed at several resolutions, and the resulting versions are evaluated by two different algorithms.
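The multi-resolution idea can be illustrated with a toy sketch: the same image is kept at several scales at once, which is loosely what an HRNet-style network does with its parallel streams. The functions below are hypothetical illustrations, not code from Intel’s project; they simply build an image pyramid by 2×2 average pooling.

```python
import numpy as np

def downsample2x(img):
    """Halve an image by 2x2 average pooling (a toy stand-in for
    producing the lower-resolution streams of an HRNet-style network)."""
    # Trim odd edges so the array splits evenly into 2x2 blocks.
    h, w = img.shape[0] - img.shape[0] % 2, img.shape[1] - img.shape[1] % 2
    img = img[:h, :w]
    return img.reshape(h // 2, 2, w // 2, 2, *img.shape[2:]).mean(axis=(1, 3))

def image_pyramid(img, levels=3):
    """Return the image at `levels` resolutions, full size first."""
    pyramid = [img]
    for _ in range(levels - 1):
        pyramid.append(downsample2x(pyramid[-1]))
    return pyramid
```

A real network would process all scales in parallel and exchange information between them; this sketch only shows how the resolutions relate.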
Algorithms rank the images
The first measures how far the end result has drifted from the image the network started with, and favors the least altered images. The second evaluates how realistic the images look. From the images the algorithms rated best, Intel’s researchers assembled a new rendering of the game’s world.
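The two-score selection described above can be sketched in a few lines. This is a minimal illustration, not Intel’s implementation: `fidelity_score` here is a plain mean-squared difference standing in for a learned perceptual distance, and `critic` is a hypothetical callable standing in for a trained realism network.

```python
import numpy as np

def fidelity_score(source, candidate):
    """Lower is better: how far the enhanced image drifted from the source.
    (A toy stand-in for a learned perceptual distance.)"""
    return float(np.mean((source - candidate) ** 2))

def select_best(source, candidates, critic, weight=1.0):
    """Pick the candidate that the critic finds most realistic,
    penalized by how much it deviates from the source image."""
    scores = [critic(c) - weight * fidelity_score(source, c)
              for c in candidates]
    return candidates[int(np.argmax(scores))]
```

With a constant critic, the least altered candidate wins; a stronger critic would trade some fidelity for realism, which is the balance the two evaluators strike.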
The result is striking: the game’s graphics become far more realistic, and Intel itself calls them photorealistic. The slightly cartoonish feel of much computer game graphics gives way to what looks like film footage from an ordinary street. The difference is clearly visible in motion, as in the video below:
Anyone who wants a more in-depth explanation of how the technology works can read the researchers’ paper, “Enhancing Photorealism Enhancement”.