So it started with a simple question: “What are the different types of lighting for the kitchen?” On its own, that’s a pretty menial query. What matters is how the question is heard, understood, and acted on. That’s where the next-gen Bing app comes in.
The next-gen Bing app had to understand the question as it was spoken, in a natural, conversational format rather than a typical keyword web search. There were more examples along these lines, but nothing you would be too surprised by.
The next step is obvious: Bing has to find the relevant information. How it presents that information, however, is very dynamic. It pooled visual data such as images, generated a unique page of the key information, and then spoke a clear answer drawn from that data. Much like Siri would, but it seemed more fluid and accurate, to me at least.
Clicking one of the images, the presenters could then select any part of it. Saw a lamp in the picture you like? Crop it and Bing will find more just like it. It will also surface related information, such as items for sale, colour choices, and other contextual details.
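Microsoft hasn’t published how this crop-to-search works under the hood, but visual search of this kind is typically done by embedding the cropped region into a feature vector and ranking catalogue items by similarity. A minimal sketch, with made-up toy vectors standing in for real deep-learning image embeddings:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def find_similar(query_vec, catalog, top_k=3):
    """Rank catalogue items by similarity to the cropped region's embedding."""
    ranked = sorted(catalog.items(),
                    key=lambda kv: cosine_similarity(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# Hypothetical 4-d "embeddings" -- real systems use vectors from a trained CNN.
catalog = {
    "brass lamp": [0.9, 0.1, 0.0, 0.2],
    "floor lamp": [0.8, 0.2, 0.1, 0.3],
    "bar stool":  [0.1, 0.9, 0.7, 0.0],
}
crop = [0.85, 0.15, 0.05, 0.25]  # embedding of the cropped lamp region
print(find_similar(crop, catalog, top_k=2))  # → ['brass lamp', 'floor lamp']
```

At Bing’s scale the sorted list would be replaced by an approximate nearest-neighbour index, but the idea is the same: the crop becomes a vector, and “more just like it” is a nearest-neighbour query.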
It seems simple enough, but that’s the point: a fluid user experience, where the hardcore Nvidia tech uses deep learning and AI to make things easy for you. What’s more, you can demo this new search right now HERE.
Get more information on GTC 2019 here.