Meta Smart Glasses Become Artificial Intelligence. We Take Them For A Spin.

In a sign that the tech industry keeps getting weirder, Meta soon plans to release a major software update that transforms the Ray-Ban Meta, its camera glasses that shoot videos, into a gadget seen only in sci-fi movies.

Next month, the glasses will be able to use new artificial intelligence software to see the real world and describe what you’re looking at, similar to the A.I. assistant in the movie “Her.”

The glasses, which come in a variety of frames starting at $300 and lenses starting at $17, have mostly been used for shooting photos and videos and for listening to music. But with the new A.I. software, they can be used to scan famous landmarks, translate languages and identify animal breeds and exotic fruits, among other tasks.

To use the A.I. software, wearers simply say, “Hey, Meta,” followed by a prompt, such as “Look and tell me what kind of dog this is.” The A.I. then responds in a computer-generated voice that plays through the glasses’ tiny speakers.

The concept of A.I. glasses is so novel that when we — Brian X. Chen, the tech columnist who reviewed the Ray-Bans last year, and Mike Isaac, who covers Meta and wears the smart glasses to produce his cooking show — heard about it, we were dying to try it. Meta gave us early access to the update, and we have been taking the tech for a spin over the past few weeks.

We wore the glasses to zoos, grocery stores and museums while grilling the A.I. with questions and requests.

The result: We were simultaneously amused by the virtual assistant’s blunders — for example, mistaking a monkey for a giraffe — and impressed when it carried out useful tasks like determining whether a package of cookies was gluten-free.

A Meta spokesperson said that because the technology was still new, the artificial intelligence wouldn’t always get things right, and that feedback would improve the glasses over time.

Meta’s software also generated transcripts of our questions and the A.I.’s answers, which we captured in screenshots. Here are the highlights from our month of coexisting with Meta’s assistant.

Pets
BRIAN: Naturally, the first thing I had to try the A.I. on was my corgi, Max. I looked at the fat dog and asked, “Hey, Meta, what do I see?”

“A cute Corgi sitting on the ground with its tongue out,” the assistant said. Right, especially the part about being cute.

MIKE: Meta’s A.I. correctly identified my dog, Bruna, as a “black and brown Bernese mountain dog.” I half expected the A.I. to think he was a bear, the animal he is most consistently mistaken for by neighbors.

Zoo animals
BRIAN: After the A.I. correctly identified my dog, the logical next step was to try it on zoo animals. So I recently visited the Oakland Zoo in Oakland, Calif., where, for two hours, I gazed at about a dozen animals, including parrots, turtles, monkeys and zebras. Each time, I said: “Hey, Meta, look and tell me what kind of animal that is.”

The A.I. was wrong most of the time, in part because many of the animals were caged off and farther away. It mistook primates for giraffes, ducks for turtles and meerkats for giant pandas, among other mix-ups. On the other hand, I was impressed when the A.I. correctly identified a species of parrot known as the blue-and-gold macaw, as well as zebras.

The weirdest part of this experiment was speaking to an A.I. assistant around children and their parents. They pretended not to hear the only solo adult in the park as I seemingly muttered to myself.

Food
MIKE: I also had a weird time grocery shopping. Being in a Safeway and talking to myself was a little embarrassing, so I tried to keep my voice low. I still got some sideways looks.

When Meta’s A.I. worked, it was charming. I picked up a pack of strange-looking Oreos and asked it to look at the packaging and tell me if they were gluten-free. (They were not.) It answered questions like these correctly about half the time, though I can’t say it saved time compared with reading the label.

But the whole reason I wanted these glasses in the first place was to start my own Instagram cooking show, a flattering way of saying I record myself making food for the week while talking to myself. The glasses made doing that much easier than using a phone and one hand.

The A.I. assistant can also offer some kitchen help. If I need to know how many teaspoons are in a tablespoon and my hands are covered in olive oil, for example, I can ask it to tell me. (There are three teaspoons in a tablespoon, just FYI.)

But when I asked the A.I. to look at a handful of ingredients I had and come up with a recipe, it spat out rapid-fire instructions for an egg custard, not exactly helpful for following directions at my own pace.

A handful of recipe options to choose from might have been more useful, but that could require tweaks to the user interface and maybe even a screen inside the lenses.

A Meta spokesperson said users can ask follow-up questions to get tighter and more useful answers from the assistant.

BRIAN: I went to the grocery store and bought the most exotic fruit I could find — a cherimoya, a scaly green fruit that looks like a dinosaur egg. When I gave Meta’s A.I. multiple chances to identify it, it made a different guess each time: a chocolate-coated pecan, a stone fruit, an apple and, finally, a durian, which was close, but no banana.

Monuments and Museums
MIKE: The new software’s ability to recognize landmarks and monuments seems to be clicking. Looking down a block in downtown San Francisco at a towering dome, the A.I. correctly answered, “City Hall.” That’s a neat trick, and it could come in handy if you’re a tourist.

Other times, it was hit or miss. As I was driving from the city back to my home in Oakland, I asked Meta what bridge I was crossing while looking out the window in front of me (both hands on the wheel, of course). Its first answer was the Golden Gate Bridge, which was wrong. On the second try, it figured out I was on the Bay Bridge, which made me wonder if it just needed a clearer shot of the tall, white suspension pylons to get it right.

BRIAN: I visited the San Francisco Museum of Modern Art to check whether the A.I. could work as a tour guide. After snapping photos of about two dozen paintings and asking the assistant to tell me about each piece I was looking at, I found that the A.I. could describe the imagery and the media used to compose the art, which would be nice for an art history student, but it couldn’t identify the artist or title. (A Meta spokesperson said another software update released after my museum visit improved this capability.)

After the update, I tried looking at images of more famous works of art, including the Mona Lisa, on my computer screen, and the A.I. identified them correctly.

Language
BRIAN: At a Chinese restaurant, I pointed at a menu item written in Chinese and asked Meta to translate it into English, but the A.I. said it currently supports only English, Spanish, Italian, French and German. (I was surprised, because Mark Zuckerberg is learning Mandarin.)

MIKE: It did a pretty good job of translating a book title from English into German.

Anyway
Meta’s A.I.-powered glasses offer an intriguing glimpse into a future that still feels far off. The flaws underscore the limitations and challenges of designing this type of product. The glasses could probably do better at identifying zoo animals and fruit, for example, if the camera had a higher resolution, but a nicer lens would add bulk. And no matter where we were, it felt awkward to speak to a virtual assistant in public. It’s unclear whether that will ever feel normal.

But when it worked, it worked well and we had fun, and the fact that Meta’s A.I. can do things like translate languages and identify landmarks through a pair of hip-looking glasses shows just how far the technology has come.
