Google is getting serious about building apps for Android tablets again

For a few months now, Google has been talking about Android 12L, an upcoming version of Android that’s focused on making the OS work better on larger-screen devices like tablets and foldable phones. Thus far, most of those changes have focused on interface tweaks, but today at Google I/O the company had some news about making apps perform better on larger screens, too. 

Google says more than 20 of its apps will be redesigned and optimized for tablets, something that should automatically make Android tablets a lot more useful. Among those are YouTube Music, Google Maps and Messages. YouTube Music has a redesigned now playing screen that takes advantage of the extra screen space, while Messages has a multi-column view to quickly jump between different conversations. Google also says that third-party apps like Facebook, TikTok and Zoom will soon be updated to be better optimized for large screens, too. If Google can get more big developers like these on board, the Android tablet ecosystem should benefit greatly. 

Google also says there are already 270 million active users on large-screen devices, so a decent number of people will be able to take advantage of these updates immediately. To help them find optimized apps, Google Play is getting a large-screen redesign as well that will highlight apps built for tablets. 

Follow all of the news from Google I/O 2022 right here!

More Wear OS watches are coming from Fossil, Montblanc and Samsung

After launching a new version of Wear OS in collaboration with Samsung at last year's I/O, Google is back with more updates. At this year’s I/O developer conference, the company unveiled features coming to Android 13 and a new Google Wallet, as well as emergency SOS coming to Wear OS. Google also shared that there are three times as many Wear OS devices this year as there were last year, and that new devices from Samsung, Montblanc, Mobvoi and Fossil are coming. 

Google didn’t provide much detail about those devices, though it did later say that more third-party apps were also coming to Wear OS, including SoundCloud and Deezer. Samsung published a blog post sharing that Galaxy Watch 4 owners can soon download the Google Assistant for “faster and more natural voice interactions, enabling quick answers and on-the-go help.”

The Galaxy Watch 4 will also get voice control for Spotify via the Assistant, allowing users to change songs with their voice. Samsung promised that more Google apps and services will be optimized for Galaxy Watches later this year.

Google also unveiled the Pixel Watch, which, unsurprisingly, will run the new Wear OS. It will also feature deep integration with Fitbit for better activity tracking, though other details on the device were sparse as the company prepares to actually launch it in the fall. 


Google adds auto-translated captions to YouTube on mobile

Google has rolled out auto-translated captions in 16 languages on the mobile version of YouTube, and the feature is now available to all Android and iOS users. The company also plans to add auto-translation for Ukrainian-language YouTube videos next month.

Google’s AI Test Kitchen lets you experiment with its natural language model

Google is announcing news at breakneck pace at its I/O developer conference today, and as usual it’s flexing its machine-learning smarts. In addition to unveiling its new LaMDA 2 conversational AI model, the company also showed off a new app called AI Test Kitchen. 

The app offers three demos that showcase what LaMDA 2 can do. The first is a simple brainstorming tool that asks the app to help you imagine various scenarios. During the keynote demo, Google entered “I’m at the deepest part of the ocean” in response to the app’s “Imagine if” prompt. The app then spit out a short paragraph describing the user in a submarine in the Mariana Trench, with descriptive language.

Secondly, as a demonstration of the model being able to stay on topic, the app can have a conversation with you about something and understand context. During the demo, the app started by asking “Have you ever wondered why dogs like to play fetch so much?” In its responses to simple follow-ups like “Why is that,” the system replied with more information about dogs and their senses of smell. 

Finally, AI Test Kitchen shows how LaMDA 2 can “break down a complex goal or topic.” This section is called List It, and users can ask things like “I want to learn ukulele” or “I want to grow a garden.” LaMDA will generate lists of subtasks to help you get started, and according to Google, may even offer ideas you might not have thought of. In addition to giving you the names of vegetables you can grow, for example, AI Test Kitchen might also give you a set of steps to take or weather conditions to consider. During the demo, the app offered a tip for users with limited space, sharing the types of plants that might thrive in smaller gardens. 

According to CEO Sundar Pichai, Google is using this app in part to gather feedback on its new AI model. It will open up access “over the coming months, carefully assessing feedback from the broad range of stakeholders — from AI researchers and social scientists to human rights experts.” Pichai said these findings will be incorporated into future versions of LaMDA. He added that, over time, the company intends to “continue adding other emerging areas of AI into our AI Test Kitchen.”


Google’s latest security upgrades include virtual credit cards

Google is using I/O 2022 to unveil (and flaunt) a host of privacy and security upgrades, including some significant features for online shopping. The company is introducing virtual payment cards on Android and Chrome that promise extra security by replacing the real card number with a digital counterpart. It should be faster, too, as you won’t have to enter the CVV or other details that frequently slow you down.

Virtual cards will be available in the US this summer for American Express, Visa and Capital One holders. Mastercard is due later in the year. This isn’t as ambitious a financial project as Google’s defunct Plex banking service, but it may be useful if you’re worried a hacker might scrape your payment details while you’re checking out.

Other additions are subtler, but potentially useful. Google now protects Workspace users against phishing and malware in Docs, Sheets and Slides, not just Gmail. Google apps will also display a safety status to let you know when your account is at risk.

Google is also making it easier to control data. On top of plans to let you remove contact details from search results (still in a months-long rollout), you’ll also have the option to see more or less of certain brands and categories in ads through My Ad Center. You won’t just be limited to blocking or reporting content.

The expansions come alongside ongoing efforts. Google is automatically enrolling users in two-factor authentication to reduce account hijacking. It’s also scaling back the volume of sensitive personal info, anonymizing that content and curbing access through technologies like end-to-end encryption and the secure enclaves on modern phones. Yes, Google is partly touting these features to counter long-running accusations of less-than-stellar privacy, but they might be welcome if you’re jittery about trusting the company with your data.


Google makes its AI assistant more accessible with ‘Look and Talk’

Google Assistant is already pretty handy, filling in your payment info on takeout orders, helping get the kids to school on time, and controlling your stereo system’s volume and your home’s smart light schedules. At its I/O 2022 keynote today, company executives showed off some of the new features arriving soon for the AI.

The first of these is “Look and Talk.” Instead of having to repeatedly start your requests to Assistant with “Hey Google,” this new feature relies on computer vision and voice matching to constantly pay attention to the user. As Sissie Hsiao, Google’s VP of Assistant, explained on stage, all the user has to do is look at their Nest Hub Max and state their request. Google is also developing a series of quick commands that users will be able to shout out without having to gaze longingly at their tablet screen or say “Hey Google” first — things like “turn on the lights” and “set a 10-minute alarm.”


All of the data captured in that interaction — specifically the user’s face and voice prints, used to verify the user — are processed locally on the Hub itself, Hsiao continued, and not shared with Google “or anyone else.” What’s more, you’ll have to specifically opt into the service before you can use it.

According to Hsiao, the backend of this process relies on a half-dozen machine learning models and 100 camera and mic inputs — such as proximity, head orientation and gaze direction — to ensure that the machine knows when you’re talking to it versus talking in front of it. The company also claims that it worked diligently to make sure that this system works for people across the full spectrum of human skin tones. 

Looking ahead, Google plans to continue refining its NLP models to further enhance the responsiveness and fidelity of Assistant’s responses by “building new, more powerful speech and language models that can understand the nuances of human speech,” Hsiao said. “Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, ‘umms’ and interruptions — making your interactions feel much closer to a natural conversation.”
