New York State passes a right-to-repair bill

New York has just passed the Digital Fair Repair Act (Assembly Bill A7006B), making it one of just a few US states to do so. The bill, which was introduced in April 2021, passed the Senate on June 1st and passed the Assembly today. It’s now headed to the governor for signing (or veto), and will take effect a year after it becomes law.

The act, titled “Digital Fair Repair Act,” will require OEMs (original equipment manufacturers) to “make diagnostic and repair information for digital electronic parts and equipment available to independent repair providers and consumers if such parts and repair information are also available to OEM authorized repair providers.” That means companies can no longer dictate where you bring your devices for repair by restricting access to components or diagnostic information.

If a part is no longer available to the OEM, it will not need to make the same part available to everyone. For things that require security-related locks or authorizations, the OEM has to, “on fair and reasonable terms,” supply the tools or documentation needed to access or reset such devices “through appropriate secure release systems.”

The amended version of the bill also states that the proposed requirements will apply to “products with a value over ten dollars” and that OEMs or authorized repair providers don’t have to make available any parts, tools or documentation if the intended use is for modification of the products. It also excludes public safety communications equipment and “home appliances with digital electronics embedded within them” from the act. Given how companies have been trending toward smart fridges, washing machines and more, this could be an enormous loophole, or at the very least exclude a large number of products.

Massachusetts previously passed its own Digital Right to Repair Act, which covered parts or machines containing microprocessors. The state has recently expanded that to include connected automobiles. Meanwhile, the California state Senate introduced its own right to repair bill in February, which appears to have bipartisan support. 

Android update brings Pixel’s custom text stickers to more phones

While we wait to learn more about Android 13, Google continues to release new features for its platform at the regular cadence it’s adopted over the last few years. Today, the company announced a set of updates around Gboard stickers, the Play Store and accessibility apps like Lookout and Sound Amplifier.

First, Google is bringing custom text stickers, which it previously launched on Pixel phones, to all Android devices. The feature converts English words into images: if you type “Hi Ma” into Gboard and tap the custom stickers button in the suggested emoji row, you’ll see auto-generated graphics featuring that text in different designs. For now, your language will have to be set to US English for this to work.

The company is also adding more than 1,600 new Emoji Kitchen combinations, so you can make new hybrid emoji by tapping two emoji in succession. It’s also adding rainbow-themed stickers so users can share their Pride celebrations.

Sound Amplifier is an Android app that makes sounds around you louder, which could be helpful for people with hearing loss. “Today’s update brings improved background noise reduction,” according to Google, along with “faster and more accurate sound and a revamped user interface that is easier to see.”

Also relevant to accessibility is the Lookout app, which uses the device’s camera to identify and describe objects around the user. It can read out words on signs or tell you if there’s, say, a table at the two o’clock position so you can avoid walking into it. Today, Google’s adding a new Images mode that uses its “latest machine learning model for image understanding” and can describe an image even if you opened it from “just about any app.” The company also updated the Text, Documents, Food Label and Explore modes to make the app more accurate. Plus, Lookout now works offline, so you can use it without an internet connection.

Finally, those who have been racking up Google Play Points can now use them to get in-app items without leaving their games or apps. You can pay for things with Play Points alone or a mix of money and points. The feature is rolling out over the coming weeks in the countries where Play Points are available. Meanwhile, you can update your other apps like Lookout and Gboard to see the new tools announced today.

The best laptops

Whether it’s in anticipation of back to school season or you just need a new machine for work, a new laptop may be near the top of your shopping list right now. Given we’re still dealing with the global chip supply shortage, you might find yourself con…

What we bought: Why Daily Harvest became my go-to meal delivery service

Like many people, my food insecurity got pretty serious in April 2020. Cities and businesses all across America were shutting down, while grocery stores and delivery services started to run out of food. Everywhere I looked — whether it was Amazon, Inst…

Apple adds systemwide Live Captions as part of larger accessibility update

Global Accessibility Awareness Day is this Thursday (May 19th) and Apple, like many other companies, is announcing assistive updates in honor of the occasion. The company is bringing new features across iPhone, iPad, Mac and Apple Watch, and the most intriguing of the lot is systemwide Live Captions.

Similar to Google’s implementation on Android, Apple’s Live Captions will transcribe audio playing on your iPhone, iPad or Mac in real time, displaying subtitles onscreen. It will also caption sound around you, so you can use it to follow along with conversations in the real world. You’ll be able to adjust the size and position of the caption box, and also choose different font sizes for the words. The transcription is generated on-device, too. But unlike on Android, Live Captions on FaceTime calls will also clearly distinguish between speakers, using icons and names to attribute what’s being said. Plus, Mac users will be able to type a response and have it spoken aloud in real time for others in the conversation. Live Captions will be available as a beta in English for users in the US and Canada.

Apple is also updating its existing sound recognition tool, which lets iPhones continuously listen out for noises like alarms, sirens, doorbells or crying babies. With a coming update, users will be able to train their iPhones or iPads to listen for custom sounds, like your washing machine’s “I’m done” song or your pet duck quacking, perhaps. A new feature called Siri Pause Time will also let you extend the assistant’s wait time when you’re responding or asking for something, so you can take your time to finish saying what you need. 

[Image: two screenshots showing Apple’s new accessibility features. Credit: Apple]

The company is updating its Magnifier app, which helps people who are visually impaired better interact with the people and objects around them. Expanding on the earlier People Detection tool that told users how far away others were, Apple is adding a new Door Detection feature. It will use the iPhone’s LiDAR sensor and camera not only to locate and identify doors, but also to read out any text or symbols on display, like hours of operation and signs for restrooms or accessible entrances. In addition, it will describe the door’s handle, whether opening it requires a push, pull or turn of a knob, as well as the door’s color, shape and material, and whether it’s open or closed. Together, People and Door Detection will be part of the new Detection mode in Magnifier.

Updates are also coming to Apple Watch. Last year, the company introduced Assistive Touch, which allowed people to interact with the wearable without touching the screen. The Watch would sense if the hand it’s on was making a fist, or if the wearer was touching their index finger and thumb together for a “pinch” action. With an upcoming software update, it should be faster and easier to enable Quick Actions in Assistive Touch, which would then let you use gestures like double pinching to answer or end calls, take photos, start a workout or pause media playback.

But Assistive Touch isn’t a method that everyone can use. For those with physical or motor disabilities that preclude them from using hand gestures altogether, the company is bringing a form of voice and switch control to its smartwatch. The feature is called Apple Watch Mirroring, and uses hardware and software including AirPlay to carry over a user’s preset voice or switch control preferences from their iPhones, for example, to the wearable. This would allow them to use their head-tracking, sound actions and Made For iPhone switches to interact with their Apple Watch. 

Apple is adding more customization options to the Books app, letting users apply new themes and tweak line heights, word and character spacing and more. Its screen reader VoiceOver will also soon be available in more than 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian and Vietnamese. Dozens of new voices will be added, too, as will a spelling mode for Voice Control that lets you dictate custom spellings using letter-by-letter input.

Finally, the company is launching a new feature called Buddy Controller that lets two controllers drive a single player, which could be helpful for users with disabilities who want to partner up with their care providers. Buddy Controller will work with supported game controllers for iPhone, iPad, Mac and Apple TV. There are plenty more updates coming across the Apple ecosystem, including on-demand American Sign Language interpreters expanding to Apple Store and Support in Canada, a new guide in Maps, curated playlists in Apple TV and Music, and the addition of the Accessibility Assistant to the Shortcuts app on Mac and Watch. The features previewed today will be available later this year.

Apple is reportedly testing USB-C iPhones

Apple may be about to change the iPhone’s charging port. According to a Bloomberg report, the company is testing new iPhones and adapters with USB-C, which is what MacBooks and iPads already use, not to mention a plethora of devices outside the Apple ecosystem. We’ve reached out to Apple for confirmation and have yet to receive a response.

Bloomberg’s sources said that the adapter being tested may “let future iPhones work with accessories designed for the current Lightning connector.” That could mean a Lightning-to-USB-C adapter for things like credit card scanners or flash drives that plug into existing iPhones. Bloomberg’s report noted that if Apple “proceeds with the change, it wouldn’t occur until 2023 at the earliest.”

While Apple’s decisions to change ports have been the subject of many jokes in popular culture, a move to USB-C may actually be welcome. The more widely available standard is only slightly bigger than Lightning, but can deliver power and data more quickly. The change could also make life much easier for those who already use USB-C to charge most of their devices and still have to carry a Lightning cable with them just for their iPhones. 

Apple’s motivations for the potential change may not be completely altruistic. The EU has been pushing for a universal phone charging standard for years, and recently proposed legislation that would make USB-C the mandated port for all handsets. Testing USB-C on iPhones would just be Apple recognizing the writing on the wall. If this does come to pass, though, it would not only be convenient for most people who are already largely using USB-C, but could also mean less e-waste in the future.

Google confirms the Pixel Watch is real and it’s coming this fall

The worst-kept secret in tech is a secret no more. At its I/O 2022 developer conference today, Google confirmed the existence of the much-leaked Pixel Watch. Not only that, the company showed pictures of the device, and it looks a lot like a bezel-less Samsung Galaxy Watch.

The Pixel Watch has a domed, round face and, like most Google hardware, appears to have a pastel-based color scheme. There is a “tactile crown” and customizable bands will be available, too. The device will run Wear OS 3, which the company launched last year in collaboration with Samsung, but with updates we heard about earlier during today’s keynote. Some features we already knew about, like offline Maps directions on your wrist, are finally arriving for real. Emergency SOS and a new Google Wallet are also coming to Wear OS on the Pixel Watch.

Google is also promising deep integration with Fitbit, which it recently acquired, for health and fitness-tracking features. Wear OS has long lacked comprehensive activity and biometric tracking tools, and now the OS should do better. Google had already said it was working with Fitbit and learning from Samsung on how to efficiently implement constant heart rate monitoring and sleep tracking, and it appears the Pixel Watch, running the latest Wear OS, will offer both.

It will also have a Fitbit app that lets users collect “Active Zone minutes” that fans of the activity band maker will find familiar. You’ll also be able to log your progress against preset goals. It’s not yet clear how the Fitbit app will work with Google’s own Fit, or if there will be any overlap.

In fact, not much else is known about the Pixel Watch itself, except that more details will be released in the coming months and that it will launch in the fall with the Pixel 7 and 7 Pro. We don’t know what chipset Google is using, or what battery life to expect. One thing worth noting is that the Samsung Galaxy Watch 4, which was the first smartwatch to run the new Wear OS, does not work with iOS. The Pixel Watch won’t either. But most iPhone users will likely opt for the Apple Watch, which to this day is the best smartwatch available.

The Pixel Watch is an intriguing offering from Google, but until we have more information, it’s hard to know whether the company will be able to steal Apple’s crown. For now, after having waited so long, we’ll still have to wait a little longer for the full details.

Follow all of the news from Google I/O 2022 right here!

More Wear OS watches are coming from Fossil, Montblanc and Samsung

After launching a new version of Wear OS in collaboration with Samsung at last year’s I/O, Google is back with more updates. At this year’s I/O developer conference, the company unveiled features coming to Android 13 and a new Google Wallet, as well as emergency SOS coming to Wear OS. Google also shared that there are three times as many Wear OS devices this year as there were last year, and that new devices from Samsung, Montblanc, Mobvoi and Fossil are on the way.

Google didn’t provide much detail about those devices, though it did later say that more third-party apps were also coming to Wear OS, including SoundCloud and Deezer. Samsung published a blog post sharing that Galaxy Watch 4 owners can soon download the Google Assistant for “faster and more natural voice interactions, enabling quick answers and on-the-go help.”

The Galaxy Watch 4 will also get voice control for Spotify via the Assistant, allowing users to change songs with their voice. Samsung promised that more Google apps and services will be optimized for Galaxy Watches later this year.

Google also unveiled the Pixel Watch, which, unsurprisingly, will run the new Wear OS. It will also feature deep integration with Fitbit for better activity tracking, though other details on the device were sparse as the company prepares to actually launch it in the fall. 


Google’s AI Test Kitchen lets you experiment with its natural language model

Google is announcing news at breakneck pace at its I/O developer conference today, and as usual it’s flexing its machine-learning smarts. In addition to unveiling its new LaMDA 2 conversational AI model, the company also showed off a new app called AI Test Kitchen. 

The app offers three demos that showcase what LaMDA 2 can do. The first is a simple brainstorming tool that has the app help you imagine yourself in various scenarios. During the keynote demo, Google entered “I’m at the deepest part of the ocean” in response to the app’s “Imagine if” prompt. The app then spit out a short paragraph vividly describing the user in a submarine in the Mariana Trench.

Second, as a demonstration of the model’s ability to stay on topic, the app can hold a conversation with you about a subject and understand context. During the demo, the app started by asking “Have you ever wondered why dogs like to play fetch so much?” In response to simple follow-ups like “Why is that?”, the system replied with more information about dogs and their sense of smell.

Finally, AI Test Kitchen shows how LaMDA 2 can “break down a complex goal or topic.” This section is called List It, and users can ask things like “I want to learn ukulele” or “I want to grow a garden.” LaMDA will generate lists of subtasks to help you get started, and according to Google, may even offer ideas you might not have thought of. In addition to giving you the names of vegetables you can grow, for example, AI Test Kitchen might also give you a set of steps to take or weather conditions to consider. During the demo, the app offered a tip for users with limited space, sharing the types of plants that might thrive in smaller gardens. 

According to CEO Sundar Pichai, Google is using this app in part to gather feedback on its new AI model. It will open up access “over the coming months, carefully assessing feedback from the broad range of stakeholders — from AI researchers and social scientists to human rights experts.” Pichai said these findings will be incorporated into future versions of LaMDA. He added that, over time, the company intends to “continue adding other emerging areas of AI into our AI Test Kitchen.”
