Google's method of remotely bricking leaked/stolen phones (like the Pixel 7a that recently made the rounds) is actually open source!
They generate what's called a "brick OTA" that wipes the following partitions:
vbmeta
vbmeta_a
vbmeta_b
vbmeta_system_a
vbmeta_system_b
boot
boot_a
boot_b
vendor_boot
vendor_boot_a
vendor_boot_b
init_boot
metadata
super
userdata
This "brick OTA" is pushed to devices via GOTA (Google OTA) and can be installed on both "test-keys" and "release-keys" builds, but a serial number is required before a "brick OTA" can be installed on "release-keys" builds.
Many OEMs don't use GOTA, but a lot of them now do use update_engine, so Google's new automated "Android Brick OTA generator" tool might find some use outside of Google.
The Linux kernel's MGLRU feature will be enabled by default for all Android 14 kernels! (android14-5.15 and android14-6.1).
Benchmarks have shown that with MGLRU, overall app launch times improve, there are fewer overall process kills, kswapd CPU use decreases, etc.
"MGLRU has been tested and edge cases addressed on Android workloads; after which the MGLRU showed good results across various performance metrics. Enable the MGLRU as default memory reclaim in algorithm."
You can check if your kernel is compiled with MGLRU support (and whether it's enabled) with this command:

adb shell "cat /proc/config.gz | gunzip | grep 'CONFIG_LRU_GEN'"

CONFIG_LRU_GEN=y means it's available but not enabled by default, while CONFIG_LRU_GEN_ENABLED=y means it's enabled by default.
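You can also confirm whether MGLRU is actually active at runtime by reading its sysfs node (this assumes the standard upstream path; the value is a bitmask, so anything non-zero means MGLRU is in use):

adb shell "cat /sys/kernel/mm/lru_gen/enabled"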
Google has been A/B testing this feature for the past few months, making it something they can enable using a DeviceConfig flag that can be enabled server-side, but AFAIK they haven't enabled this feature on any devices in production, though all Tensor Pixels can support it.
For more details on MGLRU, check out this earlier thread.
Google has just released Android 13 QPR3 Beta 1! This is the first beta release for what will be Android 13's final QPR that will be released to users in June 2023.
This QPR will have the fewest user-facing changes of any Android 13 QPR, but it should be the most stable release of Android 13 yet.
You may see some of the minor QoL features/changes that were first seen in the Android 14 DP, so don't be surprised.
Follow this thread on Twitter to see what's new in Android 13 QPR3 Beta 1.
I got so bogged down in details that I sat on this story for months, but in the interest of the community, I'd like to confirm that the encoders for Qualcomm's proprietary aptX and aptX HD Bluetooth codecs are now part of AOSP.
Here's what this means for Android:
A few months back, I spotted a patch submitted to AOSP by a Qualcomm engineer titled "add encoder for aptX and encoder for aptX HD source code." The aptX and aptX HD codecs are proprietary to Qualcomm, so OEMs previously had to acquire them directly from Qualcomm.
I don't know what, if any, certification programs OEMs had to complete, or how much in licensing fees they had to pay, to obtain permission from Qualcomm to ship aptX and aptX HD encoders in their Android products. One article says that at least in 2014, there was a $6,000 one-time payment and ~$1 per-device fee for batches of up to 10k devices. That info came from Silicon Laboratories, a fabless semiconductor firm that designs many Bluetooth products.
The Wikipedia article on aptX, before my edit, falsely claimed that the aptX & aptX HD encoders were added to AOSP in Android 10 and could be freely used by OEMs. That was NOT true at the time. The claim was added in Dec. 2019 but was a misreading of a SoundGuys article on BT codecs.
What is true is that since Android 8.0, the Bluetooth A2DP stack added support for loading AAC, aptX, aptX HD, and LDAC codecs IF they were present in the build. As noted in the docs, "device manufacturers may need to obtain separate licenses and binary blobs for some proprietary audio codecs."
This is where I fell down a rabbit hole of licensing and patents. AAC isn't free? LDAC needs certification? Etc. I'm done with that rabbit hole. I even briefly tried using ChatGPT to help me draft some of this article, but the tone/voice just didn't feel right to me, lol.
Anyway, when it comes to aptX/HD, OEMs would obtain encoders compatible with AOSP right from Qualcomm.
E.g., on Pixel phones prior to Android 13 QPR2, you could find precompiled aptX and aptX HD encoders as shared libraries in /system_ext/lib64. Starting in Android 13 QPR2, those shared libraries are no longer there, as they are compiled statically into the Bluetooth APEX.
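If you're curious whether your particular build still ships them as standalone blobs, a quick adb check is enough (the grep pattern is a loose match; exact file names like libaptX_encoder.so vary by build and are an assumption on my part):

adb shell ls /system_ext/lib64 | grep -i aptx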
I believe that going forward, any OEM that ships the Bluetooth APEX in their AOSP-based project will have the aptX and aptX HD encoders available in their end product by default.
There is even a new MTS (Mainline Test Suite) test for this. Of course, just because there is source code for something doesn't make it open source. That depends on the license.
To clarify this, I reached out directly to Qualcomm a few months back, and got the following statement (for context, in early Nov. 2022):
"We made the decision a few months back to include the encoders for classic aptX and aptX HD in the Android Open Source Project. The technology is known the world over as THE superior audio codec for wireless BT audio, and we have worked closely with both Google and individual OEMs over the years to include these codecs in Android-based products. We are excited to say that under license from Qualcomm, these encoders are indeed now available under AOSP pursuant to the CLAs in place.

As aptX codecs continue to be the leader in delivering superior audio quality, and with our introduction of Snapdragon Sound in March 2021, Qualcomm is committed to ensuring we deliver premium audio experiences, the lowest possible latencies, and the best connectivity solutions available. As per our usual business processes, the licensing of aptX, Snapdragon Sound and the underlying technologies will continue to remain unchanged apart from contributing the aptX and aptX HD encoders to AOSP."

And just to be extra extra sure, I asked Qualcomm to explicitly name the license and what products are covered:
"There are inbound and outbound licenses to the project of course. Once officially approved by Google - which we expect in the coming days since these were just recently submitted - the encoders will be offered under the outbound AOSP (Apache) license.β
βThe purpose of the contribution is to enable people to distribute the encoders as part of their finished products. The only Qualcomm products included in this release for Android are aptX and aptX HD ENCODERS. All other aptX products require a license direct from Qualcomm."
So there you go, you no longer need to undergo certification or pay a licensing fee to Qualcomm if you want to include an aptX and/or aptX HD encoder in your ANDROID product, so long as you utilize the code in AOSP licensed under Apache v2.0.

This is great news for hobbyist custom ROM developers as well, as previously they'd have to just rip the shared libraries from a precompiled build (with questionable legality). Now you can just compile them from AOSP sources.
And yes, they do work as shared libraries if you change the blueprint to compile them as such.
Here's the source code in AOSP if you're interested:
Encoder for aptX | Encoder for aptX High Definition
Since I know most of you have devices that support aptX and aptX HD already, you're wondering why you should care. Well, for most users, this doesn't matter. It's a change that affects OEMs for the most part.
Surprisingly, there are some devices that don't support both aptX and aptX HD, though. My NVIDIA SHIELD TV, for example, only supports aptX but not aptX HD.
Several users are reporting that the "ring & notification volume" sliders have been suddenly split into "ring volume" and "notification volume" in Android 14 DP2. This is WITHOUT them flipping any flags, which was previously required to enable separate ring/notification volume.
Google added this feature in Android 13 QPR2, as I first reported in December. The feature is gated by a DeviceConfig flag ("volume_separate_notification" under the "systemui" namespace), but it appears Google remotely toggled the flag for users on DP2.
If you are on Android 13 QPR2 or later, you can manually enable this feature by sending the following shell command:
device_config put systemui volume_separate_notification true
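If you're sending this over adb from a computer, the same command wrapped in adb shell works, and deleting the override reverts you to the default behavior (these are just the standard device_config subcommands, nothing specific to this feature):

adb shell device_config put systemui volume_separate_notification true
adb shell device_config delete systemui volume_separate_notification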
It's possible that Google flipping this flag was a mistake and it will be reverted soon. It's hard to tell, because it was a server-side change with no announcement from Google. If Android 14 adds a settings toggle to enable/disable linking the volumes, then we'll know it was intentional.

You can view the code changes implementing this feature in AOSP: [1] [2] [3]
Google could also add a settings toggle in a future Android 13 QPR3 beta, since it's not like this feature requires making any changes to the API surface which is frozen. I'll keep an eye out.
Please keep this feature around and add a setting toggle for it, Google. People clearly want this feature!
---
EDIT: The DeviceConfig flag was removed in the stable QPR2 release. It is present in QPR3 Beta 1 and Android 14 DP1/DP2, however. H/T Aswin A S
Google Play is unifying the dates for its target API level requirements.
Starting August 31, 2023:
- New apps and app updates must target API level 33 (Android 13) to be submitted to Google Play. (Wear OS apps must target API level 30 [Android 11])
- Existing apps must target API level 31 or above to be discoverable by all users on Google Play. Apps that target API level 30 or below will only be discoverable on devices running Android versions the same as or lower than the app's target API level. (Wear OS apps must target API level 29 or above to remain discoverable.)
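If you're not sure what API level an installed app currently targets, dumpsys can tell you (com.example.app is a placeholder package name):

adb shell dumpsys package com.example.app | grep targetSdk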
If you need more time to get ready for this change, you can request an extension to November 1, 2023. You'll be able to access the form to request an extension through the Play Console later this year.
Previously, new apps had to target current_release-1 by Aug. 1 of the current year, while app updates had to target current_release-1 by Nov. 1. Now those two dates are unified, as is the deadline for Google's new app discoverability requirement.
The app discoverability requirement was supposed to go into effect on Nov. 1st of last year but got delayed to Jan. 31 of this year. Now it's starting Aug. 31 of this year, but some users already report they're unable to see apps targeting older versions.
In case you missed it, Google announced this change in an email sent to developers on Google Play yesterday. More details and a FAQ can be found in this support post.
Xiaomi, OPPO, and Vivo are collaborating to make setting up a new Android phone easier when you switch brands. Phones from these brands will support migrating "system data" like "photos and contacts" and - more importantly - third-party app data between one another.
This collab was announced by Xiaomi on Weibo (H/T @AndroidPolice), so it's unclear if this will apply to each company's respective global software versions. Still, setting up a new Android phone can be a pain, especially when you're switching brands, so this is a welcome change.
When you buy a new phone from the same brand that made your old one, chances are that brand's proprietary data transfer tool will move almost everything over. That's not the case when switching brands, as only a limited set of data (usually excluding 3P app data) is transferred.
Full data backup and restore is something Android has lacked for a LONG time, but it's not that Android can't do it (GMS certainly could, and Titanium Backup has existed since forever)...the hard part is ironing out the complications that arise when it is done.

Lots of apps make assumptions or have configs that only apply to certain kinds of devices or devices from specific brands. Or there's a device-specific token that needs to be reissued (just look up threads on broken push notifications after restoring data using Titanium Backup).
Since the source code for Android 13 QPR2 was released, let's take a look at some of the under-the-hood and lesser-known changes in this thread.
Android has supported live wallpapers since Android 2.0 Eclair was released in 2009, but finally with Android 14 in 2023, you may be able to set a different live wallpaper for the home and lock screen.
Details here.
Work profile users, heads up: In Android 14, screenshots you take of work apps will now be saved to the work profile instead of your personal profile!
You may see this "Saved in [Files] in the work profile" notice after taking your first screenshot of a work app after updating.
This change is already live in Android 13 QPR3 Beta 1 and Android 14 DP2, but since very few OEMs merge QPRs, most work profile users won't see this change until Android 14.
This change was first implemented in Android 13 QPR2, but it's gated by the SCREENSHOT_WORK_PROFILE_POLICY flag, which is disabled by default in Android 13 QPR2/Android 14 DP1. You can see the code changes for the new work profile screenshot feature here.
As for why screenshots of work profile apps used to be saved in your personal profile's storage, it's because of the way SystemUI (prior to QPR3/DP2) handled screenshots.
Since your personal & work profiles share the same SystemUI instance (profiles are tied to a parent user), SystemUI needed to know which profile the app belonged to when deciding where to save the screen capture. That logic has been implemented in QPR2+.
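For the curious, the work profile is just a separate Android user under the hood; you can see it (and the user ID whose storage these screenshots now land in) with a standard user listing, though the output format varies a bit by device:

adb shell pm list users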
Android's share sheet is getting a handful of improvements in Android 14. The biggest one is a dedicated row for app-defined actions, which should hopefully convince some apps to ditch their custom share sheets and use the OS's!
More details here.
As I mentioned earlier, Android 14 adds a new feature that lets Device Controls providers specify an activity to embed in the Device Controls interface. The latest beta version of the Google Home app (H/T 9to5Google) adds support for this new feature. Attached to this post are some screenshots.
Google Home's custom embedded Device Controls activity is shown whether you access Device Controls through the lock screen shortcut or the Quick Settings tile.
So what does this mean for end users? It means that smart home apps (or really, any app that uses Device Controls, like João Dias's Tasker) can embed a nicer, more custom UI with more useful actions, instead of being limited to a couple of buttons with a preset UI.
Although the API in question (ControlsProviderService#META_DATA_PANEL_ACTIVITY) is public starting with the Android 14 SDK, it also works on Android 13 QPR3 Beta 1, because the SysUI flag guarding it (USE_APP_PANELS) became a "ReleasedFlag" (i.e. it defaults to true). Part of the code for it has also been in AOSP since Android 13 QPR2's source drop.
In Android 13 QPR3 Beta 1, currently ONLY the Google Home app is allowed to embed a custom activity in SystemUI's Device Controls interface.
This is because only packages declared in the config_controlsPreferredPackages array can declare activities for use as a panel, and this array currently only holds the package name for the Google Home app (com.google.android.apps.chromecast.app) on Pixels' Android 13 QPR3 Beta 1 builds.
A separate SysUI flag (APP_PANELS_ALL_APPS_ALLOWED) overrides this to allow panels from all apps; it is disabled by default in Android 13 QPR3 Beta 1 but enabled by default in Android 14 DP2 (which makes sense, since this is now a public API in Android 14).
CameraX will soon support concurrent camera streams, allowing you to record from two cameras simultaneously at 720p or 1440p resolution.
The Camera2 API for concurrent camera streaming was introduced in Android 11, so it's nice to see support finally extend to CameraX as well!
Google tested CameraX's Concurrent Camera API on the Pixel 6 Pro, Samsung Galaxy S21, and OnePlus 7 Pro. But there are many more models that support concurrent camera streams.
According to the Google Play Console's Device Catalog, 179 device models declare android.hardware.camera.concurrent, the feature indicating that the vendor correctly configured the front and back cameras to support concurrent camera streaming.
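If you want to check whether your own device declares that feature, the package manager can list it for you (the grep is just a convenience filter):

adb shell pm list features | grep camera.concurrent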
(A great read on the concurrent camera streaming API: "Can We Use the Front & Back Cameras at the Same Time on Android?" on droidcon.)