Google has cancelled its Chromebook motherboard reference designs codenamed Agah, Hades, and Herobrine, all of which featured discrete GPUs. The move signals that the company may have cancelled its higher-end gaming Chromebooks project altogether, reports Ars Technica. Of course, it is possible that the company cancelled these parts to focus on something else, but it does not look like Chromebooks with standalone GPUs are around the corner.
“Herobrine, Hades, and Agah are all cancelled,” a developer wrote in a comment highlighted in an About Chromebooks post. “The infra (overlays, builders, etc.) have already been shut down for them. Delete.”
Google initiated a project to design gaming-centric Chromebooks back in 2021 in a bid to grab another part of the enthusiast PC market. Although these devices boasted things like customizable keyboards with RGB lighting effects and displays with high refresh rates, their reliance on integrated GPUs predestined their usage primarily for cloud gaming platforms, like Nvidia’s GeForce Now and Microsoft’s Xbox Cloud Gaming.
Excitement grew in late 2022 when signs of a Chromebook aimed at local gaming emerged. A reference motherboard codenamed Hades came with a discrete GeForce RTX 4050 GPU, the chip that powers entry-level Windows-based gaming laptops. In addition, Google developed the Agah and Herobrine reference designs with standalone Nvidia graphics processors. With the designs cancelled, Chromebooks based on these boards will not make it to market. Unless, of course, some prominent Chromebook backers decide to proceed with a Chromebook featuring a discrete GPU themselves.
There undeniably existed a potential market for these gaming Chromebooks. With their widespread use in educational settings, students could have benefited from a game-capable Chromebook running a familiar platform. However, the project would have inherited common gaming laptop peculiarities, such as higher power consumption and shorter battery life, the latter being crucial for students.
But gaming on ChromeOS is riddled with obstacles. Google managed to adapt the Linux-based Steam client for ChromeOS, and thanks to ChromeOS’s Linux underpinnings, it can run numerous Windows games through the Proton compatibility layer. Yet there were hurdles. As of November 2022, crucial software like the anti-cheat systems for several online games remained incompatible. Other issues, such as performance lags on 2560×1440 and 4K displays and the need for storage workarounds for certain game installations, plagued the system.
In fact, the complexities of porting Windows games to Linux and Linux applications to ChromeOS would have made these machines considerably less appealing than traditional Windows laptops for gamers. This, combined with an overall dip in the PC market and the intricacies of integrating Nvidia GPU drivers into ChromeOS, may have influenced Google’s decision to abandon the endeavor.
OLED displays are becoming more commonplace in the world of laptops. They promise richer and more saturated color and deliver inky blacks unmatched by other panel types offered on laptops. So, color me surprised when the Acer Swift Edge 16 made its way into the Tom’s Hardware lab, complete with a 16-inch OLED display with a price tag of less than $1,300.
However, it’s not just the display and price that impress; the Swift Edge 16 comes with a potent Ryzen 7 7840U processor, a comfortable keyboard with a number pad, the rare inclusion of a 1440p webcam, and a thin, lightweight chassis that makes it an easy travel companion compared to the best ultrabooks and premium laptops.
Acer Swift Edge 16 Specifications
CPU
AMD Ryzen 7 7840U
Graphics
AMD Radeon 780M Graphics (Integrated)
RAM
16GB LPDDR5-6400 (non-upgradeable)
Storage
1TB NVMe PCIe 4.0 x4 M.2 SSD
Display
16-inch, OLED, 3200 x 2000, 120Hz
Networking
Wi-Fi 6E, Bluetooth 5.1
Ports
2x USB 4 Type-C, 2x USB 3.2 Gen 1 Type-A, 1x HDMI 2.1, 1x 3.5 mm jack, 1x micro SDXC slot
Camera
1440p
Battery
54 WHr
Power Adapter
65W
Operating System
Windows 11 Home
Dimensions (WxDxH)
14.08 x 9.68 x 0.60 inches (357.5 x 245.9 x 15.24 mm)
Weight
2.73 pounds (1.24 kg)
Price (as Configured)
$1,299.99
Design of the Acer Swift Edge 16
“Sleek” and “thin” are the first two words that sprang to mind when I first took the Swift Edge 16 out of the box. The laptop is just 0.6 inches thick with the lid closed, allowing it to slip into a bag easily. That thinness is accompanied by a total weight of 2.73 pounds, which is roughly half a pound lighter than the 15-inch Apple MacBook Air.
That difference in weight between the two machines comes down to the materials chosen by Apple and Acer, respectively. The former uses an aluminum unibody that exudes quality and sturdiness, while the latter uses a cheap, thin plastic upper chassis and an equally thin aluminum lower chassis cover. As a result, the chassis flexes easily and doesn’t inspire confidence in long-term durability.
(Image credit: Tom’s Hardware)
I lightly pressed on the palm rest directly to the right of the touchpad and watched as the entire keyboard deck dipped down at least an eighth of an inch in response. The sub-3-pound weight is admirable, but it’s immediately apparent what sacrifices were made to achieve this figure.
The right side of the chassis is home to a USB 3.2 Gen 1 Type-A port, headphone jack and a microSD card slot. The opposite side gives you an additional USB 3.2 Gen 1 Type-A port, two USB4 Type-C ports and an HDMI 2.1 port. The Swift Edge 16 charges via one of the two USB-C ports using a 65-watt power adapter. Unlike competing USB-C chargers from Apple and Samsung, Acer’s power brick and USB-C cord are attached instead of being two separate pieces.
A full-size keyboard sits front and center, with a narrow number pad off to the side. A 3.4 x 5.1-inch touchpad sits off-center below the keyboard.
However, the star of the show is undoubtedly the 16-inch OLED display, surrounded by relatively thin bezels all around. This is a 120 Hz panel that is VESA DisplayHDR True Black 500 certified. Like many OLED panels, this one features a glossy coating, which is meant to improve the clarity of the picture. While that may be the case, the glossy finish was a magnet for reflections in my home office. Sitting above the display is another surprise: a 1440p webcam.
The Swift Edge 16 measures 14.08 x 9.68 x 0.60 inches, giving it a slightly larger footprint than the 15-inch MacBook Air. It’s also marginally larger dimensionally than the Lenovo Yoga 9i, a 3.09-pound, 14-inch convertible. The 15-inch Microsoft Surface Laptop 5 measures 13.4 x 9.6 x 0.58 inches and weighs 3.4 pounds.
Acer Swift Edge 16 Productivity Performance
The Swift Edge 16 uses an AMD Ryzen 7 7840U, an octa-core processor with a 3.3 GHz base clock and a 5.1GHz turbo clock. Acer configures the laptop with 16GB of LPDDR5 memory and a 1TB SSD.
(Image credit: Tom’s Hardware)
Starting with the Geekbench synthetic CPU benchmark, the Swift Edge 16 virtually tied with the MacBook Air (M2, 16GB RAM) in the single-core benchmark (1,899). However, in the multi-core test (9,624), the Swift Edge 16 bested the MacBook Air and came in second to the Yoga 9i (Core i7-1360P, 16GB RAM), which scored 9,954.
In our file transfer test, which copies 25GB of files, the Swift Edge 16 hit 1,269.79 MBps. That was good enough for third place behind the MacBook Air (1,342.38 MBps) and the Yoga 9i (1,669.29 MBps). The Surface Laptop 5 (Core i7-1265U, 16GB RAM) wasn’t a contender, delivering just 532.52 MBps during the test.
The Swift Edge 16 took an easy win in our Handbrake test, which involves encoding a 4K video to 1080p. The laptop completed the task in just 7 minutes and 18 seconds, putting it 28 seconds ahead of the MacBook Air. The Yoga 9i was the laggard in this group, requiring 9:45 to finish.
(Image credit: Tom’s Hardware)
In our Cinebench R23 stress test, we perform 20 runs and record the scores. The Swift Edge 16 started the test with a score of 10,575.95, declining significantly from there as heat ramped up. Scores leveled off in the 6,600 to 6,800 range after about seven runs. During the test, the Ryzen 7 7840U chugged along at 2.04GHz and 58.91 degrees Celsius (138.03 degrees Fahrenheit).
Display on the Acer Swift Edge 16
The Swift Edge 16 features an OLED display, which is a welcome inclusion in the sub-$1,500 price category. As the laptop’s name implies, the panel measures 16 inches across and has a 3,200 x 2,000 resolution (a 3:2 aspect ratio, which we’re used to seeing in Microsoft’s Surface family) and a 120Hz refresh rate. Acer claims that the panel is VESA DisplayHDR True Black 500 certified.
I found the display to be bright enough for typical use, but the glossy coating couldn’t shake the reflections I saw. Not only could I see my reflection staring back at me (particularly apparent with dark content) in my home office, but the effect was only exacerbated outside in the sunlight.
(Image credit: Tom’s Hardware)
With that said, when the lighting was ideal, the rich colors and inky blacks of the OLED panel were a joy to behold. Although I went to see The Super Mario Bros. Movie in the theater, its recent digital release meant that I had to purchase the movie for my kids to watch over (and over, and over again). This was the perfect movie to showcase the Swift Edge 16’s display, with its rich color palette (particularly when Mario visits the Mushroom Kingdom). Likewise, Luigi’s descent into the bowels of the Dark Lands highlighted the excellent contrast.
Our instrumented testing confirmed what my eyeballs were seeing, with the Swift Edge 16 hitting 139.2 percent of DCI-P3 and 200 percent of sRGB (closely matching the performance of the OLED in the Yoga 9i). Brightness was also second only to the MacBook Air, coming in at 387 nits.
Keyboard and Touchpad on the Acer Swift Edge 16
The keyboard on the Swift Edge 16 felt good under my fingers, providing comfortable feedback with quiet operation. The feel of the keys closely approximates that of current-generation MacBook keyboards. Given the productivity-centric purpose of the laptop, there are no RGBs found here, although there is a soft white light that glows beneath the keys (and turns off after about 30 seconds to save power).
(Image credit: Tom’s Hardware)
A narrow number pad, which is an increasingly common feature for 15- and 16-inch class laptops, is available for quick number entries. I appreciate its inclusion for making quick calculations and entering data into spreadsheets.
My go-to for typing tests is keyhero.com, and I managed to type at 85.03 words per minute with 97.91 percent accuracy. For comparison, I averaged 75 wpm and 90.61 percent accuracy on the similarly-sized Samsung Galaxy Book 3 Ultra using the same test, so I much preferred this keyboard.
Audio on the Acer Swift Edge 16
The Swift Edge 16 has two speakers, backed by Acer TrueHarmony technology. Acer claims that TrueHarmony provides more powerful, “headphone-like audio,” but that wasn’t my experience.
The audio was as lifeless as you’d expect with a laptop this thin and light. The speakers weren’t overly loud even with the volume cranked to 100 percent. With that said, they were distortion-free at this volume. I dialed the volume back to about 60 percent and played Guns N’ Roses’ “November Rain.” No matter the volume level, the audio sounded thin and hollow from the downward-firing speakers. OK, maybe it’s a tall order to expect perfection when reproducing audio from one of the greatest rock songs of the 1990s on a $1,300 laptop, especially one that is so svelte, but non-existent bass and no attention to the highs leaves much to be desired.
If you want to use the Swift Edge 16 to handle video calls or watch YouTube videos while doing other tasks, the speakers will suit you just fine. However, grab a pair of good headphones if you want to actually enjoy music.
Upgradeability of the Acer Swift Edge 16
The Swift Edge 16’s internals are accessible by removing six T6 screws on the bottom of the chassis. With the screws removed, I stuck my thumbnail near one of the USB ports and carefully pried off the panel.
Unfortunately, there’s not much to see once inside; the battery is replaceable, as is the M.2 SSD (although there is only one M.2 slot for storage). The Wi-Fi 6E/Bluetooth combo card is also replaceable if you so choose. The LPDDR5 memory is soldered onto the motherboard, so there are no provisions for upgrading your memory from the standard 16GB in the future. This is, unfortunately, increasingly common on thin PCs.
Battery Life on the Acer Swift Edge 16
Among its competitors, the Swift Edge 16 wasn’t a standout in our endurance test (web browsing, light graphics work and video streaming while connected to Wi-Fi, with the display brightness set to 150 nits). It finished at the back of the pack (7:18).
(Image credit: Tom’s Hardware)
The Surface Laptop 5 lasted nearly two hours longer, while the Yoga 9i gave up after 10 hours and 32 minutes. However, the MacBook Air was the endurance champion, lasting almost 15 hours on a charge.
Heat on the Acer Swift Edge 16
We take the skin temperature of ultrabooks while running the Cinebench R23 stress test to see how they deal with heat. During the test, the Swift Edge 16’s fans spun up noticeably to exhaust heat generated by the Ryzen 7 7840U. The aluminum bottom panel of the chassis was warm to the touch but was not uncomfortable to rest on my lap.
I measured 39.9 degrees Celsius (103.82 degrees Fahrenheit) between the G and H keys, while the touchpad was much cooler at 29.4 C (84.92 F). The bottom of the laptop saw the hottest temperatures, reaching 51.5 C (124.7 F) towards the left-rear corner.
Webcam on the Acer Swift Edge 16
With most new laptops coming out with 1080p webcams, it’s becoming harder to notice any significant differences between them. However, Acer is trying to one-up the competition with a 1440p webcam on the Swift Edge 16.
I didn’t expect a dramatic difference in quality compared to the 1080p competition, and my results confirmed that hypothesis. Still, the details were incredibly crisp, the colors looked spot-on, and image noise was practically non-existent.
Acer doesn’t offer a physical privacy shutter for the webcam or a hotkey to deactivate it quickly. The webcam also isn’t Windows Hello compatible. You also won’t find other biometric security features like a fingerprint reader.
Software and Warranty on the Acer Swift Edge 16
Acer goes easy on the installed apps, as there isn’t much bloatware here. There’s an Acer product registration app, and the handy Acer Purified Voice Console allows you to configure microphone settings. Acer Care Settings monitors your system vitals (SSD, memory, battery health), performs system updates and can backup/restore your system – it’s a nice one-stop-shop for configuring the laptop. The only other preinstalled app from the factory is the venerable Realtek Audio Console.
Of course, you can expect to find the usual app shortcuts placed by default with each Windows 11 Home install, like Kindle, WhatsApp, and Spotify.
Acer provides a one-year manufacturer warranty with the Swift Edge 16.
Acer Swift Edge 16 Configurations
Our review unit features an AMD Ryzen 7 7840U processor, 16GB of RAM, a 1TB PCIe 4.0 SSD, and a 16-inch 3,200 x 2,000 OLED display. This model is exclusive to Best Buy and priced at $1,299. However, this configuration frequently goes on sale and was recently available at Best Buy for as low as $1,099. There’s also an older SKU with a 3840 x 2400 OLED display, Ryzen 7 7735U processor and a fingerprint reader for $1,299 (although it is often on sale at Costco for $899).
Bottom Line
The Acer Swift Edge 16 has a lot going for it. It’s lightweight and thin for this class, the 16-inch screen is gorgeous, and performance was strong across the board. Even better, the $1,299 price tag ($1,099 if you can grab it on sale) is quite compelling for a large-screen OLED laptop.
On the other hand, the speakers are subpar, the quality of the materials (particularly the keyboard deck) could be better, and battery life ranked at the bottom of the pack. The first issue is easily solved with a pair of headphones, but the second means the laptop doesn’t feel as premium, or as durable, as other laptops in its class. As for endurance, seven hours of battery life is disappointing, especially compared to the 10-hour showing of the Lenovo Yoga 9i and the nearly 15-hour runtime of the 15-inch Apple MacBook Air.
The Yoga 9i is a fine alternative (if you like the convertible form factor and don’t mind the 14-inch display) due to its similar performance, sturdier construction, and longer battery life. The laptop also regularly goes on sale for $1,149, making it an even more compelling choice. The Swift Edge 16 still has a lot to offer, but perhaps a future update can address our battery life and durability concerns.
Even with proper bed leveling and one of the best 3D printers, your 3D prints sometimes won’t stick well to the printer bed, whether because of the nature of the design you are printing or the type of material you are using. One way to address this issue is to use a brim.
A brim is an additional thin material you add to your design in the slicer to improve adhesion between the printed object and the 3D printer bed. It achieves this by increasing the contact area between the object and the build plate, providing better stability and preventing warping or detachment during printing. A brim is not intended to be a permanent part of the final 3D printed object. Once you finish 3D printing, you can remove it.
Benefits of 3D Printing Brim
Some of the benefits of a brim in your designs include:
Improved adhesion: As we have mentioned above, the primary purpose of a brim is to enhance adhesion between the print and the build platform, reducing the likelihood of issues resulting from poor adhesion. Also, you will not need adhesion aids like glue sticks or hairspray to help the print attach to the bed.
Stability: For objects with complex geometries and overhangs, adding the brim will act as an anchor, preventing those parts from detaching or becoming misaligned.
Minimized warping: A brim helps counteract the effects of differential cooling, which can cause warping in materials like ABS. The extended surface area improves bonding, reducing the risk of warping.
Adaptability to print speeds: A brim is particularly valuable when printing at higher speeds, as it helps counteract the increased forces that can make the print more prone to lifting or shifting.
Cleaning up the design is easier: You can peel away or trim the brim without affecting the design itself, making post-processing, especially on the bottom section, straightforward. Any remaining traces can also be cleaned off the object without damaging the part, unlike when it was 3D printed without one.
Not all 3D prints require a brim, but there are specific situations where its use is essential. You need it when:
1. 3D Printing With Materials That Are Prone to Warping
(Image credit: Tom’s Hardware)
Some materials, like ABS, are prone to warping due to their thermal properties, specifically the rate at which they cool and contract. The temperature differences cause the corners of the print to lift, leading to a failed or distorted object; sometimes the design can detach from the bed completely. If you are using such a material, you should consider using a 3D printing brim, as it provides extra adhesion around the base of the print.
2. When 3D Printing Objects With Overhangs
(Image credit: Tom’s Hardware)
If your 3D print has overhangs or other delicate parts in the initial layers, you should use a brim, as those parts tend to detach from the build plate. A brim will extend the contact area of these sections and make them stable.
3. 3D Printing Parts With Small Base Areas
(Image credit: Tom’s Hardware)
Parts with small base areas might have issues adhering properly to the build plate as the 3D printer operates and vibrates. So adding a brim can make a big difference as it will increase the contact surface between the print and the platform and distribute adhesion forces more evenly.
So even if forces are applied to the design from the nozzle, especially when it reaches the top areas, it will be difficult for it to detach. You can also adjust the thickness of the brim so that it can be thicker and hold your design firmly.
4. 3D Printing at High Speeds
When printing at higher speeds, the increased forces acting on the design can make it detach from the bed, particularly during the initial layers. Using a 3D printing brim in such scenarios is a valuable strategy as it will provide an additional surface area for adhesion, distributing the forces more evenly and reducing the chances of lifting.
5. Multiple Small Objects
(Image credit: Tom’s Hardware)
When 3D printing multiple small objects, there’s a higher risk of them detaching because of the forces that act on them during the print. A brim provides additional adhesion points for each object, improving stability.
Also, a brim can help ensure uniformity in the first layer of each small object, which is important for achieving consistent quality across all objects in the print. This is particularly beneficial when printing objects that require precise dimensions or alignment.
How to Add a Brim to Your 3D Prints
You can use any 3D printer slicer to add the brim to your 3D prints. If you are using Cura slicer, you can follow the steps below.
1. Import the 3D model you want to print.
2. Navigate to build plate adhesion from the print settings section and choose Brim.
(Image credit: Tom’s Hardware)
3. Adjust brim settings to customize it further to fit your needs. The following settings are available:
Brim minimum width: This determines the minimum length of filament that will be used when 3D printing the brim. The default value of 250mm is usually sufficient.
Brim width: Determines how wide the brim should be in millimeters. A wider brim adheres better to the printer bed.
Brim line count: Sets how many lines of brim will be printed around the base of the model. This setting also affects the width of the brim.
Brim distance: Determines the horizontal distance between the first layer of the print outline and the first brim line. In most cases, you should leave it at 0mm.
Brim only on outside: Enable this option to print the brim only on the outside of the model, which makes it easier to remove after 3D printing by reducing the amount of brim to be removed.
Brim inside avoid margin: This setting removes any brim within the specified distance of an enclosed part, so that an outer brim doesn’t touch the inside of another part. Start with the default value of 2.5mm and see if it works for your design; if not, adjust it.
Smart brim: When enabled, this improves brim removal by swapping the print order of the innermost and second-innermost brim lines.
4. Review and slice: Review all your settings, including the brim settings, and ensure everything is okay before exporting your G-code file after slicing.
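To get a rough feel for how brim width and line count interact, here is a small Python sketch. This is purely illustrative math, not slicer code, and it assumes each brim line is exactly one extrusion width (typically 0.4mm) wide and that successive lines are concentric offsets around the model:

```python
import math

def brim_lines(brim_width_mm: float, line_width_mm: float = 0.4) -> int:
    """Approximate number of brim lines a slicer prints for a requested
    brim width, assuming each line is one extrusion width wide."""
    return math.ceil(brim_width_mm / line_width_mm)

def brim_path_length(model_perimeter_mm: float, line_count: int,
                     line_width_mm: float = 0.4) -> float:
    """Rough total toolpath length of the brim. Each successive
    concentric loop wraps the model at a slightly larger offset,
    adding roughly 2*pi*line_width of extra circumference per step."""
    total = 0.0
    for i in range(line_count):
        total += model_perimeter_mm + 2 * math.pi * line_width_mm * (i + 1)
    return total

# Example: Cura's default 8 mm brim width with a 0.4 mm line width
count = brim_lines(8.0)                      # 20 lines
length = brim_path_length(100.0, count)      # toolpath for a 100 mm perimeter
print(count, round(length, 1))
```

This also shows why the brim width and brim line count settings overlap: specifying one effectively determines the other once the line width is fixed.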
How to Achieve a Good 3D Printing Brim
Sometimes your brim might have issues, like not sticking correctly to the bed, or it might be challenging to remove from the print because it is either too thick or too thin. Consider the following to solve those issues.
1. Use Appropriate Brim Settings
(Image credit: Tom’s Hardware)
You can adjust the settings highlighted above until you get the most appropriate one for your 3D prints. For example, you can choose a brim width that provides sufficient adhesion while not being excessive.
If there are adhesion issues, consider increasing the brim line count to provide more support by printing more lines around the object. However, don’t add too many, as that can make removal challenging. Additionally, ensure the smart brim setting is enabled, as it makes the brim easier to remove.
2. Properly Level Your 3D Printer Bed
(Image credit: Tom’s Hardware)
Bed leveling is often the first culprit most of us consider when facing any 3D printer issue, as it plays a crucial role in the quality and success of the first layer. If it isn’t level, some areas might be too close to the nozzle, while other areas might be too far away, leading to poor adhesion or even detachment of the brim. So before you start 3D printing, you should properly level your 3D printer bed.
3. 3D Printer Material You Use
Different materials may require adjustments to brim settings. Materials like ABS are prone to warping and need a wider brim or an increased line count to print well. A wider brim adds more lines around the edges of the part, helping create a stronger, more stable foundation. For other materials, like PLA, you can dial the brim settings back.
4. Use the Correct Temperature
Bed temperatures that are too high or too low can also affect your brim. Too high, and the brim can adhere too firmly to the build surface, making removal more challenging and increasing the risk of damaging the main print when trying to detach it.
High temperatures can also overheat the filament and make it too fluid, which can deform the brim, causing it to lose its intended shape and compromising the accuracy of the first layers.
If the bed temperature is too low, adhesion between the brim and the build surface will be weak. So it’s important to choose the optimal 3D printing temperature for your specific filament to achieve the perfect print.
Venerable shareware archiving app WinRAR has recently been patched to fix an alarming flaw. The update to WinRAR version 6.23, spotted by Bleeping Computer, fixes the high-severity vulnerability tracked as CVE-2023-40477. In brief, earlier versions of WinRAR were vectors for arbitrary code execution if an attacker could tempt the user into opening a specially crafted RAR file.
If we look at the Zero Day Initiative’s description of the now-patched WinRAR flaw, it explains the following:
The vulnerability allowed remote attackers to execute arbitrary code,
The flaw was due to the program’s handling of recovery volumes,
The flaw stemmed from the application’s improper validation of user-supplied data,
This meant hackers could access memory beyond the end of an allocated buffer for their dastardly deeds, but…
Importantly, a user would have to visit a disguised malicious page or open a file to fall victim to hackers.
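To illustrate this bug class in the abstract, here is a minimal Python sketch of a hypothetical toy record parser (not WinRAR’s actual code) where a length field taken from attacker-controlled data must be validated against the data actually present:

```python
import struct

def parse_record(buf: bytes) -> bytes:
    """Parse a toy archive record: a 2-byte little-endian payload
    length followed by the payload itself."""
    (claimed_len,) = struct.unpack_from("<H", buf, 0)
    payload = buf[2:2 + claimed_len]
    # This check is the "proper validation of user-supplied data"
    # the ZDI advisory refers to. In a C parser, trusting claimed_len
    # and copying that many bytes into a fixed-size buffer writes
    # past the end of the allocation (an out-of-bounds write).
    if claimed_len != len(payload):
        raise ValueError("record length exceeds available data")
    return payload

print(parse_record(b"\x03\x00abc"))   # header claims 3 bytes, 3 present
# parse_record(b"\xff\xffabc") raises ValueError: the header claims
# 65535 bytes of payload but only 3 are actually in the buffer.
```

Python slicing fails safe here; the point is that a native-code parser without the explicit check does not.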
(Image credit: Zero Day Initiative)
Security researcher “goodbyeselene” is credited with discovering the WinRAR flaw described in CVE-2023-40477. They reported the vulnerability to WinRAR’s developers in early June. News of the flaw was published (August 17) several days after version 6.23 had become available for download (August 2), giving users plenty of time to update first.
In the WinRAR v6.23 release notes, we see CVE-2023-40477 described as “a security issue involving out of bounds write is fixed in RAR4 recovery volumes processing code.” It doesn’t look like this was the only vulnerability squashed, however: v6.23 also fixes an issue where WinRAR could be steered to “start a wrong file” after a user double-clicked an item in a specially crafted archive.
Is WinRAR Doomed?
Back in May, we covered the news that Windows would be adding native RAR support in a future update, similar to how it currently handles .zip files. This Windows 11 File Explorer enhancement is delivered thanks to the folding-in of the open-source libarchive project. With libarchive integration, Windows should be able to (de)compress many more archive formats, like lha, pax, tar, tgz, and 7z. Though devs/testers can dabble with native RAR support now, it is only expected to arrive for mass consumption starting next month.
WinRAR has put a brave face on the fact that Windows 11 is soon to get integrated support for this popular archiving format. Of course, a Windows integrated RAR archive context menu isn’t going to replace a fully featured app like WinRAR and all its archive processing options.
The explosion of AI is further heightening demand for storage performance and capacity as organizations feed models and databases with unprecedented amounts of data, meaning the next generation of storage technologies will need to deliver even greater performance, density and capacity than ever before.
Supermicro’s fourth annual Open Storage Summit brings together leading storage experts from across the industry, including drive manufacturers, compute component manufacturers, software developers and, of course, Supermicro’s industry-leading system architects, to discuss the latest in storage technologies and how they will solve tomorrow’s data challenges from the data center right out to the intelligent edge.
This year’s Summit includes a roundtable keynote session followed by five focus sessions, with guests from the storage industry’s leading players including Intel®, AMD, NVIDIA, Micron, Kioxia, Solidigm, and Samsung, as well as Supermicro’s storage software partners.
New Innovations For Storage Performance
(Image credit: Supermicro)
Organizations remain hungry for ever-denser data storage systems that satisfy their burgeoning data requirements, but they also need to ensure that such systems remain physically and environmentally stable. Ongoing advancements in flash are providing businesses with new options that can help them solve critical business challenges while reducing the total cost of ownership (TCO).
During this kickoff session for the 2023 Open Storage Summit, you will discover how technical advances in flash storage, including the introduction of the E3.S form factor and Compute Express Link (CXL), bring a new evolution to what is already a revolutionary technology. In addition, you’ll also learn how Supermicro is incorporating these new technologies into the latest generation of systems as well as some of the challenges encountered along the way.
During this segment, you will:
Be introduced to new features and improvements in E3.S and learn how they can help transform your business
Get a peek at the future capacity roadmap for flash, including new form factors such as E3.S, the path to even higher drive capacities, and what’s in store for subsequent generations of storage
Discover how storage innovations increase performance and lower TCO
Learn about how CXL can breathe new performance life into critical business applications
Register for upcoming webinars
Join the discussion! Register now for full access to the storage industry’s leading online event to get the latest on key storage trends as well as exclusive look into the future of high performance storage from the most influential minds in the industry.
Register now to get the latest on key storage trends and enter for a chance to win a $250 Amazon gift card.
A research team with the University of California, Berkeley, reconstructed Pink Floyd’s iconic song “Another Brick in the Wall, Part 1” purely by decoding listeners’ brainwaves. Led by Dr. Robert Knight and published in the journal PLOS Biology, the feat showcases how good humanity has become at decoding information that should be the last bulwark of privacy. In the future, someone with access to this technology wouldn’t even have to pay the proverbial penny for your thoughts: they’d just be able to read them as easily as one of those NYU ad walls.
The research, which took place between 2012 and 2013, involved attaching electrodes to the brains of 29 patients with epilepsy (unlike other approaches, the electrodes had to be connected directly to the patients’ brains, meaning invasive surgery was required). The electrodes then captured the electrical activity of brain regions specifically responsible for music processing, areas that focus on pattern recognition and the processing of tone, rhythm, harmony, and words.
The researchers then played a 3-minute clip of the original song, and the EEG proved accurate enough that they could decode the contents of the patients’ brainwaves – and successfully reconstruct the song from the brain activity’s electrical “echoes.” They even released a 22-second sound clip of the reconstructed song.
Brain-Computer Interfaces (BCI) are one of the most promising research areas for the bodily impaired (with specific applications even for the brain-function impaired segment of the population). But any great advancement presents itself as a double-edged sword. As Orwell’s increasingly-relevant 1984 novel shows, Thinkpol (Newspeak for Thought Police) is one area that any authoritarian regime (or person, or company) would love to explore.
There’s also the risk to the fabric of society. Interpersonal relationships happen in the space that lies between the thoughts we have in private (“Jesus, mom, I hate you for making me eat spinach”) and those we choose to release onto the real world (“But mom, I had spinach yesterday!”). But when the frontier between private and public is blurred, it becomes difficult to understand precisely where chips might ultimately fall – and who might be most negatively impacted.
It’s one thing to be able to reconstruct the outside world (and its stimuli) from a person’s brainwaves, but it’s also true that research tends to trickle down to the consumer space (not to mention the private and governmental branches of society). While it isn’t in the same league as the research covered in this article, even gamers have taken advantage of brain waves to control their in-game characters. Just look at what Twitter user @perrykaryal managed to do in From Software’s Elden Ring with a “simple” electroencephalogram (EEG) machine. I couldn’t beat Godrick with my thumbs; she managed to do it just by thinking about it.
The fact that these experiments required physical contact and invasive surgery precludes most of the “bad actor” threats that could emerge from it. But there shouldn’t be any doubt that, given enough time, techniques that don’t require much physical availability will be developed.
I, for one, would love to be able to simply think articles into existence. But questions must be asked whether the benefits of such a technology being generalized outweigh the risks. Then again, most of us are lucky not to have to deal with life-limiting illnesses such as the epilepsy patients that took voluntary part in the study – it’s almost guaranteed they’d have a different outlook on all of this.
The newer Ryzen 5 5600G (Cezanne) has replaced the Ryzen 5 4600G (Renoir) as one of the best CPUs for gaming. However, a trick has breathed new life into the Ryzen 5 4600G, transforming the budget Zen 2 APU into a 16GB graphics card to run AI applications on Linux.
Not everyone has the budget to buy or rent an Nvidia H100 (Hopper) to experiment with AI. With the current demand for AI-focused graphics cards, you may be unable to access one even if you have the money. Luckily, you don’t need an expensive H100, an A100 (Ampere), or one of the best graphics cards for AI. One Redditor demonstrated how a Ryzen 5 4600G retailing for $95 can tackle different AI workloads.
The Ryzen 5 4600G, which came out in 2020, is a hexa-core, 12-thread APU with Zen 2 cores that operate at base and boost clocks of 3.7 GHz and 4.2 GHz, respectively. The 65W chip also wields a Radeon Vega iGPU with seven compute units clocked at up to 1.9 GHz. Remember that APUs don’t have dedicated memory but instead share system memory. You can set how much memory is allocated to the iGPU in the motherboard’s BIOS. In this case, the Redditor had 32GB of DDR4 and allocated 16GB to the Ryzen 5 4600G. Typically, 16GB is the maximum amount of memory you can dedicate to the iGPU. However, some user reports claim that certain ASRock AMD motherboards allow for higher memory allocation, rumored to be up to 64GB.
The trick converts the Ryzen 5 4600G into a 16GB “graphics card,” flaunting more memory than some of Nvidia’s latest GeForce RTX 40-series SKUs, such as the GeForce RTX 4070 or GeForce RTX 4070 Ti, which are limited to 12GB. Logically, the APU doesn’t deliver the same performance as a high-end graphics card, but at least it won’t run out of memory during AI workloads, as 16GB is plenty for non-serious tasks.
AMD’s Radeon Open Compute platform (ROCm) doesn’t officially support Ryzen APUs. Third-party companies, such as BruhnBruhn Holding, offer experimental packages of ROCm that’ll work with APUs. That means APUs can work with PyTorch and TensorFlow frameworks, opening the gate to most AI software. We wonder if AMD’s latest mobile Ryzen chips, like Phoenix that taps into DDR5 memory, can work and what kind of performance they bring.
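As a rough illustration of how such an unsupported setup is typically coaxed into working, here is a minimal sketch. The `HSA_OVERRIDE_GFX_VERSION` override and the "9.0.0" value are what community ROCm guides commonly report for Vega-based APUs; treat them as assumptions, not an officially supported configuration.

```python
# Hypothetical setup sketch: ROCm doesn't officially support Vega iGPUs, so
# community ROCm/PyTorch guides typically override the detected GPU
# architecture before launching AI apps. "9.0.0" (gfx900) is the value
# commonly reported for Vega-based APUs -- an assumption, not a guarantee.
import os

os.environ["HSA_OVERRIDE_GFX_VERSION"] = "9.0.0"

# With a ROCm build of PyTorch installed, the iGPU is then exposed through
# PyTorch's CUDA-compatible API surface:
#   import torch
#   torch.cuda.is_available()   # True once the APU is picked up
print(os.environ["HSA_OVERRIDE_GFX_VERSION"])
```

The override must be set before the framework initializes, which is why guides usually export it in the shell or at the very top of the launch script.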
The Redditor shared a YouTube video claiming that the Ryzen 5 4600G could run a plethora of AI applications, including Stable Diffusion, FastChat, MiniGPT-4, Alpaca-LoRA, Whisper, LLM, and LLaMA. Unfortunately, he only provided demos for Stable Diffusion, an AI image generator based on text input. He doesn’t detail how he got the Ryzen 5 4600G to work with the AI software on his Linux system. The YouTuber has vowed to release a thorough video of the setup process.
As for the performance, the Ryzen 5 4600G only took around one minute and 50 seconds to generate a 512 x 512-pixel image with the default setting of 50 steps. It’s an excellent result for a $95 APU and rivals some high-end processors. The author said he used DDR4 memory but didn’t list the specifications. Although the Ryzen 5 4600G natively supports DDR4-3200, many samples can hit DDR4-4000, so it would be fascinating to see AI performance scaling with faster memory.
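Breaking the reported time down per step makes the result easier to compare against other hardware:

```python
# Back-of-the-envelope check of the reported result: ~1 minute 50 seconds
# for one 512 x 512 image at Stable Diffusion's default 50 steps comes out
# to roughly 2.2 seconds per denoising step.
total_seconds = 60 + 50                   # "around one minute and 50 seconds"
steps = 50                                # default step count
seconds_per_step = total_seconds / steps
print(f"{seconds_per_step:.1f} s/step")   # → 2.2 s/step
```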
The experiment is fantastic for those who own a Ryzen 5 4600G or Ryzen 5 5600G and want to play around with AI. For those who don’t, throwing $500 into an APU build doesn’t make much sense when you can probably get a discrete graphics card that offers better performance. For instance, AMD’s Radeon 16GB graphics cards start at $499, and Nvidia recently launched the GeForce RTX 4060 Ti 16GB, which has a similar starting price.
Prices for gallium are reported to have increased by 50% in just the past seventeen days, following the August 1st enforcement of China’s export restrictions on the rare metal. A vital yet relatively underrepresented material used in chipmaking (especially when reacted into gallium arsenide [GaAs] and gallium nitride [GaN]), gallium production (like that of many other rare metals) lies primarily in China’s hands. The country commands as much as 80% of the world’s output, making it a prime candidate for leverage in the global economic tug-of-war between the U.S. and the Middle Kingdom.
As reported by Bloomberg, the price increase means that gallium now sits at a 10-month-high of $400 per kilogram, putting a deeper squeeze on chipmakers and companies that depend on the material (luckily, in trace amounts) for high-performance semiconductor designs. For perspective, high-purity silicon metal (Si) is currently quoted at an average of $2,000 per metric ton – meaning a kilogram runs chipmakers just $2.
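The gap is easier to appreciate when both materials are put on a per-kilogram basis, using the figures quoted above:

```python
# Per-kilogram comparison of the two quoted prices.
gallium_per_kg = 400.0               # USD, gallium's 10-month high
silicon_per_tonne = 2_000.0          # USD per metric ton, high-purity silicon metal
silicon_per_kg = silicon_per_tonne / 1_000
print(silicon_per_kg)                    # → 2.0 USD/kg
print(gallium_per_kg / silicon_per_kg)   # → 200.0 (gallium is ~200x pricier per kg)
```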
Worldwide, gallium refinement facilities are primarily present throughout China and Japan. A lone facility also exists in Europe, but as expected, that’s not nearly enough for the continent’s needs, let alone as an alternative source for the globalized (yet restricted) market. Therefore, non-producing countries must import either the base metal or already-processed gallium arsenide wafers.
Yet due to China’s imposed restrictions, U.S.-based entities looking to import the metal must submit for registration with China’s Ministry of Commerce. While that is a relatively common requirement, the fact that companies could only apply for a license starting August 1st (the same day the export restrictions went into effect) creates a measure of attrition. Add to that the fact that obtaining the license can take up to 45 days and that China announced the restrictions with barely a month to go for companies to adjust (in early July), and the stage was set for the country’s intended disruption.
Due to its relatively exotic usage in electronics and semiconductors, a squeeze on the supply side of gallium disproportionately impacts the U.S.’s chipmaking capabilities. While China dominates its extraction, the country’s five-generation gap behind the U.S. in leading-edge microelectronics means there aren’t many occasions for its chipmaking players to take advantage of the metal’s properties. At the same time, the export restrictions mean that China can start stockpiling the rare metal for deployment whenever its chip manufacturing processes are up to the task.
Sabrent has released a refined and expanded update to its first Steam Deck Dock (the DS-SD6P) in the shape of the new DS-SDNV, available now on Amazon. Also described as the ‘Sabrent 7-in-1 Steam Deck Dock with M.2 SSD Slot’, this device packs quite a lot of features into a familiar-looking device stand form-factor.
The Sabrent DS-SDNV isn’t restricted to use with the Steam Deck. It is claimed to be equally happy docking with a ROG Ally or other USB Type-C devices with DisplayPort Alternate (DP Alt) Mode support for output through HDMI 2.0 (HDCP 1.4). Linux, Windows, and macOS are supported.
If you have a compatible device/OS, there is quite a lot of convenient expandability offered here. The following ports are all easily accessible on this USB-C dock:
HDMI 2.0
USB 2.0
Two USB 3.2
USB Type-C PD 3.0
Gigabit Ethernet RJ45
(Image credit: Sabrent)
The slower USB port might be ideal for a mouse/keyboard, while the faster ports can be used for data transfer or other high-speed peripheral usage. Having a wired gigabit network port might also be appealing, with typically improved reliability, speed, and latency compared to wireless connectivity.
Moving on to what might be the Sabrent DS-SDNV’s raison d’être – it can fit an M.2 SSD inside. Under the device’s base, users can remove a cover panel and fit in their choice of M.2 drive. It looks like it can accept any size drive up to 80mm in length (M.2 2280), and Sabrent shows it with a roomy 8TB in place.
(Image credit: Sabrent)
The dock’s M.2 slot is equally happy with an M.2 SATA or NVMe drive. Its quoted maximum speed is 5 Gbps, so it sounds like there would be no speed advantage offered by the NVMe option here. If you need more storage and must keep it portable and as fast as possible, you may be better off replacing the Steam Deck’s internal M.2 2230 SSD.
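To put the quoted 5 Gbps in perspective, here is a rough ceiling calculation. It assumes the link behaves like USB 3.2 Gen 1, which uses 8b/10b encoding (10 line bits per data byte); that assumption is ours, since Sabrent only quotes the raw figure.

```python
# Rough payload ceiling for a 5 Gbps link, assuming USB 3.2 Gen 1-style
# 8b/10b encoding (10 bits on the wire per data byte).
line_rate_bps = 5_000_000_000
bytes_per_second = line_rate_bps / 10
print(bytes_per_second / 1_000_000)  # → 500.0 MB/s before protocol overhead
```

At roughly 500 MB/s before protocol overhead, the link sits below even SATA III’s practical ceiling of around 550 MB/s, which is why an NVMe drive offers no speed advantage in this dock.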
Finally, Sabrent has some words about its dock’s build quality. It says it is rugged, yet minimalist to blend in with your décor. Additionally, non-slip anti-scratch pads are present on the device’s base, and within its docking cradle.
The Sabrent 7-in-1 Steam Deck Dock with M.2 SSD Slot DS-SDNV is currently listed at $84 on Amazon, with four in stock. That’s a lot more expensive than the first Sabrent Steam Deck Dock was pitched at (with no M.2 slot). However, it is roughly the same price as the official Valve Steam Deck Dock, which we thought was both uninspiring and a poor value.
The chief executive of SMIC faced criticism for proposing to advance chip packaging technologies and multi-chiplet designs after the company lost access to 7nm and 10nm-capable wafer fab tools due to U.S. sanctions. But his vision has now become central to China’s semiconductor approach for 2023, according to DigiTimes. Industry heavyweights like Huawei and government-backed entities with deep pockets are making substantial progress in this domain. Companies like JCET and Tongfu already offer their clients 2.5D and 3D packaging technologies.
Through the National Natural Science Foundation of China (NSFC), the Chinese government is channeling more funds into chiplet research. NSFC’s 2023 research areas encompass advanced 2.5D/3D packaging techniques, reusable chiplet design methods, parallel processing for multiple chiplets, electronic design automation (EDA) tools, and comprehensive multi-chiplet simulations. This intensified focus on chiplet technology showcases China’s strategy to minimize its dependence on foreign semiconductor innovations.
Large companies are also making strides in chiplet design, packaging, and multi-chiplet technologies. For example, Huawei ramped up chiplet-related patent filings from 30 in 2017 to over 900 in 2022. China-based companies recently formed the China Chiplet League to advance the homegrown Chiplet Interconnect Interface Standard.
Chinese companies are by no means novices in advanced packaging. Yet, for now, most of their efforts are tailored to cater to demand from non-Chinese companies with access to Taiwan-based TSMC and ASE Technology Group, the world’s largest outsourced semiconductor assembly and test (OSAT) company.
JCET, the world’s third-largest OSAT, is involved in the chiplet sector. The company can package chiplets made on a 4nm-class process technology, which is nearly on par with what TSMC can offer and is good enough to cater to domestic and international clientele.
Tongfu, another top OSAT, also has developed a set of 2.5D, 3D, and advanced chiplet packaging technologies. Tongfu reportedly indicated that it anticipates continuing benefits from AMD’s broad adoption of chiplet technology in the future. This suggests Tongfu has aligned its technological advancements with industry trends and foresees potential collaborations or synergies with major players like AMD.
NationalChip is collaborating on high-performance interconnect IP designs for chiplet applications. The company is also researching advanced chip design, including high-bandwidth memory (HBM) technology, with a primary focus on creating tailored products for customers.
As for VeriSilicon, it has multiple verification tools for multi-chiplet designs that are reportedly used by companies serving the high-performance computing (HPC) sector.
AMD has encountered some unfortunate cooler mounting-pressure issues on some of its best GPUs in the past. However, a new report from Igor’s Lab uncovered a similar but much worse problem with AMD’s new Radeon Pro W7600 single-slot graphics card, one that leads to complete blackouts from the GPU.
The symptoms began when Igor tested AMD’s new single-slot W7600 for a review. He found that running the card for less than 6 minutes under Lightwave, Horizon Zero Dawn, or Furmark would cause the GPU to stop producing an image to the monitor altogether — causing the screen to black out. This happened even though the card’s PCB, memory, and GPU temperatures were being reported within their specified limits (albeit at the higher end of those limits).
It turns out there was a serious problem with the thermal pads AMD installed on the Radeon Pro W7600. Igor discovered that the thermal pads covering the four GDDR6 memory modules were too thick and too hard, tilting the single-slot vapor chamber relative to the GPU die and preventing the die from making proper contact with the cooler.
To make matters worse, AMD did not add additional spacers to the W7600 to counter the mounting pressure from the extra-thick memory pads. The W7600’s four GDDR6 memory modules sit right next to the GPU die, on the top and right sides, forcing all the contact pressure from the pads to sit on one side of the card. Spacers are a common practice in the consumer GPU space to ensure that GPU mounting pressure remains sufficient across all die areas.
AMD also did not provide direct contact from the heatsink to the W7600’s mid-plate sandwiched between the PCB and the cooler. Doing this traps heat in and around the memory modules since they and their associated thermal pads are directly connected to the mid-plate.
You’re supposed to make cutouts for the GDDR6 memory modules so that the thermal pads can directly contact the cooler, bypassing the mid-plate altogether. Or, you connect the mid-plate to the cooler with thermal pads or a combination of thermally conductive metals paired with thermal pads.
Igor’s W7600 Thermal Paste Modification For The Mid-Plate (Image credit: Igor’s Lab)
To fix the issue, all Igor did was replace the GDDR6 memory pads with slightly thinner and softer 0.5mm pads to reduce the right and top-most mounting pressure, apply two additional pads on the opposite sides of the GDDR6 memory modules to stabilize the cooler, and use additional thermal paste connecting the heatsink to the mid-plate. As a result, the mid-plate now has direct contact with the heatsink, allowing all the heat generated from the memory ICs to be transferred directly into the heatsink.
It’s surprising to see such mediocre craftsmanship from AMD, especially on a card designed for the workstation market. Usually, workstation cards are held to higher manufacturing standards than gaming cards due to their purpose. But this isn’t the first time we’ve seen cooling problems like this from AMD. A similar issue was also present on AMD’s reference RX 7900 XTX graphics card, which caused GPU hotspot temps to spike as high as 110C even though the GPU core might be sitting as low as 50-60C.
For now, Igor seems to be the only one with this issue, but we do not doubt that more people will have similar issues down the road as the card ages. Hopefully, AMD will rectify the problem quickly with another revision of the Pro W7600 in the near future.
Intel Arc Alchemist GPUs landed about a year ago, and now compete among the best graphics cards. Drivers have been a recurring theme with Arc, both good and bad, but Intel has made strides to improve things and close the performance gap. To that end, Intel has released new drivers specifically aiming to improve its DirectX 11 performance.
We spoke with Intel earlier this week, and the gains for now are limited to specific games. To provide some background detail, after the initial Arc launch, Intel found that many of the base driver elements for their GPUs — integrated and discrete — were suboptimal, particularly for Arc. With integrated graphics, you can basically always assume that the GPU is the bottleneck. Anything that could be done to move work back to the CPU to alleviate that bottleneck was a “good thing.” Obviously, that same logic doesn’t necessarily apply any longer when the GPU becomes ten times faster, as in the case of the Arc A770.
So, since the initial launch, Intel spent a lot of time and effort reworking the drivers. We saw this first in early 2023 when Intel touted gains in DirectX 9 performance. The initial drivers for Arc leveraged Microsoft’s DX9 on DX12 tools, which provided compatibility and got things up and running. There was a lot of performance left on the table, however. We’re told Intel’s driver team worked with the open source DXVK project (DirectX on Vulkan) to optimize the various elements better — something it couldn’t do with the closed-source Microsoft tool. Eventually, things reached the state that all the DX9 support (or most of it?) was transitioned to using the new DXVK-optimized path.
We’re now getting a similar treatment for DirectX 11 games — not via DXVK this time, but through a new DX11 code path that Intel built into its drivers from the ground up. Since the existing DX11 support already works, even if performance isn’t ideal, the focus has been on testing specific games and then “whitelisting” them in the drivers to use the new code path. At present, Intel has whitelisted ten popular DX11 games: Apex Legends, Counter-Strike 2, Destiny 2, DOTA 2, Genshin Impact, GTA Online / GTA V, League of Legends, Middle-Earth: Shadow of War, Overwatch 2, and Valorant.
In short, the latest drivers optimize how the graphics hardware interacts with the DX11 API. By fine-tuning various parameters and implementing more efficient rendering techniques (it didn’t go into any low-level details), Intel improved overall performance on DirectX 11 games. Here are the results of its internal performance testing, comparing its latest drivers (internal versions 4571 and 4642, though version 4644 drivers are now out) with the original launch drivers (3490).
(Image credit: Intel)
Across the ten games (eleven if you want to count GTA Online and GTA V as separate entries), the new DX11 code path boosted performance by anywhere from 5% to 33%. But what’s particularly interesting is that Intel specifically targeted the driver optimizations at “mainstream” CPUs — it tested with a Core i5-13400F and an Arc A750.
Later in the slides, Intel notes that the gains with a top-end Core i9-13900K weren’t as pronounced. The performance increased by only 0% to 28% across the selected games, with an overall average improvement of 12%, compared to the 19% seen with the Core i5 CPU.
We applaud Intel’s efforts and testing, as this makes a lot of sense for potential buyers of the Arc A750. With Arc A750 cards now starting at $199, they’re a great bargain, providing performance roughly equivalent to the RTX 3060 and RX 6700 10GB. Both of those still tend to sell in the $270 range (while supplies last on the 3060), meaning the A750 provides great bang for the buck. At the same time, while we might standardize all of our testing on a Core i9-13900K CPU to eliminate other bottlenecks as much as possible, people building a PC with an A750 aren’t likely to go much above $200 for the CPU, and that’s where the i5-13400F sits.
(Image credit: Intel)
Intel had some other interesting news and bullet points to discuss. The major one is that PresentMon — the foundation of Nvidia FrameView, AMD OCAT, and CapFrameX — is getting an update. PresentMon was created by Intel as an open-source performance monitoring tool to capture frametimes and other metrics related to graphics performance. It has seen many updates over the years, and now Intel is joining AMD and Nvidia in offering a more robust solution.
The new PresentMon beta adds a GUI, making it far more user-friendly than the existing version. Intel has also added a bunch of new features, including a robust overlay, histograms, and more. There’s also a new “GPU Busy” metric, which shows how much time was spent waiting on the GPU versus waiting on the rest of the system. Minimizing this difference has been a key focus of the new drivers.
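As an illustration of what a “GPU Busy”-style metric captures (this is our sketch, not PresentMon’s actual implementation):

```python
# Illustrative sketch: the share of a frame's total time the GPU spent doing
# work, versus waiting on the CPU and the rest of the system.
def gpu_bound_fraction(frame_time_ms: float, gpu_busy_ms: float) -> float:
    """Return the fraction of the frame spent on GPU work (0..1)."""
    return gpu_busy_ms / frame_time_ms

# A 20 ms frame with 18 ms of GPU work is 90% GPU-bound; shrinking the
# remaining 2 ms gap is exactly what the new drivers target.
print(gpu_bound_fraction(20.0, 18.0))  # → 0.9
```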
Intel continues to work on validating additional games with the new DX11 code path, and they’ll roll those into future drivers. Perhaps at some point, the new code will be deemed robust enough that Intel will simply flip a switch, and all games will default to it — and maybe instead have a blacklist where games known to have issues can stick to the older code.
The new drivers and the PresentMon Beta are available for anyone to try. In fact, the DX11 performance improvements discussed here have been present in the past three Intel WHQL driver releases. The full slide deck for Intel’s Q3’23 quarterly driver update is below for your reference.
TSMC has formed an internal ‘One Team’ task force dedicated to timely development, trial production, and mass production of its N2 (2nm-class) process technology, according to reports from CNA and TorrentBusiness (via @DanNystedt). The ‘One Team’ approach suggests a unified and concerted effort, bringing together expertise and resources to streamline the development and implementation of the 2nm process.
Forming a special task force is somewhat out of character, as TSMC typically launches pilot and then mass production of its latest fabrication technologies at a single fab. This time around, according to reports, things will be different. TSMC plans to commence pilot production of its N2 process simultaneously at two fabs: one in Baoshan, near its Hsinchu site, and another in Kaohsiung. The company intends to start pilot production at both locations in 2024 and initiate volume production in 2025.
TSMC is developing N2 fabrication technology at its special R&D center near Hsinchu. Hence, the fabs are close to the research and development facility, making it easier for fab engineers to communicate with teams that developed the manufacturing process. Meanwhile, TSMC is reportedly transferring 300 employees from its Fab 15A and 400 employees from its Fab 15B near Taichung to its fabs to ‘support the 2nm projects.’
TSMC confirmed the establishment of the ‘N2 One Team’ task force but refrained from disclosing detailed information about its structure and specific projects, according to TorrentBusiness. An intriguing detail here is that the website mentioned projects instead of project when discussing TSMC’s N2. The world’s No. 1 foundry is working on multiple 2nm-class projects, including N2, which introduces gate-all-around nanosheet transistors for early 2026; N2P, which adds backside power delivery in early 2027; and N2X, with extended performance due later this decade. Meanwhile, based on the report, N2 is the node headed to both fabs.
Intensified competition in leading-edge process technologies is perhaps one of the key reasons why TSMC is making notable shifts in its strategy and resources. On the one hand, Intel Foundry Services is set to offer its 2nm and 1.8nm-class process technologies to clients about two years ahead of TSMC. On the other hand, Samsung Foundry is getting more competitive in general, which is why it may win some production contracts on proven nodes.
Another potential reason for TSMC assembling a special task force to expedite its 2nm endeavors is the heightened complexity of modern nodes. The cadence for introducing new nodes has expanded from two years, as seen with N7 and N5, to approximately three years with N3. Mass production on TSMC’s N2 node is anticipated to commence in late 2025 or early 2026, marking roughly three years since the initiation of high-volume manufacturing for N3.
Obviously, TSMC wants to streamline its N2 development and implementation as much as possible, so forming a special task force makes sense. Yet, launching pilot production and the HVM at two fabs (albeit adjacent) is also a significant challenge.
There’s a good chance you have enough devices with wireless charging support that a single-device pad or stand just won’t cut it. However, buying a multiple-item wireless charger can be a headache. You not only have to contend with varying levels of support, but different designs as well — the last thing you want is something that won’t fit on your nightstand. While this space can be confusing, there are plenty of options out there that are worth your money. We’ll walk you through what you need to know to find the best buy while shopping for a multi-device wireless charging station, and recommend a few models that belong on your short list.
Future-proofing
It won’t be shocking to hear that your smartphone influences your choice of wireless charger. Only iPhone owners will need to consider Apple Watch compatibility. Likewise, you’ll need an Android phone if you expect to power a Galaxy Watch. Buy an iPhone 12 or newer and you can attach your phone magnetically using MagSafe, while the latest Android phones often have some form of fast wireless charging.
However, it’s not simply a question of getting the fastest charger. You should consider what you might buy in the future. Don’t buy a two-device charger if you have an iPhone and AirPods, but have been eyeing an Apple Watch. And if you think you might switch to an Android cell phone (or vice versa), you’ll want to get something more generic that doesn’t lock you into any one ecosystem.
Some chargers include cradles, trays and other features that are heavily optimized for particular products, and might not even account for year-to-year changes. Some vertical stands are too tall for small phones like the iPhone 13 mini, for instance. While you can never completely guarantee that next year’s phone or watch will work, it’s worth buying something more likely to last.
Having said all this, don’t be afraid to get a charger with vendor-specific features if you’re fiercely loyal to one brand. Apple isn’t expected to ditch MagSafe any time soon, and Samsung will likely keep making Galaxy Watches for a while to come.
Where and how will you use it?
Sebastian Bednarek on Unsplash
Even without a charging cable to worry about, you’re probably buying a multi-device wireless charger with one location in mind. It might sit on your nightstand or on your desk. Not everyone buys a charger just for themselves, though; you might want to use one as a shared station for you and a partner.
If the charger will sit on your nightstand, you’ll likely want a compact, stable unit that won’t swallow all your free space or tumble to the floor (and if it does fall, one with enough durability to survive). You may also prefer a lay-flat phone pad so your screen is less likely to keep you awake. The Apple Watch and some other smartwatches can double as tiny alarm clocks, so you might want a vertical charging option for any wristwear.
At a desk, however, you may want a vertical phone stand so you can check notifications. Will the charger sit on a low table? Horizontal charger pads may make it easier to grab your devices in a hurry. Travel chargers should fold up or otherwise protect the pads while they’re in your bag. And, yes, aesthetics count. You may want something pretty if it’s likely to sit in a posh room where guests will see it.
If it’s a shared charging station, you’ll want something with multiple generic surfaces, and you’ll probably have to forgo charging more than one watch at a time. In those cases, consider the handful of 4-in-1 wireless chargers on the market, or models with USB ports.
Performance
It’s no secret that wireless charging is typically slower than wired, and powering multiple devices adds a new wrinkle. As these chargers often have to support a wide range of hardware, you’ll have to forget about the fastest, device-specific options from brands like Google, OnePlus and Samsung.
That’s not to say these will be slow, but there just isn’t much separating them on the charging speed front. As a general rule, the quickest multi-device chargers tend to top out at 15W for phones. And you’ll need an Apple MagSafe charger if you want to get that full 15W on an iPhone.
It’s rare that you’ll find a truly slow example, mind you. Even some of the most affordable options we’ve seen will recharge your phone at a reasonable 7.5W or 10W, and the 5W for other devices is more than enough. If you’re only docking overnight or while you work, speed won’t make a huge difference. Just be sure that whatever you buy is powerful enough for a phone in a case. It’s also worth noting that fast charging for other devices is rare, although you’ll occasionally find speedier options for the Apple Watch Series 7.
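As a rough illustration of why raw wattage matters little for overnight docking, consider a hypothetical 15 Wh phone battery (both the capacity and the ~75% wireless transfer efficiency are assumed figures for the example, not specs of any particular phone):

```python
# Hypothetical example: time to fill a 15 Wh battery from empty on a 7.5W
# pad, assuming ~75% wireless transfer efficiency (both figures assumed).
battery_wh = 15.0
charger_w = 7.5
efficiency = 0.75
hours = battery_wh / (charger_w * efficiency)
print(round(hours, 2))  # → 2.67 hours -- easily done overnight
```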
Quality, box contents and small details
Melvin Thambi on Unsplash
The difference between a good charger and a great one often boils down to little details. You won’t always need to pay extra to get those, but a larger outlay may be worthwhile to avoid frustrations for years to come.
A textured surface like rubberized plastic or fabric will reduce the chances your expensive gadgets will slide off their charging points. The base should have enough grip and weight that the charger won’t easily shift out of place. Any floating or vertical stands should be sturdy — steer clear if there’s any wobble.
You’ll also want to make a note of what’s included in the box. Some chargers don’t ship with power adapters, and we’ve seen numerous models whose Apple Watch “stands” are merely holders for your existing charging puck.
Then there are helpful touches like status lights for confirming correct placement, although you’ll want to make sure they dim or shut off after a certain amount of time. And while it’s still true that cradles and trays can limit compatibility, you do want your devices to stay where you put them. Shelves and lips can prevent your phone or watch from sliding. Oh, and be wary of floating smartwatch mounts, as heavier timepieces might sag.
Best premium 3-in-1 charger: Belkin BoostCharge Pro 3-in-1 Wireless Charging Pad with MagSafe
It doesn’t get much better than Belkin’s most advanced wireless charger if you’re an Apple devotee. The high-quality BoostCharge Pro 3-in-1 pad offers 15W MagSafe charging for your iPhone, fast charging for the Apple Watch and a space for AirPods Pro or other earbuds with Qi-compatible cases. The base is weighty, covered in rubberized plastic and includes a discreet status light for your earbuds. More importantly, it supports more devices than you might think. Although the Pro pad uses MagSafe, the horizontal layout lets you charge virtually any phone at reduced speeds. We also have to give Belkin kudos for one of the most convenient Apple Watch chargers we’ve seen. It not only works horizontally and vertically, but includes a knob to adjust for different sizes and third-party cases.
This is quite large compared to some 3-in-1 chargers, so it’s not the greatest choice for a nightstand. Consider the smaller footprint of its counterpart, the BoostCharge Pro 3-in-1 Wireless Charger with MagSafe 15W, if you have an iPhone 12 or newer. You also won’t find USB ports, and the indented earbud pad rules out a second phone. Still, it’s easily worth the $150 asking price.
Runner-up 3-in-1: Logitech Powered 3-in-1 Dock
There are many quality high-end chargers to choose from, but Logitech’s Powered 3-in-1 Dock offers a few features that help it rise above. It consumes relatively little space, and the rubberized horizontal and vertical chargers deliver up to 10W while gripping your devices tightly, so both you and your partner can top off.
It has a few limitations, though. The vertical stand isn’t well-suited to the iPhone 13 mini and other small phones. And while the floating stand works with most Apple Watches, heavier ones tend to sag (such as this author’s steel Series 5 with a Leather Link strap) and might not charge properly. If those aren’t issues, though, your $130 will be well-spent.
Best budget 3-in-1 charger: Anker 533 Wireless Charger
You can find plenty of more affordable 3-in-1 chargers. Few, however, offer quite as much for the money as the Anker 533 Wireless Charger 3-in-1 Stand. It offers an adjustable-angle 10W vertical stand and lets you charge an Apple Watch either horizontally or vertically. There’s also a 20W USB-C power adapter in the box, so you won’t have to buy an aftermarket brick (or rely on proprietary cabling) to get started.
The limitations mostly stem from the cost-cutting measures. You probably won’t have room for a second phone. And like some chargers we’ve seen, the Apple Watch mount is a bring-your-own-cable affair that only supports older USB-A connections. The included cable with your Series 8 or SE won’t work here. At $58, though, this 3-in-1 wireless charging stand is a good bargain.
Another good option: Otterbox 3-in-1 Charging Station for MagSafe
If you’re willing to spend a bit more and live in Apple’s universe, the Otterbox 3-in-1 Charging Station for MagSafe is worth your attention. The extremely small footprint of this wireless charger is ideal for nightstands. You can tuck a trio of your phone, earbuds and Apple Watch into an area normally reserved for a single device. The company supplies a surprisingly powerful 36W USB-C power adapter in the box that serves as a fast wired option in a pinch.
The caveats are clear. The floating MagSafe stand rules out Android phones and older iPhones. You’ll need to bring your own Apple Watch cable, and the USB-A port won’t work with the USB-C cables bundled with newer watches. The horizontal-only watch mount also rules out clock functionality. The overall balance of space and utility is still difficult to top for $80.
Best 2-in-1 charger: Mophie Dual Wireless Charging Pad
The 2-in-1 field is highly competitive and makes it difficult to choose an absolute winner. However, Mophie’s Dual Wireless Charging Pad hits many of the right marks. It can charge two devices at up to 10W each, making it a great pick for a two-phone household. The fabric surface with rubberized trim should keep your gadgets steady, and the status lights will confirm accurate placement. There’s even a USB-A port to plug in your watch charger or any other wired hardware.
The complaints are few. You won’t charge at 15W, and we’d rather have USB-C than USB-A. It’s nonetheless a safe choice at $80, and worth buying over less expensive options.
Runner-up 2-in-1: Samsung Super Fast Wireless Charger Duo
Multi-device chargers from phone manufacturers tend to be either compromised or highly proprietary, but Samsung’s Super Fast Wireless Charger Duo (sometimes known as the 15W Duo Fast Wireless Charger) bucks that trend. It’s compact and delivers high-speed charging for one phone and an accessory, whether it’s a Samsung Galaxy Watch or another manufacturer’s earbuds. The status lights will even dim at night, and change color to indicate when your batteries are full.
This won’t help for two-phone households, and Samsung only guarantees 15W charging for some of its own phones (the Galaxy Note 10, Galaxy S20 and later). You’ll also want to be mindful of which version you buy, as there are variants with and without a power adapter in the box. Neither is cheap at respective prices of $90 and $70. This remains an elegant charger for nightstands and travel, though, and the pads are sufficiently device-agnostic.
Best charger for two people: Mophie 4-in-1 Wireless Charging Mat
There are few wireless chargers built with more than one person in mind, but Mophie’s 4-in-1 Wireless Charging Mat is the most well-rounded of the bunch. The pad can handle up to four devices wirelessly at 10W, including two phones and two accessories. There’s also a spare USB-A port for charging earlier Apple Watch models (using the included mount and your own cable) or wired items. A fabric surface, subtle device trays and indicator lights will also take the mysteries out of charging.
This is a giant charger compared to most, and you might find it limiting if your home has more than one Apple Watch or accessories that won’t fit the smaller charging pads. Even so, Mophie is offering considerable value for $150. The 4-in-1 does more than some 3-in-1 chargers at that price, and it doesn’t suffer the compatibility issues of rivals like Nomad’s Base Station Pro.
This week’s best tech deals include the 9th-gen iPad on sale for $250, which ties the lowest price we’ve seen. While the 10.2-inch slate is showing its age design-wise, it’s still a good bargain for those who just need a tablet for the basics and want the most affordable Apple tablet possible. Elsewhere, Sony is still running a rare $50 discount on PlayStation 5, while Amazon’s Fire TV Stick 4K Max is within $2 of its best price to date. We’re also seeing all-time lows on the top picks in our gaming headset and microSD card buying guides, plus Apple’s third-gen AirPods. Here are the best tech deals from this week that you can still get today.
Apple iPad (9th gen)
The 9th-gen Apple iPad is back down to $250 at Amazon, matching its all-time low. You should see the full discount at checkout. Apple sells the 10.2-inch tablet for $329, though we’ve regularly seen it retail closer to $275.
The entry-level slate is certainly getting long in the tooth, as its non-laminated display, thick bezels and Lightning port give it an altogether more dated design than newer iPads. Its 64GB of storage is low, too. At this price, though, the 9th-gen iPad remains one of the better values in the tablet market, with a sturdy aluminum frame, 10 or so hours of battery life and fast-enough performance for casual media consumption. There’s always a chance Apple will introduce new iPads later this year, but if you just want the cheapest route into iPadOS, this model should be enough.
Astro A40 TR
The Astro A40 TR is on sale for $100, which is $30 off its usual street price and ties the lowest price we’ve seen. The A40 TR is the top pick in our guide to the best gaming headsets, as its open-back design gives it a more spacious and enveloping sound than most competitors. It emphasizes the bass, but not to an overwhelming degree, and it’s comfortable to wear for extended periods. That said, the built-in mic is just OK, and like any open-back headphone, the whole thing both leaks and lets in lots of outside noise, so it’s not ideal if you usually play in a noisy room. In general, you can get better value from a pair of “normal” wired headphones than a dedicated gaming headset unless you need a mic. If you really want an all-in-one solution, though, the A40 TR is a worthwhile compromise.
Samsung Pro Plus
The Samsung Pro Plus is the top pick in our microSD card buying guide, and right now its 128GB, 256GB and 512GB models are down to $12, $20 and $35, respectively. Each of those deals matches an all-time low. The Pro Plus technically isn’t the fastest microSD card you can buy, but at this price it’s a fantastic value for a Nintendo Switch, GoPro or Android tablet, as it topped all the cards we tested in sequential write speeds and random read/write performance. It also comes with a 10-year limited warranty.
Sony PlayStation 5
The PlayStation 5 is still on sale for $449 at various retailers, which is a $50 discount. We highlighted this deal when Sony kicked off its latest summer sale a couple of weeks ago, but the company says the promotion is scheduled to end on August 19. Discounts for the PS5 have been exceedingly rare since the console arrived in late 2020, so consider this a last-minute PSA. We gave the device a review score of 87 at launch, though it’s become a much better value proposition over the last three years as Sony has built out its games library.
Sony PlayStation 5 DualSense Controller
In other PS5 deals, the DualSense wireless controller is still on sale for $49 in various colors. Depending on which model you pick, that’s $20 or $25 off. This matches the lowest outright discount we’ve seen for the gamepad, which is also compatible with Steam. Elsewhere, console covers for the PS5 are down to $45 at the PlayStation Direct store. That’s a $10 discount.
PS5 and PS4 exclusive game sale
A number of PlayStation-exclusive games we recommend are discounted as well, including God of War Ragnarök for $49 and Marvel’s Spider-Man: Miles Morales for $20. The thrilling roguelike Returnal and the charming action game Ratchet & Clank: Rift Apart are both down to $29, while the open-world samurai game Ghost of Tsushima: Director’s Cut is available for a buck more. Elsewhere, Death Stranding: Director’s Cut is on sale for $19, while a PS4 copy of Horizon Forbidden West (which includes a free upgrade to the digital PS5 version) is down to $29. We’ve seen all of these deals before, but if you need something new to play, each matches or at least comes within a few dollars of the lowest price we’ve seen.
Anker 622 Magnetic Battery
The Anker 622 Magnetic Battery is back on sale for $40, which isn’t quite an all-time low but still comes in $10 below the device’s typical street price. This portable wireless charger has a slim frame that snaps easily onto the back of a MagSafe-compatible iPhone. It also includes a built-in kickstand for propping your phone up. This deal applies to the “Upgraded Version” of the battery, with a USB-C port on the side; an older variant places that port on the bottom, which is a bit less convenient for pass-through charging. Just note that, like many wireless power packs, the 622 can’t deliver a particularly fast charge (only 7.5W), nor does it have a high capacity (5,000mAh). It can get hot, too. Still, if you want a truly cable-free way to extend an iPhone’s battery on the go, it’s a decent value at this price.
Apple AirPods (3rd gen)
The third-gen Apple AirPods are back down to $140, tying their all-time low. Apple sells the wireless earbuds for $169, though we often see them go for $10 or $20 less elsewhere. This open-back pair has a more balanced sound than most unsealed earbuds, with more bass depth than usual (albeit not a ton). There’s no ANC, as expected, but you still get wireless charging, relatively intuitive touch controls and the usual Apple-friendly features like fast pairing and Find My tracking. Just note that the earpieces are a little large, so they may not fit well with certain ear shapes. This set is also pricey, and like any other open-back pair, it doesn’t isolate much outside noise. Still, if you own an iPhone and hate the feeling of traditional in-ear headphones, it might work. We gave the AirPods a score of 88 in late 2021.
Amazon Fire TV Stick 4K Max
The Amazon Fire TV Stick 4K Max is down to $27, which is $2 more than the lowest price we’ve seen but still roughly $20 below the 4K streamer’s usual street price. This is Amazon’s fastest streaming stick, with support for all the necessary apps and HDR standards, plus Alexa voice controls built into its remote. We generally prefer Roku’s and Google’s respective streaming platforms over Amazon’s Fire OS, as the latter is more aggressive about displaying ads and promoting Amazon’s own content across the UI. But if you just want an affordable device for casual 4K streaming, or if you regularly use Amazon services like Prime Video, this is a fine option.
Amazon Echo Studio
The Amazon Echo Studio is on sale for $160, which is a $40 discount and within $5 of the smart speaker’s all-time low. This is the largest and best-sounding option in Amazon’s Echo lineup. Though we recommend the newer Sonos Era 100 to most people looking for an audio-focused smart speaker, the Echo Studio is still a strong alternative for those who want to save some cash or add a centerpiece to an existing set of Echo devices.
Logitech Litra Glow
The Logitech Litra Glow is back down to $50, which is a deal we’ve seen a few times before but still takes $10 off the device’s usual going rate. The Litra Glow is a USB-powered video light we recommend in our guide to the best game-streaming gear, as we found it to deliver relatively soft and pleasant lighting without harsh edges or shadows. The hardware clips onto the top of a monitor and is easy to rotate or tilt, and you can customize the lighting’s brightness and color temperature through built-in control buttons or Logitech’s companion software. While Logitech markets the device toward content creators, it can also be useful for those who frequently have to take Zoom calls in a room with poor natural lighting.
Instant Pot Duo (3-quart)
If you’ve been thinking about jumping on the Instant Pot bandwagon, the 3-quart Instant Pot Duo is now on sale for $60, or $20 below its typical street price. While that’s not an all-time low, it does match the best price we’ve seen in 2023. We recommend this smaller variant to those who want an electric pressure cooker for individual use or smaller kitchens in our Instant Pot buying guide. It’s one of the more basic options available, but it’s still easy to operate, and it comes with modes for sautéing, slow cooking, steaming and making rice or yogurt, among others.
Samsung Galaxy Z Flip 5
The 256GB Samsung Galaxy Z Flip 5 is down to $900 at Amazon with an on-page coupon, which is a $100 discount for a phone that only went on sale earlier this month. If you shop at Amazon regularly, you can also get the foldable phone with a $150 Amazon gift card, but you’ll have to pay the standard $1,000 MSRP. We gave the Galaxy Z Flip 5 a review score of 88 earlier this month, and we currently list it as the “best foldable for selfies” in our guide to the best smartphones. The big upgrades are a larger 3.4-inch cover display that’s more useful for quickly checking notifications or using apps and a redesigned hinge that lets the device fold flat. You still give up some battery life and camera performance compared to more traditional flagship phones around this price, and like any foldable device, you have to take extra care when handling it. But if the idea of a phone you can fold in half appeals to you, this is the new leader in that market.