Whimsical tech blogger John Graham-Cumming recently wrote about fixing a faulty M.2 SSD using humble workshop tools more typically reserved for woodworking. After a bit of detective work on a crash-prone PC, the blogger discovered one of his Seagate FireCuda 530 SSDs was misbehaving. What got it working reliably again was strategically focused pressure from a G-clamp and a carpentry square, plus a thermal pad.
Graham-Cumming says his PC issues started with an in-game freeze, followed by a boot-to-BIOS loop. By trying various combinations of his two Seagate FireCuda 530 SSDs in his two M.2 slots, he deduced that one of the drives was behind the debilitating problems.
It was subsequently discovered that the faulty FireCuda would work (briefly) when the system was started from cold. This observation pointed towards the dodgy drive having a problem that occurred only when its PCB / chips had reached a specific temperature. Confirmation of a thermal issue came from quickly testing the drive after a spell in the deep freeze. It was usable for a good few minutes before it failed this time.
The discovery that a thermally sensitive SSD component was causing the issues got Graham-Cumming thinking. Sometimes the expansion and contraction of PCB solder joints can cause problems, and it didn't take long to find that pressure applied to a particular chip made the FireCuda work 'reliably.'
(Image credit: John Graham-Cumming)
Above, the images show how the first working G-clamped SSD was put together. The second slide shows some refinement, with a PC thermal pad and carpentry square added to even out the thermals. A thermal camera shot is also in the gallery, showing some portions of the SSD hitting about 90 degrees Celsius in use.
With the woodworking-tool contraption in place, Graham-Cumming successfully shoveled his user documents to another drive for safekeeping.
Of course, not many people would be happy with this bulky SSD / G-clamp / set square arrangement sitting next to their workstation. The blogger felt the same and updated his post to tell readers he’d been inspired to heat the temperamental SK Hynix DDR4 cache chip with a hot air rework station. The result is that “the SSD works without having woodworking tools applied to it” and is OK to put back in the PC, according to Graham-Cumming.
Some would say the Nvidia GeForce RTX 4060 Ti 16GB is the card that Nvidia and its partners don’t want to see reviewed. No add-in board (AIB) partner would send us a card, and Nvidia didn’t sample anyone… so we bought one, at retail, after searching for over a week to find one in stock. Is it one of the best graphics cards? You can probably already guess the answer to that question.
Based on the same Ada Lovelace architecture and with the same core specs as the RTX 4060 Ti Founders Edition, the sole difference is the use of two 2GB memory chips on each memory channel, doubling the capacity to 16GB. More memory should be good in certain workloads, though the 128-bit memory interface remains and will sometimes hold the GPU back.
More critically, tacking on $100 for the extra VRAM represents yet another cynical move from Nvidia. Yes, some people will be willing to pay the price, but Intel’s Arc A770 comes in both 8GB and 16GB variants (albeit with a 256-bit interface), with about a $50 gap in pricing. Put bluntly, Nvidia charges as much as it feels it can get away with, and sometimes more.
There’s no Founders Edition for the 4060 Ti 16GB, which makes the definition of a “reference” card somewhat nebulous. We figure anything available at the base $499 MSRP qualifies, and after looking at the options available at Newegg, Amazon, and elsewhere, we opted for the Gigabyte RTX 4060 Ti 16GB Gaming OC. Other MSRP models include the MSI Ventus 2X and the Zotac Amp (that’s the Across the Spider-Verse bundle, in case that’s a selling point for you).
Supposedly over 20 other 4060 Ti 16GB variants are available from other AIBs, but most are currently out of stock. We figured the triple fans on the Gigabyte card would provide a better overall cooling solution, so let’s hit the speeds and feeds.
Nvidia RTX 4060 Ti 16GB Specifications

| Graphics Card | Gigabyte RTX 4060 Ti 16GB | RTX 4060 Ti 16GB | RTX 4070 | RTX 4060 Ti | RTX 4060 | RTX 3060 Ti | RX 6800 XT | RX 6800 | RX 6750 XT |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Architecture | AD106 | AD106 | AD104 | AD106 | AD107 | GA104 | Navi 21 | Navi 21 | Navi 22 |
| Process Technology | TSMC 4N | TSMC 4N | TSMC 4N | TSMC 4N | TSMC 4N | Samsung 8N | TSMC N7 | TSMC N7 | TSMC N7 |
| Transistors (Billion) | 22.9 | 22.9 | 32 | 22.9 | 18.9 | 17.4 | 26.8 | 26.8 | 17.2 |
| Die size (mm^2) | 187.8 | 187.8 | 294.5 | 187.8 | 158.7 | 392.5 | 519 | 519 | 336 |
| SMs / CUs / Xe-Cores | 34 | 34 | 46 | 34 | 24 | 38 | 72 | 60 | 40 |
| GPU Cores (Shaders) | 4352 | 4352 | 5888 | 4352 | 3072 | 4864 | 4608 | 3840 | 2560 |
| Tensor / AI Cores | 136 | 136 | 184 | 136 | 96 | 152 | N/A | N/A | N/A |
| Ray Tracing "Cores" | 34 | 34 | 46 | 34 | 24 | 38 | 72 | 60 | 40 |
| Boost Clock (MHz) | 2595 | 2535 | 2475 | 2535 | 2460 | 1665 | 2250 | 2105 | 2600 |
| VRAM Speed (Gbps) | 18 | 18 | 21 | 18 | 17 | 14 | 16 | 16 | 18 |
| VRAM (GB) | 16 | 16 | 12 | 8 | 8 | 8 | 16 | 16 | 12 |
| VRAM Bus Width | 128 | 128 | 192 | 128 | 128 | 256 | 256 | 256 | 192 |
| L2 / Infinity Cache (MB) | 32 | 32 | 36 | 32 | 24 | 4 | 128 | 128 | 96 |
| ROPs | 48 | 48 | 64 | 48 | 48 | 80 | 128 | 96 | 64 |
| TMUs | 136 | 136 | 184 | 136 | 96 | 152 | 288 | 240 | 160 |
| TFLOPS FP32 (Boost) | 22.6 | 22.1 | 29.1 | 22.1 | 15.1 | 16.2 | 20.7 | 16.2 | 13.3 |
| TFLOPS FP16 (FP8) | 181 (361) | 177 (353) | 233 (466) | 177 (353) | 121 (242) | 65 (130) | 41.4 | 32.4 | 26.6 |
| Bandwidth (GBps) | 288 | 288 | 504 | 288 | 272 | 448 | 512 | 512 | 432 |
| TDP (watts) | 160 | 160 | 200 | 160 | 115 | 200 | 300 | 250 | 250 |
| Launch Date | Jul 2023 | Jul 2023 | Apr 2023 | May 2023 | Jul 2023 | Dec 2020 | Nov 2020 | Nov 2020 | May 2022 |
| Launch MSRP | $499 | $499 | $599 | $399 | $299 | $399 | $649 | $579 | $549 |
| Online Price | $500 | $500 | $590 | $374 | $300 | $335 | $520 | $440 | $350 |
The RTX 4060 Ti 16GB has the same specs as the 8GB variant, other than VRAM capacity. The Gigabyte model we’re using for this review gets an extra 60 MHz for its boost clock, which in practice usually won’t matter much — the 4060 Ti Founders Edition averaged just under 2.8 GHz across our test suite, while the Gigabyte card was closer to 2.75 GHz. Paper specs aren’t everything, in other words.
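If you want to sanity-check the paper specs, the FP32 and bandwidth figures in the table follow directly from the core counts, clocks, and memory configuration: each shader retires two FP32 operations (one fused multiply-add) per clock, and bandwidth is the bus width times the effective memory data rate. A quick sketch in Python, using values from the table above:

```python
# Peak FP32 throughput = shaders x 2 ops per clock (one FMA) x boost clock.
# Memory bandwidth = bus width (bits) / 8 x effective data rate (Gbps).
# Figures come from the spec table above; real-world clocks will differ.
cards = {
    # name: (shaders, boost MHz, bus width bits, VRAM speed Gbps)
    "Gigabyte RTX 4060 Ti 16GB": (4352, 2595, 128, 18),
    "RTX 4060 Ti":               (4352, 2535, 128, 18),
    "RX 6800 XT":                (4608, 2250, 256, 16),
}

for name, (shaders, boost_mhz, bus_bits, gbps) in cards.items():
    tflops = shaders * 2 * boost_mhz * 1e6 / 1e12
    bandwidth = bus_bits / 8 * gbps          # GB/s
    print(f"{name}: {tflops:.1f} TFLOPS FP32, {bandwidth:.0f} GB/s")
# Gigabyte RTX 4060 Ti 16GB: 22.6 TFLOPS FP32, 288 GB/s
# RTX 4060 Ti: 22.1 TFLOPS FP32, 288 GB/s
# RX 6800 XT: 20.7 TFLOPS FP32, 512 GB/s
```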
As you can imagine, there’s quite a bit of healthy competition for the 4060 Ti 16GB. AMD’s RX 6800 can now be picked up starting at $450, while the RX 6800 XT has frequently been on sale for $500 over the past couple of months — the cheapest price at the time of writing is $520. Previous generation RTX 3070 and RTX 3070 Ti cards also cost less than the 4060 Ti 16GB now — as they should, considering the overall performance. We’ll also toss in some Intel Arc cards for the benchmarks as well, but we’ll get to those in a few pages.
The bump in memory capacity will definitely help, but raw bandwidth remains a potential problem. If you’re playing games that don’t need or use more than 8GB of VRAM, we’d expect similar performance — with a bit of wiggle room since we’re comparing a factory overclocked card to the reference models. 1440p, and especially 4K, could benefit from the extra VRAM, but Nvidia isn’t marketing the RTX 4060 Ti as a 1440p or 4K gaming solution. That’s probably thanks to its lack of compute and bandwidth, even though the RTX 3060 Ti and RTX 3070 both targeted 1440p.
Note also that the 16GB cards, in the same power envelope, may perform slightly worse than the 8GB models. We definitely saw that in some of our benchmarks. It’s not clear precisely how much power the extra memory uses, but it’s more than zero watts, and that could, in some cases, reduce the maximum boost clocks. Or perhaps it’s just the Gigabyte card in particular, but the difference in favor of the 8GB Founders Edition was generally in the low single-digit percentage points and was basically within the margin of error.
RTX 4060 Ti block diagram (Image credit: Nvidia)
Full AD106 block diagram (Image credit: Nvidia)
Here’s the block diagram for the RTX 4060 Ti, along with the full AD106 chip. Nothing changes at the GPU level for the RTX 4060 Ti 16GB: there's still one disabled NVDEC (Nvidia Decoder) block and two disabled SMs (Streaming Multiprocessors). Manufacturing is somewhat more complex, however, as reaching 16GB on a 128-bit bus means GDDR6 chips need to be mounted on both sides of the PCB. That used to be relatively common, but in recent years such cards have usually been professional models or "prosumer" cards like the Titan series.
All the other Ada Lovelace architectural features are present, including the heavily marketed DLSS 3 Frame Generation. If you’re willing to trade latency for a bit more visual smoothness, that’s what it gives you, but the performance charts with DLSS 3 enabled can be rather misleading in our experience. A 50% or larger boost in frames via DLSS 3 doesn’t feel 50% faster — we’d say more like 10–20 percent at best.
Besides gaming, VRAM capacity can also be a factor in AI workloads. Many large language models (LLMs) benefit from lots of memory, and 8GB isn't enough for even "medium" sized models in many cases. I have to wonder if some of the RTX 4060 Ti 16GB scarcity at launch was from AI researchers and companies grabbing it for experimentation just because of its memory capacity. It still feels like a bit of a throwback to 2021, when GPUs were simply sold out at launch, though at least now there are cards priced at MSRP.
Let’s go ahead and move on to the specifics of the Gigabyte RTX 4060 Ti Gaming OC.
Over-ear noise-canceling headphones typically offer the most comprehensive set of features we want for our listening pleasure. The best of these wireless options combine stellar sound quality with powerful active noise cancellation (ANC) and other handy tools to create as complete a package as possible. Of course, some companies do this better than others. For Cunghoctin’s best wireless headphones guide, we tested out a number of different models with a variety of features, including noise cancellation, customization options and sound quality. Plus, our favorites span a range of prices so you can decide how much you’re comfortable spending and, ultimately, get the best buy for you.
Best headphones for 2023
What to look for
When it comes to shopping for headphones, the first thing you’ll need to decide on is wear style. Do you prefer on-ear or over-ear headphones? For the purposes of our buyer’s guide, we focus on the over-ear style as that’s what most noise-canceling headphones are nowadays. Sure, you can find on-ear models with ANC, but over-ear designs are much more effective at blocking sound. Speaking of noise cancellation, you’ll want to determine early on if you even want that. If you frequently listen to music in noisy environments, you’ll want to not only make sure it’s there, but also make sure it’s good. If you plan to use your new headphones in quieter spaces, skipping ANC can save you some money.
The next area to consider is features. We recommend trying to get the most bang for your buck, but as you’re shopping around you should determine which items are must-haves and what you can live without. And don’t take basic things like automatic pausing and Bluetooth multipoint connectivity for granted, as not all companies include them. We also suggest reading reviews to see how well a company’s more advanced features work. This will help you decide if those are something you’re willing to (likely) pay extra for. Pay close attention to battery life estimates and don’t be easily swayed by lofty promises about call quality.
Sound can be subjective, so we recommend trying before you buy if at all possible. We understand this isn't easy at a time when we're doing most of our shopping online. But trying on a set of headphones and listening to them for a few minutes can save you from an expensive case of buyer's remorse. We also recommend paying attention to things like Spatial Audio, Dolby Atmos, 360 Reality Audio and other immersive formats. Not all headphones support them, so you'll want to make sure a prospective pair does if that sort of thing excites you.
How we test wireless headphones
The primary way we test headphones is to wear them as much as possible. We prefer to do this over a one- to two-week period, but sometimes embargoes don't allow it. During this time, we listen to a mix of music and podcasts, while also using the headphones to take both voice and video calls. Since battery life for headphones can be 30 hours or more, we drain the battery with looping music and the volume set at a comfortable level (usually around 75 percent). Due to the longer battery estimates, we'll typically power the headphones off several times and leave them during a review. This simulates real-world use and keeps us from having to constantly monitor the process for over 24 straight hours.
To judge audio quality, we listen to a range of genres, noting any differences in the sound profile across the styles. We also test at both low and high volumes to check for consistency in the tuning. To assess call quality, we’ll record audio samples with the headphones’ microphones as well as have third parties call us.
When it comes to features, we do a thorough review of companion apps, testing each feature as we work through the software. Any holdovers from previous models are double checked for improvements or regression. If the headphones we’re testing are an updated version of a previous model, we’ll spend time getting reacquainted with the older set. Ditto for the closest competition for each new set of headphones that we review.
Best headphones overall: Sony WH-1000XM5
Sony’s 1000X line has been our top pick for best wireless headphone for a long time now. Until another company can pack in as many high-quality features as Sony, and do so with a mix of excellent sound quality and effective ANC, the crown is safe. With the WH-1000XM5, Sony redesigned its flagship headphones, making them way more comfortable to wear for long periods of time. We also noticed in our tests that the company made noticeable improvements to the active noise cancellation, thanks to a separate V1 chip in addition to the QN1 that was inside the M4. There are now eight total ANC mics as well – the previous model only had four. This all combines to better block background noise and high frequencies, including human voices.
The 1000XM5 still has all of the features that typically make Sony’s top-of-the-line headphones showstoppers. That includes 30-hour battery life and crisp, clear sound with balanced tuning and punchy bass. A combo of touch controls and physical buttons give you on-board access to music, calls and noise modes without reaching for your phone. Speak-to-Chat automatically pauses audio when you begin talking, and like previous Sony headphones, the M5 can change noise modes based on your activity or location. Plus, this model offers better call quality than most of the competition. The only real downside is that they’re $50 more than the WH-1000XM4 at full price ($400).
Noise cancellation: Yes
Multipoint: Yes
Battery life: 30 hours
Weight: 0.55 pounds
Water resistance: None
Read our full review of Sony WH-1000XM5 headphones
Runner up: Bowers and Wilkins Px7 S2
I'll admit I didn't expect Bowers & Wilkins to make the year's best headphones list, or even be in contention for a spot. However, the company's revised Px7 headphones impressed me during my review. The Px7 S2 are pricey at $399, but Bowers & Wilkins pairs impressive sound quality with solid ANC performance. In fact, the Px7 S2 are my favorite headphones right now in terms of sound. There's also a more refined design that doesn't look overly plasticky, and the headphones fit comfortably even after hours of use. Call quality, ambient sound and automatic pausing aren't the best here, but they get the job done. At the end of the day, the design, sound quality and excellent noise cancellation make the Px7 S2 a strong pick in the current wireless headphone field.
Noise cancellation: Yes
Multipoint: Yes
Battery life: 30 hours
Weight: 0.67 pounds
Water resistance: None
Read our Full Review of Bowers & Wilkins Px7 S2 Headphones
Best headphones for those on a budget: Audio-Technica ATH-M20xBT
Audio-Technica has introduced affordable wireless headphones in the past, and while they didn't offer active noise cancellation, they're still worth considering. The company's latest is the M20xBT, a Bluetooth version of A-T's popular M20x wired cans. For just $79, you can expect a comfy fit and up to 60 hours of battery life. Bluetooth multipoint connectivity allows you to connect to multiple devices at once, and physical buttons provide reliable on-board control. The design isn't as refined as the company's pricier models like the M50xBT2, but you get the bulk of what makes Audio-Technica's cheaper options so good.
Noise cancellation: No
Multipoint: Yes
Battery life: 60 hours
Weight: 1.42 pounds
Water resistance: None
Another solid option: Bose QuietComfort 45
The Bose 700 was one of our top wireless Bluetooth headphones last time around, but the company recently revived a workhorse with the QuietComfort 45. The design is mostly unchanged from the previous QC models, which could be a deal breaker for some. Once you get past that though, the QC45 combines Bose’s excellent active noise cancellation with clear and balanced audio. You can expect up to 24 hours of battery life on a single charge and a comfortable fit that doesn’t get tiresome during long listening sessions. We’ve already seen them on sale for $50 less than full price, which makes the QuietComfort 45 even more compelling.
Noise cancellation: Yes
Multipoint: No
Battery life: 24 hours
Weight: 0.96 pounds
Water resistance: None
Read our Full Review of Bose QuietComfort 45 Headphones
Another solid option: Technics EAH-A800
Back at CES 2022, Panasonic announced the EAH-A800: a new set of active noise canceling headphones under the iconic Technics brand. While most of the features are what you see on any number of wireless headphones, one figure stood out. The company says you can expect up to 50 hours of battery life on the A800, and that’s with active noise cancellation enabled. These are currently in my stable of review units for detailed analysis, but I have already tested them on a long flight. The ANC is impressive and they’re comfortable enough to avoid becoming a burden after several hours. Sound quality is also quite good (there’s LDAC support, too) and there are enough features here to justify the premium price tag.
Noise cancellation: Yes
Multipoint: Yes
Battery life: 50 hours
Weight: 0.65 pounds
Water resistance: None
Another solid option: Master and Dynamic MW75
While Master & Dynamic is known for its design prowess, the company’s over-ear headphones were due for a refresh. With the MW75 that debuted in June, the company opted for a look that takes cues from its MG20 gaming headset and mixes them with a combo of aluminum, leather and tempered glass. The company’s trademark sound quality returns with multiple ANC modes and ambient sound options for a range of situations. At $599, the high-end looks don’t come cheap, but if you’re looking for something beyond the pure plastic fashion of most headphones, M&D has you covered.
Noise cancellation: Yes
Multipoint: Yes
Battery life: 28 hours
Weight: 0.75 pounds
Water resistance: None
Another solid option: Sennheiser Momentum 4
I’ll be honest, I had a hard time choosing between the Px7 S2 and the Momentum 4 wireless headphones for the runner-up spot this time around. However, Bowers & Wilkins gets the edge in terms of design even though the Px7 S2 and the Momentum 4 are very evenly matched on great sound quality. They’re the two best-sounding sets of Bluetooth headphones I’ve tested this year – and it’s not even close. Sennheiser does have an impressive 60-hour battery life in its favor and improved ANC performance. Those two items alone might be enough for you to overlook the very generic design.
Noise cancellation: Yes
Multipoint: No
Battery life: 60 hours
Weight: 0.65 pounds
Water resistance: None
Read our Full Review of Sennheiser Momentum 4 Headphones
FAQs
How can you tell the quality of headphones?
I typically look at three factors: design, sound quality and features. In terms of design, I'm usually looking to see if the build quality of the headphones feels cheap and plasticky. Plenty of companies use plastic, but they can do so in a way that doesn't look or feel like budget models. For sound quality, I want to hear a nice, even tuning where highs, mids and lows are all well represented. No overly boomy bass or scooped-out mids. I also want good clarity where you can pick up fine details and an open, immersive soundstage. Features are typically a distant third, but if a company doesn't cover basic functionality (automatic pausing, transparency mode, multipoint Bluetooth, etc.), it can be an indication of overall quality.
How do I choose the best quality headphones?
“Best” can be pretty subjective, but I always recommend going to a place where you can listen to the headphones you’re thinking about buying before you commit. Sometimes this isn’t possible, so you’ll want to check return policies. I also recommend doing some research to determine what your priorities are in a new set. Are you an audiophile who wants the best sound quality? Is powerful active noise cancellation (ANC) the most important? Would you rather have conveniences like automatic pausing?
Which brand has the best headphones?
Sony consistently tops our list with its 1000X line. This is mostly due to the combination of sound quality, ANC performance and the truckload of features these headphones pack in. I’ll be the first to tell you that there are better sounding options and other companies, like Bose, offer more effective noise cancellation. But when you add everything up, no one comes close to the full slate of tools Sony puts in its premium headphone line.
Do expensive headphones sound better?
Exorbitant price tags don't mean better audio quality. Bowers & Wilkins' headphones are on the high end for wireless noise-canceling models, and they sound amazing. Meanwhile, Audio-Technica's M50xBT2 is much more affordable and doesn't have ANC, but those headphones have a warm, natural sound profile that I find very inviting. At the end of the day, it will come down to personal preference, but you don't need to spend a lot to find great headphones.
More is usually better, and boosting your Steam Deck up to 2TB of fast internal storage is now easier than ever. You may not have heard of Addlink, but it's an established SSD brand that often offers better pricing for the same hardware you'll find from better-known names. The S91 is one of those drives you can pick up to upgrade your Steam Deck, ROG Ally, or any other system requiring an M.2 2230 SSD, given the right tools and a good guide. With this much capacity and NVMe performance on the go, what's not to like?
The 2TB Addlink S91’s performance is relatively good, particularly when run at the Steam Deck’s PCIe 3.0 speed, and the drive is quite efficient. Addlink backs it with a five-year warranty, although overall support may not be as good as you’ll find with the Sabrent Rocket Q4 2230. That’s okay, as the S91 is currently a bit less expensive. At 1TB, the S91 has stiffer competition, but you can save some money if you’re willing to go with the S91’s QLC flash over a TLC contender. The S91 uses TLC for the 512GB model, but a drive like the Inland TN446 is a better bet.
We think a QLC drive like the Addlink S91 or Rocket Q4 2230 makes sense at 2TB in this form factor. Just be aware that write performance may tank after the initial installation process, and leaving some space free is a good idea to improve performance. At 1TB, it is harder to recommend the S91, but it could be a budget option if money is tight. Let’s see how it stacks up.
Specifications
| Product | 512GB | 1TB | 2TB |
| --- | --- | --- | --- |
| Pricing | $52.88 | $84.88 | $199.88 |
| Form Factor | M.2 2230 | M.2 2230 | M.2 2230 |
| Interface / Protocol | PCIe 4.0 x4 / NVMe 1.4 | PCIe 4.0 x4 / NVMe 1.4 | PCIe 4.0 x4 / NVMe 1.4 |
| Controller | Phison E21T | Phison E21T | Phison E21T |
| DRAM | N/A (HMB) | N/A (HMB) | N/A (HMB) |
| Memory | Kioxia 112-Layer TLC (BiCS5) | Micron 176-Layer QLC | Micron 176-Layer QLC |
| Sequential Read | 3,500 MB/s | 4,900 MB/s | 5,000 MB/s |
| Sequential Write | 2,300 MB/s | 3,200 MB/s | 3,200 MB/s |
| Random Read (IOPS) | 400K | 570K | 480K |
| Random Write (IOPS) | 550K | 750K | 750K |
| Security | N/A | N/A | N/A |
| Endurance (TBW) | 300TB | 250TB | 450TB |
| Part Number | ad512GBS91M2P | ad1TBS91M2P | ad2TBS91M2P |
| Warranty | 5-Year | 5-Year | 5-Year |
The Addlink S91 is available in 512GB, 1TB, and 2TB capacities. It's rated for up to 5,000 / 3,200 MB/s of sequential read and write throughput and up to 570K / 750K random read and write IOPS. All capacities are warrantied for five years and can absorb 300TB, 250TB, and 450TB of writes, respectively. Pricing decreased while this review was being written, to $52.88, $84.88, and $199.88, respectively.
This pricing is somewhat competitive, but the 512GB Inland TN446 at $49.99 is probably the better deal. At 1TB, the S91 saves you a little money if you want to go with QLC over TLC. At 2TB, it’s one of the less expensive QLC options.
Software and Accessories
Addlink does not provide cloning/imaging software with this SSD, but a downloadable SSD toolbox is available on Addlink’s site. We recommend using free, third-party applications for cloning and imaging.
A Closer Look
(Image credit: Tom's Hardware)
The Addlink S91 is your typical, DRAM-less M.2 2230 SSD. It has an SSD controller, a single NAND package, and a PMIC. It’s single-sided to ensure it will fit in embedded devices.
(Image credit: Tom's Hardware)
The flash, labeled ICCVG96AZA, is Micron’s 176-Layer QLC (N48R). This is the same flash on the Sabrent Rocket Q4 2230. Packing 2TB into a single package requires 16 1Tb dies (HDP), which is the typical limit.
At 512GB, the S91 deviates from other M.2 2230 SSDs we've tested: it uses Kioxia's 112-Layer TLC (BiCS5) instead of QLC. Given the lower-than-expected performance specifications, I had expected either QLC or BiCS5 TLC at this capacity, and when I asked, it was confirmed to use BiCS5. This is the same flash found on the Inland TN436 and WD SN740. It may be slower due to less internal interleaving, and it may also be less efficient because it lacks CMOS under Array (CuA) and comes from an older flash generation. This doesn't apply at 1TB or 2TB, where the S91 uses Micron's QLC. It's also why the TN446, which has 176-Layer Micron TLC, is probably the better option at 512GB.
Fisker has shed some more light on its Alaska electric pickup, which it says will have a relatively low base price of $45,400. The Alaska is a work-friendly vehicle, letting you run your business from the cockpit. It has dedicated work glove and cowboy hat storage, a slide-out laptop tray and a cup holder big enough to hold a day’s worth of water.
The default flatbed is 4.5 feet, but you can drop the partition to increase that to 7.5 feet. Lower the seats and the liftgate and you can push it to 9.2 feet, big enough to haul several sheets of plywood from one job to the next. But much as Fisker may promise this will be one of the lightest and cheapest EVs in its class, we’ll wait to see how much it actually costs when it debuts in 2025 before making a judgment.
– Dan Cooper
You can get these reports delivered daily, direct to your inbox.
The biggest stories you might have missed
The best wireless earbuds for 2023
The best budget laptops for 2023
Amazon sale slashes Fire TV streaming devices by up to 51 percent
Samsung’s revamped Freestyle projector is now available to pre-order
Call of Duty: Modern Warfare III will include the series’ biggest zombies map ever
Homeworld: Deserts of Kharak will be free on Epic Games Store this month
Alan Wake II, delayed by 10 days, will arrive on October 27
How to take a screenshot on a Windows PC
It's a Swiss Army knife of a microphone for audio pros on a budget.
Photo by James Trew / Cunghoctin
Professional microphones are as unique as the instruments they’re built to record, each with their own voices. The Sphere LX is a $1,000 modeling microphone designed to alter its qualities to ape the voices of several extremely expensive studio microphones. James Trew explores what it’s like to use this chameleonic device, comparing it to several of the pro microphones it’s trying to impersonate. I may not find the technical intricacies of audio engineering that gripping, but James’ in-depth report is a must-read, even for me.
Farewell, old friend.
After nearly two decades of faithful service, the Xbox 360 store will close for good on July 29, 2024. Microsoft’s Movies & TV app will stop working on the same day as the company pulls the last vestiges of support for its console. The company has already promised games compatible with newer consoles will stay on the Xbox One and Series X/S storefronts. And media bought via the Xbox 360 will stay in your library, so you shouldn’t lose too much of anything.
It’s got detachable controllers!
Windows Report
With the Legion Go, Lenovo may have its own rival to the Steam Deck, Ayaneo and ASUS’ ROG Ally. A leak, including product renders, suggests it’s a PC gaming handheld equipped with AMD’s new Phoenix processors and a pair of Switch-like detachable controllers. It looks very possible to prop this thing on a table, addressing the issues of hand fatigue so common with other PC-class handhelds. Just a shame it won’t be able to play Tears of the Kingdom.
If those range claims are accurate, it’s pretty compelling.
MullenLowe
The Acura ZDX is the latest all-electric vehicle from Honda’s premium brand, due to launch in early 2024. The ZDX boasts CarPlay, Android Auto, a Bang & Olufsen audio setup and an as-yet unofficial range of 325 miles on a single charge. The base model is likely to cost around $60,000, and it’s certainly a pretty-looking way to get around.
Only a few months after they first came out, the Beats Studio Buds + are down to an all-time low price. The company’s latest noise-canceling headphones are 24 percent off at Woot, dropping from $170 to $130. While you shop, it’s important to keep in mind that, though Amazon owns Woot, it has a different return policy.
We gave the Beats Studio Buds + an 84 in our review when they launched. A few of the new features impressed us, but the price increase from $150 to $170 seemed a bit steep for the product — something this deal more than makes up for. Updates rolled out with the Beats Studio Buds + included 16 percent more battery life, microphones that are three times larger and acoustic vents added to the front and side. As a whole, the sound quality and noise canceling are both better than the predecessor's. Plus, the headphones' control button has moved to avoid accidentally pressing it while adjusting their fit (a big problem plaguing the originals).
At the same time, a few things are lacking from the Beats Studio Buds +, such as automatic pausing, wireless charging and a sound that — while improved — doesn’t measure up to competitors like AirPods. But, if you want solid headphones for a decent price, these are certainly a good option. The markdown will be available on Woot for the next four days or until they sell out of their stock.
Even the best graphics cards are prone to crashes, and it doesn't matter whether it's an Nvidia or AMD graphics card. To that end, AMD has launched a helpful tool called Radeon GPU Detective (RGD) to aid developers in diagnosing crashes with Radeon graphics cards.
Radeon GPU Detective salvages and analyzes the crash dumps to generate a report to help you troubleshoot the issue. The detailed report provides vital information, including the page fault details, resource details, and even execution markers that indicate the graphics card’s workload before the crash.
Version 1.0 of Radeon GPU Detective can pinpoint graphics card crashes on the Windows operating system, such as TDR (Timeout Detection and Recovery) errors, in Direct3D 12 applications. Older APIs, like DirectX 11 or DirectX 9, and other APIs, such as Vulkan, aren't on the support list. Currently, Radeon GPU Detective only supports the Radeon RX 7000 (RDNA 3) and Radeon RX 6000 (RDNA 2) series, and it's uncertain if AMD will extend compatibility to older Radeon graphics cards. The only requirement to use Radeon GPU Detective is that the system must have the latest Adrenalin 23.7.2 driver installed. Ironically, the Adrenalin 23.7.2 package has its fair share of bugs.
TDR is a Windows feature that resets the graphics card when it doesn't respond within an established time period. It's useful because it brings your system back to a usable state without forcing a full restart. If you own a discrete graphics card, you've probably experienced it a few times: when TDR errors occur, you'll get a warning message saying the display driver stopped responding and has recovered, or something along those lines. TDR errors are tricky because there's a long list of possible causes, everything from a corrupted file to a faulty graphics card. Hopefully, AMD's new Radeon GPU Detective tool can help developers debug TDR bugs faster than before.
One of the cool features of Radeon GPU Detective is that it allows developers to put the driver into what AMD calls "Crash Analysis" mode via the Radeon Developer Panel (RDP) before reproducing the crash. The tool then produces an analysis file in text format. Alternatively, you can configure it to output JSON for automated processing.
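For teams wiring this into automated triage, the JSON output is the natural hook. The snippet below is only a sketch: the field names used here (page_fault, execution_markers, and so on) are hypothetical stand-ins rather than AMD's documented schema, so adapt it to whatever the real RGD JSON actually contains.

```python
import json

# Hypothetical field names -- RGD's real JSON schema may differ;
# treat this as a template for automated triage, not a drop-in parser.
with open("crash_analysis.json") as f:
    report = json.load(f)

fault = report.get("page_fault", {})
print("Faulting VA:", fault.get("virtual_address"))
print("Associated resource:", fault.get("resource_name"))

# Execution markers indicate what the GPU was working on before the crash.
for marker in report.get("execution_markers", []):
    print(f"{marker.get('status', '?'):>12}  {marker.get('name', 'unnamed')}")
```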
The Radeon GPU Detective tool is part of the Radeon Developer Tool Suite (RDTS) and is available for public download. The code is also open source, so you can dig into it in the Radeon GPU Detective repository on GitHub.
A new article published in the science journal Nature aims to be the last word for theories regarding LK-99 as a superconductor. Penned by science reporter Dan Garisto, the article is a post-mortem of sorts on the scientific research surrounding LK-99 and the replication efforts that are attempting to separate hype from fact. But Science does as Science does, and different people looking at the same information routinely reach differing (but not necessarily opposite) conclusions.
The article runs through the accumulated evidence presented for and against LK-99 being (or not being) the room-temperature, ambient-pressure superconductor that would usher humanity into an unrecognizable (and extremely energy-efficient) future. The debate keeps circling around the same issues: condensed-matter researchers are dealing with quantum effects, an area where research is still racing ahead and troves of knowledge have yet to be processed into scientific consensus, and that only throws an additional wrench into the already tool-laden, insufficiently clear recipe posted in the original Korean paper.
The rabbit hole scientists have been following around LK-99 pertains to copper sulfide (Cu2S) impurities. The specificity of the temperature at which the Korean authors detected a tenfold drop in resistivity (from 0.02 ohm-centimeters to 0.002 ohm-cm) seems to have been the definitive thread. Prashant Jain, a chemist at the University of Illinois Urbana–Champaign, said that detail was what most caught his eye. The thing is, Jain had seen that specific temperature before: it's the temperature at which copper sulfide (one of the impurities that results from the LK-99 synthesis process) undergoes a phase transition. Below that temperature, Cu2S's resistivity drops sharply, in a way that's almost identical to the transition towards superconductivity the original authors attributed to LK-99.
Jianlin Luo, a physicist with the Chinese Academy of Sciences (CAS), and his team performed two experiments aimed at clarifying the role of copper sulfide. The second sample out of those experiments saw its resistivity dive near 112 degrees Celsius (385 Kelvin), a match for the Korean team's observations.
But the documentation penned by the original paper's authors (led by Lee Suk-bae) is only part of the problem: scientists currently know of no way to properly guide the synthesis process so as to increase the number of lead atoms that end up being replaced by copper atoms (note, not copper sulfide) within LK-99 itself. In an extremely simplified manner, that substitution is the mechanism the Korean authors credited for the emergent room-temperature, ambient-pressure superconductivity in their sample. As unclear and disappointing as that might be, it's one of the factors that has to be taken into account when looking into LK-99. It's the scientific equivalent of the salt we're used to sprinkling on leaks and unconfirmed reports in our hardware world.
On the theoretical front, where simulations had been used to gauge whether LK-99's structure was conducive to superconductive behavior, new research from a US-European group added precision X-ray imaging of LK-99 samples. Their observations led them to conclude that, despite those initial papers and their promising (if not definitive) outlook, LK-99's flat bands (through which electrons could supposedly zip losslessly) weren't conducive to superconductivity after all.
More recently, a team at the Max Planck Institute for Solid State Research in Stuttgart, Germany reported they had synthesized pure single crystals of LK-99. Using a technique termed "floating zone crystal growth," the researchers managed to grow LK-99 crystals free of the copper sulfide impurities. The resulting pure LK-99 (with the formula Pb8.8Cu1.2P6O25) behaved in line with other studies and replication attempts: like an insulator, not a superconductor. These pure, purple samples also showed ferromagnetism (expected from iron impurities that couldn't be fully eliminated) as well as diamagnetism. That led them to conclude that, when separated from its impurities, LK-99 isn't a superconductor, period.
While the title of the DOI-infused Nature piece unapologetically reads "LK-99 isn't a superconductor," the first sentence in the article's body leaves room for the possibility: "Researchers seem to have solved the puzzle of LK-99." (Emphasis ours.) Nature, apparently, isn't beyond punchy headlines, but in science, there's always more studying to be done. The full article is worth a read, if only to go over all of the evidence involved in the saga.
And perhaps that’s for the better. Owing to the gaps in the original paper’s data and the difficulty in replicating LK-99, there are still holdouts in the scientific community that don’t think the LK-99 saga is over yet.
It’s always cool to see Raspberry Pi projects that recycle old hardware, and this project, created by maker and developer Max Van Leeuwen, is a fine example of how you can not only restore a broken device with a single-board computer, but also add new features it couldn’t have had before. He’s used a couple of Raspberry Pis to transform a non-functioning Polaroid camera into one that instantly develops pictures to a remote digital picture frame.
The Pi has turned the old analog camera into a digital camera that works with the Internet of Things. It communicates with the digital picture frame, which is also fitted with a Pi, over Wi-Fi. As long as the two devices are connected to the internet, the camera can send pictures to the remote frame as soon as they're taken.
(Image credit: Max Van Leeuwen)
The frame was created to be a gift for Van Leeuwen’s grandmother, so he paid extra attention to ensure the end results were well finished. The frame features an E ink screen so it will retain the last image even if power is lost. This also adds a little bit of an old school effect thanks to the grainy image quality.
The hardware used in this project consists of an old Polaroid camera that's been fitted with a Raspberry Pi 3A+. The Pi inside has a battery for portability as well as a camera module for capturing images. The frame also uses a 3A+ and features a 7-color E Ink panel with a resolution of 600 x 448 pixels.
Van Leeuwen was kind enough to make the project open source, so all of the code used in its development is available on GitHub to explore. It's primarily driven by some custom Python scripts covering the image-capturing side of the Polaroid camera as well as the "developing" features of the digital frame.
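Van Leeuwen's actual scripts are on GitHub, but the camera-side loop is conceptually simple: capture a frame when the shutter is pressed, then push it to the frame over the network. Here's a rough, hypothetical sketch of that idea using the Picamera2 library and a plain HTTP upload; the button pin, endpoint URL, and upload mechanism are assumptions for illustration, not his implementation.

```python
import requests
from gpiozero import Button
from picamera2 import Picamera2

# Hypothetical wiring and endpoint -- not Van Leeuwen's actual setup.
SHUTTER_PIN = 17
FRAME_URL = "http://picture-frame.local:8000/upload"

camera = Picamera2()
camera.start()
shutter = Button(SHUTTER_PIN)

while True:
    shutter.wait_for_press()               # block until the shutter is pressed
    camera.capture_file("/tmp/shot.jpg")   # grab a still from the camera module
    with open("/tmp/shot.jpg", "rb") as photo:
        # The frame's own Pi would receive this and redraw its E Ink panel.
        requests.post(FRAME_URL, files={"image": photo})
```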
If you want to read more about this Raspberry Pi project, check out the full project breakdown over at Van Leeuwen’s blog. There he details the construction process in greater detail and shows it in action. Be sure to follow him for more cool projects as well as any future updates on this one.
Automatic1111’s Stable Diffusion WebUI now works with Intel GPU hardware, thanks to the integration of Intel’s OpenVINO toolkit that takes AI models and optimizes them to run on Intel hardware. We’ve re-tested the latest release of Stable Diffusion to see how much faster Intel’s GPUs are compared to our previous results, with gains of 40 to 55 percent.
Stable Diffusion is a deep-learning AI model used to generate images from text descriptions (our Stable Diffusion benchmarks article currently has our previous testing, though we're working on updating the results). What makes Stable Diffusion special is its ability to run on local consumer hardware. The AI community has plenty of projects out there, with Stable Diffusion WebUI being the most popular. It provides a browser interface that's easy to use and experiment with.
After months of work in the background (we’ve been hearing rumblings of this for a while now), the latest updates are now available for Intel Arc owners and provide a substantial boost to performance.
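The A1111 fork handles the OpenVINO plumbing for you, but if you just want to see the general idea of exporting a Stable Diffusion model to OpenVINO and running it on Intel hardware, Hugging Face's optimum-intel package exposes a similar pipeline. A minimal sketch, assuming that package is installed and that its OVStableDiffusionPipeline API matches current documentation:

```python
# pip install optimum[openvino]  (assumed; check the optimum-intel docs)
from optimum.intel import OVStableDiffusionPipeline

# Export the PyTorch weights to OpenVINO IR on first load, then run inference.
pipe = OVStableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    export=True,
)
pipe.to("GPU")  # target an Intel GPU via the OpenVINO GPU plugin (API assumed); omit to stay on CPU

image = pipe("messy room", num_inference_steps=30).images[0]
image.save("messy_room.png")
```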
Check out the Stable Diffusion A1111 webui for Intel Silicon. Works with my A770 or can run on your CPU or iGPU. It's powered by OpenVINO, so its optimized.😃Example of image on the right, pure prompting. Left is same image with increase detail on the eyes using InPainting.… pic.twitter.com/zpbQOMvJF3 (August 17, 2023)
Here are the results of our previous and updated testing of Stable Diffusion. We used a slightly tweaked Stable Diffusion OpenVINO for our previous testing, and have retested with the fork of Automatic1111 webui with OpenVINO. We also retested several of AMD’s GPUs with a newer build of Nod.ai’s Shark-based Stable Diffusion. The Nvidia results haven’t been updated, though we’ll look at retesting with the latest version in the near future (and update the main Stable Diffusion benchmarks article when we’re finished).
We should note that we also changed our prompt, which makes the new results generally more demanding. (The new prompt is “messy room,” which tends to have a lot of tiny details in the images that require more effort for the AI to generate.) There’s variation between runs, and there are caveats that apply specifically to Arc right now, but here are the before/after results.
(Image credit: Future)
(Image credit: Future)
The Intel Arc and AMD GPUs all show improved performance, with most delivering significant gains. The Arc A770 16GB (Limited Edition, which is now discontinued) improved by 54%, while the A750 improved by 40% in the same scenario.
Nod.ai hasn’t been sitting still either. AMD’s RX 6800, RX 6750 XT, and RX 6700 10GB are all faster, with the 6800 and 6700 10GB in particular showing large gains. We’re not sure why the 6750 XT didn’t do as well, but the RX 6800 saw a performance boost of 34% and the RX 6700 10GB saw an even greater 76% performance improvement. The RX 6750 XT for some reason only saw a measly 9% increase, even though all three AMD GPUs share the same RDNA2 architecture. (We’ll be retesting other GPUs, including AMD’s newer RX 7000-series parts, in the near future.)
Again, we did not retest the three Nvidia RTX 40-series GPUs, which is why the performance statistics remain identical between the two graphs. Even so, with the new OpenVINO optimizations, Intel’s Arc A750 and A770 are now able to outperform the RTX 4060, and the A770 16GB is close behind the RTX 4060 Ti.
There’s still plenty of ongoing work, including making the installation more straightforward, and fixes so that other image resolutions and Stable Diffusion models work. We had to rely on the “v1-5-pruned-emaonly.safetensors” default model, as the newer “v2-1_512-ema-pruned.safetensors” and “v2-1_768-ema-pruned.safetensors” failed to generate meaningful output. Also, 768×768 generation currently fails on Arc GPUs — we could do up to 720×720, but 744×744 ended up switching to CPU-based generation. We’re told a fix for the 768×768 support should be coming relatively soon, though, so Arc users should keep an eye out for that update.
Even budget WiFi 6E routers are almost always more expensive than those that only support the WiFi 6 protocol (after all, you're getting tri-band rather than dual-band), but you can still find one for less than $200 that offers usable performance.
TP-Link’s Archer AXE75 fits into the budget WiFi 6E router category, costing $193 at press time. For that price, you get a mixed bag: solid 5-GHz throughput, acceptable 2.4-GHz performance and 6-GHz performance that’s no faster than the 5-GHz band. It’s a usable product, but nothing to beam home about.
Design of TP-Link Archer AXE75
WiFi 6E routers seem to run larger than most. The TP-Link Archer AXE75 measures 10.7″ × 5.8″ × 1.9″ (not counting its six foldable antennas). It's roughly the size—and shape—of a small tray.
The top of the router almost completely consists of venting—in a pattern reminiscent of parquet. The exception is a raised glossy strip approximately three inches wide going across almost a third of the device at a diagonal. This bump doesn’t seem to serve much of any purpose—even aesthetically; it appears haphazard.
(Image credit: Tom’s Hardware)
Other aspects of the physical design also appear questionable. There is a USB 3.0 port (which we're pleased to see), but instead of being on the back of the device with all the other ports and important bits, it's on the side. Indeed, it took us longer than we would like to admit to realize that there even was a USB port on the device.
(Image credit: Tom’s Hardware)
Again, though, there are nice features, even beyond the USB port. The back of the device features an LED on/off button (a nice touch for those who tire of unnecessary bright lights on electronics), a WPS button, and a WiFi on/off button.
(Image credit: Tom’s Hardware)
Beyond that, the back of the device has four Ethernet ports, a WAN port, and a power button.
As with most routers these days, the TP-Link Archer AXE75 can be set up via a downloadable smartphone app—“Tether”—or a web interface. We could only get the web interface to work over WiFi, oddly enough. Theoretically, we see no reason why an Ethernet setup should fail, but we found that we could not get our Ethernet client connected/recognized until after we had already successfully set up the router.
During the setup process, along with the standard steps of setting login information, the user is prompted to set a two-hour timespan for the router to download and install automatic updates. (We selected 3am to 5am.)
Speaking of which, we were also prompted to update the router’s firmware. During the updating and rebooting process (which lasted a bit longer than we would have thought), the screen popped up repeated “operation failed” notices—which we cannot account for. In the end, however, we were told that the firmware updated successfully.
Features of TP-Link Archer AXE75
We like options when it comes to router features. Unfortunately, the TP-Link Archer AXE75 doesn’t give a lot of choice; many of its features are only accessible via the Tether smartphone app. This includes Alexa integration, QoS, parental controls, network-protection tooling, security and performance scans, and reporting. While we are pleased that these features are offered, we are disappointed for both flexibility and data-privacy reasons that they can’t be accessed via the web.
Even when you download the app, you do not get complete feature access. The router and app only come with basic functionalities pursuant to TP-Link’s “HomeShield Basic” plan. For more features, you’ll have to subscribe to HomeShield Pro for either a monthly or annual fee; in exchange, you’ll get added features like additional parental controls (like enhanced time-limit features), security filters, and DDoS protection.
(Image credit: Tom’s Hardware)
Speaking of mobile devices, like some other routers, the Archer AXE75 is compatible with TP-Link’s OneMesh suite for creating a mesh network.
(Image credit: Tom’s Hardware)
Not all of the TP-Link Archer AXE75’s features are app-only, however. Many other basics remain available, like OFDMA, port forwarding, port triggering, creation of guest networks (which can be encrypted or unencrypted, as you desire), firewalls, VPN setup/management, router-settings backup and recovery, IPTV and multicast, and integration with Apple Time Machine.
There is also some access-control functionality even in the absence of the smartphone app—allowing you to block blacklisted devices or allow only whitelisted devices.
(Image credit: Tom’s Hardware)
To help ensure that you can take better advantage of the 6 GHz band, you can also use the preferred scanning channel (PSC) setting to keep higher-connectivity channels reserved for 6 GHz.
Performance of TP-Link Archer AXE75
The TP-Link Archer AXE75 claims to offer maximums of 574 Mbps on the 2.4 GHz band and 2402 Mbps on each of the 5 GHz and the 6 GHz bands. The fine print on the packaging makes clear, however, that these are but theoretical maximums based on the IEEE Standard 802.11 specifications. We tested accordingly and found largely mediocre speeds.
We conducted tests repeatedly throughout the course of two weekdays in a single-family house with a 1,200-Mbps connection, using a laptop with a Realtek 8852CE network adapter as the client and another PC, attached via Ethernet, as the server to receive traffic. We used iperf to test throughput and ping to test latency (a rough sketch of this kind of run follows the list below). Four sets of tests were conducted for each band:
Near uncongested: Testing laptop approximately 7 feet away from the router, no substantial traffic being carried across other devices
Far uncongested: Testing laptop approximately 25 feet away from the router, no substantial traffic being carried across other devices
Near congested: Testing laptop approximately 7 feet away from the router; videos streaming on four devices throughout the house
Far congested: Testing laptop approximately 25 feet away from the router; videos streaming on four devices throughout the house
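For anyone wanting to replicate this kind of testing at home, the setup is straightforward: run an iperf3 server on the wired PC and point the wireless laptop at it, then ping the same machine for latency. A rough sketch of one such run, wrapped in Python for repeatability; the server address is a placeholder, and the flags shown are just the basic iperf3/ping invocations rather than our exact test commands:

```python
import subprocess

# Placeholder address for the wired PC acting as the iperf3 server
# (started separately with `iperf3 -s`). Flags here are illustrative.
SERVER = "192.168.0.10"

def throughput_test(seconds: int = 30) -> None:
    """Measure TCP throughput from the Wi-Fi client to the wired server."""
    subprocess.run(["iperf3", "-c", SERVER, "-t", str(seconds)], check=True)

def latency_test(count: int = 50) -> None:
    """Ping the same server to record round-trip latency."""
    # Note: on Windows, ping takes -n instead of -c.
    subprocess.run(["ping", "-c", str(count), SERVER], check=True)

if __name__ == "__main__":
    throughput_test()
    latency_test()
```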
Here are the results we recorded from our testing:
(12 benchmark charts; image credit: Tom's Hardware)
We found it extremely curious that, for the most part, traffic-congested performance was usually better than or comparable to uncongested performance under similar circumstances. Absent luck or some kind of odd technology that improves the more taxed it is, we suspect this may be a matter of unequal distribution of signal strength. (We estimate a roughly 25 to 30 degree difference in angle between our "near" testing spot and our "far" testing spot.) On that note, it's worth mentioning the 3% packet loss we experienced in our near-uncongested test on the 6 GHz band.
It should also be noted that mean ping rates were affected, in some cases, by sudden wild swings in latency—sometimes by as much as nearly half a second. This was primarily the case, however, with our "far" tests, both congested and uncongested.
On the 5-GHz band, which will be your main method of connecting WiFi 6 and WiFi 5 devices, the Archer’s performance was the middle of the pack, pulling in just behind the Tenda RX27 Pro in throughput on both congested and uncongested tests. Latency was solid without congestion but near the bottom of the barrel under congested conditions.
At 6-GHz, the band you can only get on WiFi 6E devices, the Archer AXE75 provided mediocre throughput and latency that was about the same as its 5-GHz speeds. So the best use of this band on this router is to reserve it for your one flagship device, perhaps your gaming PC, to make sure it has no competition for signal.
Except in the far-congested testing, performance on the 2.4 GHz band was decent. At the same time, if you’re shelling out the money for a WiFi 6E router right now, you’re probably less concerned with the 2.4 GHz band.
Bottom Line
For an entry into WiFi 6E, a roughly $200 price tag isn’t bad. But a question lingers: Why would you bother entering—at least, via this particular router?
Throughput performance on every band – 6-GHz, 5-GHz and 2.4-GHz – is solidly middle of the pack. However, latency is pretty high relative to the competition under most scenarios.
Moreover, many of the TP-Link Archer AXE75’s features are inaccessible by web interface—and, in some cases, inaccessible without a paid subscription. Many (we daresay most) of these same features are accessible both for free and via the user’s choice of web or app on competitors.
The Tenda RX27 Pro offers strong performance, unlocked features and costs about $80 less at press time, making it a far better choice overall. MSI’s RadiX has far better performance and features but costs $50 to $60 more. The Archer AXE75 might be worth considering if it was on a massive sale, but even then, there are better budget WiFi 6E routers that would still likely cost less.
Organizations transitioning cold data storage from hard disks to tape could save a lot of money and cut carbon emissions, according to IEEE Spectrum's analysis of various bodies of research. The report also cites statements and guidance provided by Brad Johns, an IBM data storage business veteran and data storage consultant.
Research by IDC (PDF, 2019) indicates that 60% of all data would be a good candidate for magnetic tape storage – in other words, it would usually be classified as cold data. However, only an estimated 15% of all data is kept on tape. Two key advantages of tape, compared to HDDs, are its much longer lifespan and much lower energy consumption.
The need for data storage is increasing rapidly, with organizations and people commonly generating more, and bigger, files every year. However, research indicates that the files stored by governments, academia, and businesses, as well as busy YouTubers, TikTokers, and Instagrammers, are not necessarily stored in the most efficient way.
We like to look in awe at ultra-modern new-fangled tech solutions for geeky thrills, but sometimes the oldies are goodies. A case in point is in storage technology. We have recently covered breakthroughs in DNA storage, Nanofiche, glass, 5D data cubes and others, but HDDs remain, and optical disks keep re-appearing. Now the idea that magnetic tape should be more prevalent is being pushed forward. However, it looks like HDDs aren’t going extinct despite some companies’ wishful thinking, and tape storage is rolling along quite confidently in the 2020s.
(Image credit: Shutterstock)
According to the IEEE report’s sources, HDDs have a working life of about five years, and produce about 2.55 kg of CO2 per terabyte per year. Compare those figures to the same metrics for tape storage: tape is claimed to have a 30 year lifespan, and produce just 0.07 kg of CO2 per terabyte per year.
The lifespan and CO2 figures show a wide gulf, which organizations can use to their advantage, but it doesn’t end there. Device costs and eWaste statistics also help make the case for cold storage tape systems. In the IEEE report an example is put forward where a data center needs to store 100 petabytes of data for 10 years. Using HDDs would result in 7.4 metric tons of eWaste, but tape storage tech could cut that by more than half, to 3.6 metric tons of eWaste.
Some total cost of ownership (TCO) figures are also put forward to help make the argument for cold storage on tape. Johns calculates that in the 100 PB example, the HDD based data center would have a TCO of $17.7 million over the decade. Meanwhile, the tape storage-based facility could reduce that to about $9.5 million. Money talks.
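To put the per-terabyte emissions figures into the same 100 PB, 10-year frame as the eWaste and TCO numbers, a quick back-of-the-envelope calculation (using only the figures cited above; it ignores drive replacement cycles and real-world overheads) looks like this:

```python
# CO2 figures cited above: ~2.55 kg/TB/year for HDD, ~0.07 kg/TB/year for tape.
CAPACITY_TB = 100_000   # 100 PB expressed in terabytes
YEARS = 10

HDD_KG_PER_TB_YEAR = 2.55
TAPE_KG_PER_TB_YEAR = 0.07

hdd_tonnes = CAPACITY_TB * YEARS * HDD_KG_PER_TB_YEAR / 1000
tape_tonnes = CAPACITY_TB * YEARS * TAPE_KG_PER_TB_YEAR / 1000

print(f"HDD:  {hdd_tonnes:,.0f} metric tons of CO2 over {YEARS} years")
print(f"Tape: {tape_tonnes:,.0f} metric tons of CO2 over {YEARS} years")
# HDD:  2,550 metric tons of CO2 over 10 years
# Tape: 70 metric tons of CO2 over 10 years
```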
Companies haven’t been blind to the enduring economic appeal of tape, the source report suggests. However, bigger organizations with more resources have been better able to make the transition, because correctly classifying and sorting cold data takes “time, money, and effort.”
With momentum seemingly behind its adoption for cold storage, and obvious advantages over HDDs for this data storage model, tape is set to retain a compelling advantage “probably for the next decade,” reckons Johns. That is, unless one of the aforementioned exciting storage breakthroughs ever makes it out of R&D labs.
It seems like 2023 is the year gaming companies have decided to get serious about keyboard customizability — physical customizability, that is. The Razer BlackWidow V4 75% may not look like anything out of the ordinary (in fact, it looks almost exactly like a smaller version of the BlackWidow V4 Pro), but this is actually Razer’s first hot-swappable gaming keyboard.
The BlackWidow V4 75% is a wired mechanical gaming keyboard with a compact 75-percent layout and a detachable, padded leatherette wrist rest. For the keyboard enthusiasts it has an aluminum top plate, a hot-swappable PCB, and a gasket-mounted design with two layers of sound-dampening foam, and for the gamers it has N-key rollover, polling rates of up to 8,000 Hz, and bright, customizable per-key RGB (with side underglow).
The BlackWidow V4 75% is available now in black, with Razer’s orange tactile switches, for $189.99. A white version will be released in mid-September, and will cost an extra $10 ($199.99). Razer is also selling standalone 36-packs of its mechanical switches (orange/tactile, green/clicky, yellow/linear) for $24.99 each, but the keyboard will only come with orange switches installed.
Design and Construction of the BlackWidow V4 75%
(Image credit: Tom’s Hardware)
At a glance, the BlackWidow V4 75% looks like a smaller version of the BlackWidow V4 Pro — wrist rest and all. It doesn’t have the V4 Pro’s triple-side underglow, nor does its underglow extend to the wrist rest when connected, but otherwise the new V4 75% shares a very similar overall design, housed in a sturdy black ABS plastic chassis with a matte black aluminum alloy top plate, and machined metal media keys/roller.
Image 1 of 4
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
The BlackWidow V4 75% has a compact 75-percent layout, which is slightly smaller than a TKL layout, with a single column of four navigation keys (versus the TKL’s three-column, two-row cluster of six). While many 75-percent keyboards leave some space between the main keys, arrow keys, and navigation keys, the V4 75% does not: the navigation and arrow keys sit right next to the main keys. This shaves a few millimeters off the keyboard’s overall length: the BlackWidow V4 75% measures 12.6 inches (321mm) long, versus the Asus ROG Azoth’s 12.83 inches (326mm). It’s a bit wider than most keyboards, at 6.1 inches (155.5mm), thanks to a slope at the bottom that allows the wrist rest to nestle up against it.
Image 1 of 4
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
It’s sturdily constructed but not overly heavy, weighing 1.8lbs (815g): not nearly as heavy as the Azoth (2.61lbs / 1186g), but a little heavier than the SteelSeries Apex Pro TKL Wireless (1.65lbs / 747g). On the bottom, you’ll find four non-slip rubber grips and two sets of flip-out feet, which raise the back of the keyboard by an extra 6 or 9 degrees.
Razer BlackWidow V4 Pro (Image credit: Tom’s Hardware)
The BlackWidow V4 75% comes with Razer’s doubleshot ABS keycaps, which are full-height, lightly textured, and have shine-through primary legends and printed (white) secondary legends. While I had no issues with the keycaps in my testing, I’ve found that Razer’s ABS keycaps tend to start showing wear within the first few months of use.
I’ve been using the BlackWidow V4 Pro intermittently since it launched six months ago, and several of the keycaps are already shiny from wear. While shininess tends to come pretty quickly on ABS keycaps, it seems to come even more quickly on Razer’s.
Image 1 of 2
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
In the upper right corner, the BlackWidow V4 75% features a volume roller and two media keys, all made of machined metal. Interestingly, the right media key, which has a mute symbol etched into it, can be reprogrammed using Razer’s Synapse 3 software. But the left media key, which has a generic circle etched into it, cannot be reprogrammed — this media key is set to play/pause on a single tap, skip to the next track on a double-tap, and go back to the previous track on a triple-tap.
Image 1 of 3
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
The BlackWidow V4 75% comes with a detachable magnetic leatherette wrist rest, which is padded and has a woven-textured surface. The wrist rest looks like a smaller version of the one that comes with the BlackWidow V4 Pro, but simpler — it doesn’t have a connection point to transfer the keyboard’s underglow, because the V4 75% doesn’t have full underglow (it has underglow on either side, but not along the bottom).
(Image credit: Tom’s Hardware)
Also in the box: a 6.5-foot (2m) detachable USB-C to USB-A cable, and a combination keycap/switch puller.
Specs
Switches: Razer Orange (Tactile)
Lighting: Per-key RGB, underglow (sides)
Onboard Storage: Yes (5 profiles)
Media Keys: Yes, volume roller
Game Mode: Yes
Connectivity: USB-C
Additional Ports: None
Keycaps: Doubleshot ABS
Construction: Aluminum top plate
Software: Synapse 3
Dimensions (LxWxH): 12.6 x 6.1 x 0.94 inches / 321 x 155.5 x 24mm
Weight: 1.8lbs / 815g
MSRP / Price at Time of Review: $189.99 / $189.99
Release Date: Aug. 17, 2023
Typing and Gaming Experience on the BlackWidow V4 75%
The BlackWidow V4 75% is a wired 75-percent gaming keyboard with N-key rollover, a polling rate of up to 8,000 Hz, and Razer’s third-gen orange tactile mechanical switches. In a bid to appeal to the keyboard enthusiast side of gamers, the V4 75% also features a tape-enhanced hot-swappable PCB (which accepts both 3- and 5-pin switches) and a gasket-mounted design with two layers of dampening foam.
Image 1 of 3
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
The BlackWidow V4 75% comes with Razer’s third-gen orange tactile mechanical switches, which have an actuation force of 50g, an actuation point of 2mm, and a full travel distance of 3.5mm. These switches have a mild tactile bump but are relatively quiet otherwise, and feel similar to Cherry MX Brown switches — albeit slightly heavier, as the MX Browns have an actuation force of 45g. Razer’s switches are rated for 100 million keystrokes and have a box-style cross stem that’s compatible with most third-party keycaps.
Typing on the BlackWidow V4 75% felt and sounded better than expected — especially if you’re coming from a mainstream gaming keyboard (such as any of Razer’s previous keyboards). The tactile switches combined with the gasket-mounted FR4 plate made for a springy-but-quiet typing experience, even if Razer’s orange switches (and all of Razer’s switches) are slightly stiffer than I prefer. The V4 75% and the V4 Pro sound vastly different — and the V4 Pro doesn’t sound terrible, but you can absolutely hear the tape-modding and sound-dampening foam at work in the V4 75%.
(Image credit: Tom’s Hardware)
That said, is it the best-sounding keyboard I’ve used recently? Not exactly. I definitely noticed some stabilizer rattle — not in the spacebar, so much, but in the enter and backspace keys — despite the V4 75%’s factory-lubed, plate-mounted stabilizers. But that’s pretty nit-picky; overall, the V4 75% sounds better than 99% of mainstream gaming keyboards on the market.
As for gaming, the BlackWidow V4 75% performed — as expected — very well, with no latency or lag, though the orange switches’ tactile bump and slightly-heavier actuation force may start to fatigue your fingers if you’re used to smooth, lightweight linear optical switches.
The keyboard has a default polling rate of 1,000 Hz, but you can bump this up to 8,000 Hz in Razer’s Synapse 3 software. This is a wired keyboard so there’s no concern about a higher polling rate eating up battery life, but it does use more processing power and Razer warns that an 8,000 Hz polling rate “may result in reduced frame rate when playing CPU bound games.” I didn’t notice any reduced frame rates in my testing, and I’ve mostly been playing Baldur’s Gate 3, which is CPU-heavy.
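As a rough sketch of why that warning exists (this is simple arithmetic, not a measurement of real CPU load), the snippet below shows how much more often the host has to service the keyboard at 8,000 Hz than at the default 1,000 Hz:

```python
# Illustrative arithmetic only: the host must process one USB report per poll,
# so an 8x higher polling rate means 8x more work per second on the PC side.
for polling_hz in (1_000, 8_000):
    interval_ms = 1_000 / polling_hz      # time between reports
    reports_per_second = polling_hz       # one report per poll
    print(f"{polling_hz:>5} Hz -> a report every {interval_ms:.3f} ms "
          f"({reports_per_second:,} reports per second)")
# 1000 Hz -> a report every 1.000 ms (1,000 reports per second)
# 8000 Hz -> a report every 0.125 ms (8,000 reports per second)
```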
Features and Software of the BlackWidow V4 75%
The BlackWidow V4 75% works well out of the box, but can be configured using Razer’s Synapse 3 software. While anyone who knows me knows that I hate Synapse 3, I didn’t hate it quite as much this time — probably because there’s not quite as much to customize on this keyboard.
Image 1 of 2
(Image credit: Tom’s Hardware)
(Image credit: Tom’s Hardware)
You can use Synapse 3 to remap keys (two layers, thanks to Razer’s “HyperShift” duplication tech), adjust the polling rate, and configure the keyboard’s per-key RGB and side underglow, either with preset quick effects or using Razer’s unnecessarily advanced Chroma Studio. As I mentioned earlier, the media keys are sort of programmable — you can program the roller (up/down) and the mute key (single press only), but you cannot program the play/pause key.
It’s a little strange that the mute key can’t be programmed with at least as many tap layers as the play/pause key has built in, but perhaps Razer will update that in one of its future hourly Synapse 3 updates.
The Bottom Line
The BlackWidow V4 75% is a pretty solid offering for Razer’s first hot-swappable keyboard, but it’s far from perfect. I’m a little disappointed that it looks so… boring, considering it’s Razer’s foray into a new category; the chunky black-on-black-on-black just isn’t doing it for me. And while it is fairly customizable (physically) once you get your hands on it, it’s disappointing that it isn’t offered with different switch or keycap options (Razer doesn’t make its PBT keycaps in a 75-percent layout and doesn’t plan to at the moment), and that the white version costs more (I know it’s only $10, but still).
But for $189.99, it’s one of the better-sounding gaming keyboards you’ll find. I’m personally a bigger fan of the sound of the Asus ROG Strix Scope II 96 Wireless ($179.99), but if you’re looking for something smaller than an almost-full-size keyboard, the BlackWidow V4 75% is a good choice.
MORE: Best Gaming Keyboards
MORE: How to Pick Keycaps for Your Mechanical Keyboard
Capture card manufacturer Magewell has unveiled two highly capable new 4K capture cards sporting a highly unusual M.2 form factor. The new cards are known as the Eco Capture HDMI 4K Plus M.2 and Eco Capture 12G SDI 4K Plus M.2. As the names suggest, one model is designed to work with HDMI connectors, while the other works with SDI, a video connector used by the professional video production industry.
M.2 capture cards are not something you often hear about, but the ultra-compact form factor has many advantages. One of the biggest is integration with newer motherboards that sacrifice most of their smaller PCIe x8, x4, and x1 slots for M.2 slots. In these cases, an M.2 capture card instead of a traditional half-height or full-height PCIe card can be really effective, especially if a system is already using the remaining one or two standard PCIe slots for graphics, audio, and/or Ethernet cards.
Another use is with Mini-ITX motherboards, which have even fewer PCIe slots than ATX and micro-ATX motherboards. In cases where the sole PCIe x16 slot is already occupied, M.2 is the only way to add additional PCIe devices to the system. Many Mini-ITX boards also have more than one M.2 slot, which lets users add one of these cards without sacrificing M.2 storage.
With these capture cards, streamers, video enthusiasts, and professionals can re-route all their video encoding and video processing to the capture card. This offloads work from the CPU and GPU, freeing up resources and improving image quality in some cases (depending on how slow or old the CPU or GPU is). Most people will find the built-in encoders found in the best GPUs and best CPUs to be more than adequate. But, a dedicated capture card can still be beneficial for highly-demanding setups that require more processing power than what a built-in CPU/GPU encoder can provide.
The M.2 cards themselves come in a 22 x 80 mm form factor, similar to that of 2280-sized M.2 SSDs. Both cards come with a green PCB and feature a large black cooling solution on top, with a very tiny fan actively cooling the encoding chip underneath. Since the M.2 standard does not feature any external ports, the cards need to be used with a special adapter that connects the card to a full-sized HDMI or SDI connector.
According to Magewell, these two new cards offer double the frame rate of their previous versions, capturing at up to 60 FPS at resolutions of up to 4096×2160 (i.e., 4K). Both cards are compatible with Windows and Linux and support native video APIs such as DirectShow, DirectKS, V4L2, and ALSA. They also support high-quality upscaling, downscaling, cross-scaling, and color space conversion.
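Because the cards present themselves through standard OS capture APIs such as V4L2, grabbing frames shouldn’t require a vendor SDK. Purely as a sketch (we haven’t tested these cards, and the device index, resolution, and frame rate below are assumptions for illustration), a generic OpenCV capture loop on Linux might look like this:

```python
# Hypothetical example: reading frames from a V4L2 capture device with OpenCV.
# Device index 0 and the requested 4096x2160 @ 60 FPS mode are assumptions;
# the real index and supported modes depend on the system and the card.
import cv2

cap = cv2.VideoCapture(0, cv2.CAP_V4L2)      # open the first V4L2 device
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 4096)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 2160)
cap.set(cv2.CAP_PROP_FPS, 60)

ok, frame = cap.read()                       # grab a single frame
if ok:
    print("Captured frame with shape:", frame.shape)  # e.g. (2160, 4096, 3)
else:
    print("No frame received; check the device index and cabling")
cap.release()
```

On Windows, the same cards would be reached through DirectShow instead, so cv2.CAP_DSHOW would be the rough equivalent of the V4L2 preference used here.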
Pricing has not been unveiled just yet, but the Eco Capture HDMI 4K Plus M.2 is reportedly now shipping, and the Eco Capture 12G SDI 4K Plus M.2 will be available in the next two months.
Samsung Electronics is poised to employ a double-stack architecture for its 9th Generation 3D NAND memory when production starts next year, according to a DigiTimes report that cites Seoul Economic Daily. This sets Samsung apart from SK Hynix, which will use three stacks of NAND to build its 321-layer 3D NAND devices when they enter mass production in the first half of 2025.
Samsung’s 9th Generation V-NAND with over 300 layers is set to hinge on the double-stack technique, which Samsung first embraced in 2020 with its 7th-generation 176-layer 3D NAND chips. This method involves producing one 3D NAND stack on a 300 mm wafer and then building another stack on top of the first one. Samsung’s 300-layer 3D NAND will increase the storage density achievable on a single wafer, enabling makers to build lower-cost SSDs or make the best SSDs cheaper.
By contrast, rival SK Hynix has revealed its intent to kick off production of its 321-layer 3D NAND in 2025 using a triple-stack approach. This procedure, distinct from Samsung’s, involves creating three separate sets of 3D NAND layers, which increases the number of process steps and the amount of raw materials used, but is meant to maximize yields, since 3D NAND stacks with fewer layers are easier to produce.
Industry speculation based on leaked roadmaps suggests that, after its 9th Generation 3D NAND, Samsung might adopt a triple-stack methodology for its 10th-generation 430-layer 3D NAND. Some experts told Seoul Economic Daily that surpassing 400 layers in 3D NAND would necessitate the use of three separate stacks, possibly due to yield concerns. Meanwhile, this will naturally increase the usage of raw materials and raise the cost per 3D NAND wafer.
Samsung’s long-term vision, as presented at the Samsung Tech Day 2022 last October, aspires to reach up to 1,000 layers by 2030.