Science Journal Says LK-99 Superconductor Dream is Over

A new article published in the science journal Nature aims to be the last word on theories of LK-99 as a superconductor. Penned by science reporter Dan Garisto, the article is a post-mortem of sorts on the scientific research surrounding LK-99 and the replication efforts attempting to separate hype from fact. But Science does as Science does, and different people looking at the same information routinely reach differing (but not necessarily opposite) conclusions.

The article runs through the accumulated evidence for and against LK-99 being the room-temperature, ambient-pressure superconductor that would usher humanity into an unrecognizable (and extremely energy-efficient) future. The debate keeps circling the same issues: condensed-matter researchers are dealing with quantum effects (an area where research is still racing ahead, with troves of knowledge yet to be processed into settled science), which only throws an additional wrench into the already tool-laden, insufficiently clear recipe posted in the original Korean paper.

The rabbit hole scientists have been following around LK-99 pertains to copper sulfide (Cu2S) impurities. The specific temperature at which the Korean authors detected a tenfold drop in resistivity (from 0.02 ohm-centimeters to 0.002 ohm-cm) seems to have been the decisive thread. Prashant Jain, a chemist at the University of Illinois Urbana–Champaign, said that was the detail that most caught his eye, because he had seen that specific temperature before: it’s the temperature at which copper sulfide (one of the impurities produced by the LK-99 synthesis process) undergoes a phase transition. Below that temperature, Cu2S’s resistivity drops in a way that’s almost identical to the transition toward superconductivity the original authors attributed to LK-99.

Jianlin Luo, a physicist with the Chinese Academy of Sciences (CAS), and his team performed two experiments aimed at clarifying the role of copper sulfide. The second sample from those experiments saw its resistivity dive near 112 degrees Celsius (385 Kelvin), matching the Korean team’s observations.

But the documentation penned by the original paper’s authors (led by Lee Suk-bae) is only part of the problem: scientists currently know of no way to properly guide the synthesis process so as to increase the number of lead atoms that end up replaced by copper atoms (note: copper, not copper sulfide) within LK-99 itself. In extremely simplified terms, that substitution is the mechanism the Korean authors credited for the emergent room-temperature, ambient-pressure superconductivity in their sample. As unclear and disappointing as that might be, it’s one of the factors that has to be taken into account when looking into LK-99. It’s the scientific equivalent of the salt we’re used to sprinkling on leaks and unconfirmed reports in our hardware world.

On the theoretical front, where simulations were used to understand whether LK-99’s structure was conducive to superconductive behavior, new research from a US-European group performed precision X-ray imaging of LK-99 samples. Their observations led them to conclude that, despite those initial papers and their promising (if not definitive) outlook, LK-99’s flat bands (through which electrons were supposed to zip losslessly) weren’t conducive to superconductivity after all.

More recently, a team at the Max Planck Institute for Solid State Research in Stuttgart, Germany reported they had synthesized pure, single crystals of LK-99. Using a technique termed “floating zone crystal growth,” the researchers managed to grow LK-99 crystals free of copper sulfide impurities. The resulting pure LK-99 (with the formula Pb8.8Cu1.2P6O25) behaved in line with other studies and replication attempts: like an insulator, not a superconductor. These pure, purple samples also showed ferromagnetism (expected, given iron impurities that couldn’t be fully eliminated) as well as diamagnetism. As the team wrote in its paper, the data led them to conclude that, once separated from its impurities, LK-99 simply isn’t a superconductor.

While the title of the DOI-infused Nature piece unapologetically reads “LK-99 isn’t a superconductor,” the first sentence in the article’s body leaves room for the possibility: “Researchers seem to have solved the puzzle of LK-99.” (Emphasis ours.) Nature, apparently, isn’t above punchy headlines, but in science, there’s always more studying to be done. The full article is worth a read, if only to go over all of the evidence involved in the saga.

And perhaps that’s for the better. Owing to the gaps in the original paper’s data and the difficulty of replicating LK-99, there are still holdouts in the scientific community who don’t think the LK-99 saga is over yet.

Raspberry Pi Camera Instantly Develops Pictures into Digital Frame

It’s always cool to see Raspberry Pi projects that recycle old hardware, and this project, created by maker and developer Max Van Leeuwen, is a fine example of how you can not only restore a broken device with a single-board computer, but also add new features it couldn’t have had before. He’s used a couple of Raspberry Pis to transform a non-functioning Polaroid camera into one that instantly develops pictures to a remote digital picture frame.

The Pi turns the old analog camera into a digital camera that works with the Internet of Things. It communicates over Wi-Fi with the digital picture frame, which is also fitted with a Pi. As long as the two devices are connected to the internet, the camera can send pictures to the remote frame the instant they’re taken.

The frame was created as a gift for Van Leeuwen’s grandmother, so he paid extra attention to ensuring the end result was well finished. The frame features an E Ink screen, so it will retain the last image even if power is lost. The grainy image quality also adds a little bit of an old-school effect.

The hardware used in this project consists of an old Polaroid camera that’s been fitted with a Raspberry Pi 3A+. The Pi inside has a battery for portability as well as a camera module for capturing images. The frame also uses a 3A+ and features a 7-color E Ink panel with a resolution of 600 x 448 pixels.

Van Leeuwen was kind enough to make the project open source, so all of the code used in its development is available on GitHub to explore. It’s primarily driven by custom Python scripts, covering both the image-capture side of the Polaroid camera and the “developing” features of the digital frame.
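His actual scripts are the ones on GitHub, but purely as an illustration of the capture-and-send idea, a minimal sketch could look something like the following. The picamera2 calls are standard, while the frame’s upload endpoint is a placeholder of our own invention:

```python
# Illustrative sketch only -- not Van Leeuwen's code (see his GitHub repo).
# Assumes a Pi camera module and a frame Pi serving a hypothetical
# HTTP upload endpoint at FRAME_URL.
from picamera2 import Picamera2
import requests

FRAME_URL = "http://picture-frame.local:8080/upload"  # placeholder address

picam2 = Picamera2()
picam2.configure(picam2.create_still_configuration())
picam2.start()
picam2.capture_file("shot.jpg")  # snap the "Polaroid"

# Hand the photo to the frame's Pi, which "develops" it on the E Ink panel.
with open("shot.jpg", "rb") as f:
    requests.post(FRAME_URL, files={"photo": f}, timeout=30)
```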

If you want to read more about this Raspberry Pi project, check out the full project breakdown over at Van Leeuwen’s blog. There he details the construction process in greater detail and shows it in action. Be sure to follow him for more cool projects as well as any future updates on this one.

Stable Diffusion Optimized for Intel Silicon Boosts Arc A770 Performance by 54%

Automatic1111’s Stable Diffusion WebUI now works with Intel GPU hardware, thanks to the integration of Intel’s OpenVINO toolkit that takes AI models and optimizes them to run on Intel hardware. We’ve re-tested the latest release of Stable Diffusion to see how much faster Intel’s GPUs are compared to our previous results, with gains of 40 to 55 percent.

Stable Diffusion (we have previous test results, though we’re working on updating them) is a deep-learning AI model used to generate images from text descriptions. What makes Stable Diffusion special is its ability to run on local consumer hardware. The AI community has plenty of projects out there, with Stable Diffusion WebUI being the most popular. It provides a browser interface that’s easy to use and experiment with.
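Our benchmarks use the Automatic1111 WebUI fork itself, but if you want a feel for what OpenVINO-accelerated Stable Diffusion looks like in code, Hugging Face’s optimum-intel package gets you most of the way there. Here’s a minimal sketch (the model ID and device string are illustrative, and this isn’t the WebUI fork’s internal code):

```python
# pip install "optimum[openvino]"
# Minimal OpenVINO Stable Diffusion sketch via optimum-intel; illustrative
# only, not the WebUI fork's internal integration.
from optimum.intel import OVStableDiffusionPipeline

# export=True converts the PyTorch weights to OpenVINO IR on first load.
pipe = OVStableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", export=True
)
pipe.to("GPU")  # target an Intel GPU such as an Arc card; omit for CPU

image = pipe("messy room", num_inference_steps=20).images[0]
image.save("messy_room.png")
```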

After months of work in the background (we’ve been hearing rumblings of this for a while now), the latest updates are now available for Intel Arc owners and provide a substantial boost to performance.


Here are the results of our previous and updated testing of Stable Diffusion. We used a slightly tweaked Stable Diffusion OpenVINO build for our previous testing, and have retested with the OpenVINO fork of the Automatic1111 WebUI. We also retested several of AMD’s GPUs with a newer build of Nod.ai’s Shark-based Stable Diffusion. The Nvidia results haven’t been updated, though we’ll look at retesting with the latest version in the near future (and will update the main Stable Diffusion benchmarks article when we’re finished).

We should note that we also changed our prompt, which makes the new results generally more demanding. (The new prompt is “messy room,” which tends to have a lot of tiny details in the images that require more effort for the AI to generate.) There’s variation between runs, and there are caveats that apply specifically to Arc right now, but here are the before/after results.

(Image credit: Future)

(Image credit: Future)

The Intel Arc and AMD GPUs all show improved performance, with most delivering significant gains. The Arc A770 16GB (Limited Edition, now discontinued) improved by 54%, while the A750 improved by 40% in the same scenario.

Nod.ai hasn’t been sitting still either. AMD’s RX 6800, RX 6750 XT, and RX 6700 10GB are all faster, with the 6800 and 6700 10GB in particular showing large gains: the RX 6800 saw a 34% performance boost, and the RX 6700 10GB an even greater 76% improvement. The RX 6750 XT, for some reason, only saw a measly 9% increase, even though all three AMD GPUs share the same RDNA2 architecture. (We’ll be retesting other GPUs, including AMD’s newer RX 7000-series parts, in the near future.)

Again, we did not retest the three Nvidia RTX 40-series GPUs, which is why the performance statistics remain identical between the two graphs. Even so, with the new OpenVINO optimizations, Intel’s Arc A750 and A770 are now able to outperform the RTX 4060, and the A770 16GB is close behind the RTX 4060 Ti.

There’s still plenty of ongoing work, including making the installation more straightforward, and fixes so that other image resolutions and Stable Diffusion models work. We had to rely on the “v1-5-pruned-emaonly.safetensors” default model, as the newer “v2-1_512-ema-pruned.safetensors” and “v2-1_768-ema-pruned.safetensors” failed to generate meaningful output. Also, 768×768 generation currently fails on Arc GPUs — we could do up to 720×720, but 744×744 ended up switching to CPU-based generation. We’re told a fix for the 768×768 support should be coming relatively soon, though, so Arc users should keep an eye out for that update.

TP-Link Archer AXE75 WiFi 6E Router Review: Mediocre Speeds, Phone-Only Features

WiFi 6E routers, even downmarket ones, almost always cost more than routers that only support the WiFi 6 protocol (after all, you’re getting tri-band rather than dual-band), but you can still find one for less than $200 that offers usable performance.

TP-Link’s Archer AXE75 fits into the budget WiFi 6E router category, costing $193 at press time. For that price, you get a mixed bag: solid 5-GHz throughput, acceptable 2.4-GHz performance and 6-GHz performance that’s no faster than the 5-GHz band. It’s a usable product, but nothing to beam home about.

WiFi 6E routers seem to run large, and the TP-Link Archer AXE75 is no exception: it measures 10.7″ × 5.8″ × 1.9″ (not counting its six foldable antennas). It’s roughly the size, and shape, of a small tray.

The top of the router almost completely consists of venting—in a pattern reminiscent of parquet. The exception is a raised glossy strip approximately three inches wide going across almost a third of the device at a diagonal. This bump doesn’t seem to serve much of any purpose—even aesthetically; it appears haphazard.

(Image credit: Tom’s Hardware)

Other aspects of the physical design also appear questionable. There is a USB 3.0 port (we’re pleased it’s there), but instead of sitting on the back of the device with all the other ports and important bits, it’s on the side. Indeed, it took us longer than we’d like to admit to realize there even was a USB port on the device.

(Image credit: Tom’s Hardware)

Again, though, there are nice features—even beyond the USB port. The back of the device features an LED power button (a nice touch for those who tire of unnecessary bright lights on electronics), a WPS button, and a WiFi on/off button.

(Image credit: Tom’s Hardware)

Beyond that, the back of the device has four Ethernet ports, a WAN port, and a power button.

Processor: 1.7 GHz quad-core processor
Operating Frequency: 2.4 GHz / 5 GHz / 6 GHz
Data Rates: 2.4 GHz: 574 Mbps (802.11ax); 5 GHz: 2402 Mbps (802.11ax); 6 GHz: 2402 Mbps (802.11ax)
Ports: (4) gigabit Ethernet ports, (1) gigabit WAN port, (1) USB 3.0 port
Encryption: WPA, WPA2, WPA3, WPA/WPA2-Enterprise (802.1x)
Wi-Fi Technology: Beamforming, High-Power FEM, OFDMA, Airtime Fairness, DFS
Dimensions: 10.7″ × 5.8″ × 1.9″ (not counting antennas)
Price: $179.00+

As with most routers these days, the TP-Link Archer AXE75 can be set up via a downloadable smartphone app (“Tether”) or a web interface. Oddly enough, we could only get the web interface to work over WiFi. In theory, we see no reason why an Ethernet setup should fail, but we could not get our Ethernet client connected and recognized until after we had already set up the router.

During the setup process, along with the standard steps of setting login information, the user is prompted to set a two-hour timespan for the router to download and install automatic updates. (We selected 3am to 5am.)

Speaking of which, we were also prompted to update the router’s firmware. During the updating and rebooting process (which lasted a bit longer than we would have thought), the screen popped up repeated “operation failed” notices—which we cannot account for. In the end, however, we were told that the firmware updated successfully.

We like options when it comes to router features. Unfortunately, the TP-Link Archer AXE75 doesn’t give a lot of choice; many of its features are only accessible via the Tether smartphone app. This includes Alexa integration, QoS, parental controls, network-protection tooling, security and performance scans, and reporting. While we are pleased that these features are offered, we are disappointed for both flexibility and data-privacy reasons that they can’t be accessed via the web.

Even when you download the app, you do not get complete feature access. The router and app only come with the basic functionality of TP-Link’s “HomeShield Basic” plan. For more features, you’ll have to subscribe to HomeShield Pro for either a monthly or annual fee; in exchange, you’ll get extras such as enhanced parental controls (including time-limit features), security filters, and DDoS protection.

(Image credit: Tom’s Hardware)

Speaking of mobile devices, like some other routers, the Archer AXE75 is compatible with TP-Link’s OneMesh suite for creating a mesh network.

(Image credit: Tom’s Hardware)

Not all of the TP-Link Archer AXE75’s features are app-only, however. Many other basics remain available, like OFDMA, port forwarding, port triggering, creation of guest networks (which can be encrypted or unencrypted, as you desire), firewalls, VPN setup/management, router-settings backup and recovery, IPTV and multicast, and integration with Apple Time Machine.

There is also some access-control functionality even in the absence of the smartphone app—allowing you to block blacklisted devices or allow only whitelisted devices.

(Image credit: Tom’s Hardware)

To help ensure that you can take better advantage of the 6 GHz band, you can also use the preferred scanning channel (PSC) setting to keep higher-connectivity channels reserved for 6 GHz.

The TP-Link Archer AXE75 claims to offer maximums of 574 Mbps on the 2.4 GHz band and 2402 Mbps on each of the 5 GHz and the 6 GHz bands. The fine print on the packaging makes clear, however, that these are but theoretical maximums based on the IEEE Standard 802.11 specifications. We tested accordingly and found largely mediocre speeds.

We conducted tests repeatedly throughout the course of two weekdays in a single-family house with a 1,200-Mbps connection, using a laptop with a Realtek 8852CE network adapter as the client and another PC, attached via Ethernet, as the server to receive traffic. We used iPerf to test throughput and ping to test latency (a rough sketch of this kind of harness follows the list below). Four sets of tests were conducted for each band:

Near uncongested: Testing laptop approximately 7 feet away from the router, no substantial traffic being carried across other devices

Far uncongested: Testing laptop approximately 25 feet away from the router, no substantial traffic being carried across other devices

Near congested: Testing laptop approximately 7 feet away from the router; videos streaming on four devices throughout the house

Far congested: Testing laptop approximately 25 feet away from the router; videos streaming on four devices throughout the house
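For anyone who wants to replicate this sort of testing, the harness boils down to iperf3 client passes plus ping runs against an Ethernet-attached server. Here’s a rough, Linux-flavored Python sketch with a placeholder server address; it isn’t our exact tooling:

```python
import json
import subprocess

SERVER = "192.168.0.10"  # placeholder: Ethernet-attached PC running `iperf3 -s`

def throughput_mbps(seconds: int = 30) -> float:
    """Run one iperf3 client pass and return received throughput in Mbps."""
    out = subprocess.run(
        ["iperf3", "-c", SERVER, "-t", str(seconds), "-J"],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["end"]["sum_received"]["bits_per_second"] / 1e6

def latency_summary(count: int = 20) -> str:
    """Ping the server and return the rtt min/avg/max summary line."""
    out = subprocess.run(
        ["ping", "-c", str(count), SERVER],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.strip().splitlines()[-1]

if __name__ == "__main__":
    print(f"throughput: {throughput_mbps():.1f} Mbps")
    print(f"latency:    {latency_summary()}")
```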

Here are the results we recorded from our testing:

We found it extremely curious that, for the most part, congested performance was better than or comparable to uncongested performance under similar circumstances. Barring luck, or some odd technology that improves the more it is taxed, we suspect this may be a matter of unequal distribution of signal strength. (We estimate a roughly 25 to 30 degree difference in angle between our “near” testing spot and our “far” testing spot.) To this end, it is worth noting the 3% packet loss we experienced in our near-uncongested test on the 6 GHz band.

It should also be noted that mean ping averages were skewed, in some cases, by sudden wild swings in latency, sometimes by as much as nearly half a second. This was primarily the case with our “far” tests, both congested and uncongested.

On the 5 GHz band, which will be your main method of connecting WiFi 6 and WiFi 5 devices, the Archer’s performance was middle of the pack, pulling in just behind the Tenda RX27 Pro in throughput on both congested and uncongested tests. Latency was solid without congestion but near the bottom of the barrel under congested conditions.

On the 6 GHz band, which you can only use with WiFi 6E devices, the Archer AXE75 provided mediocre throughput, and latency was about the same as on 5 GHz. So the best use of this band on this router is to reserve it for your one flagship device, perhaps your gaming PC, to make sure it has no competition for signal.

Except in the far-congested testing, performance on the 2.4 GHz band was decent. Then again, if you’re shelling out the money for a WiFi 6E router right now, you’re probably less concerned with the 2.4 GHz band.

Bottom Line

For an entry into WiFi 6E, a roughly $200 price tag isn’t bad. But a question lingers: Why would you bother entering—at least, via this particular router?

Throughput performance on every band – 6 GHz, 5 GHz and 2.4 GHz – is solidly middle of the pack. Worse, latency is pretty high relative to the competition under most scenarios.

Moreover, many of the TP-Link Archer AXE75’s features are inaccessible via the web interface, and, in some cases, inaccessible without a paid subscription. Many (we daresay most) of these same features are available on competitors both for free and via the user’s choice of web or app.

The Tenda RX27 Pro offers strong performance and unlocked features, and costs about $80 less at press time, making it a far better choice overall. MSI’s RadiX has far better performance and features but costs $50 to $60 more. The Archer AXE75 might be worth considering if it were on a massive sale, but even then, there are better budget WiFi 6E routers that would likely still cost less.

Tape Storage Cheaper and Less Polluting than HDDs, SSDs: Research

Organizations transitioning cold data storage from hard disks to tape could save a lot of money and cut carbon emissions, according to IEEE Spectrum’s analysis of various bodies of research. The report also cites statements and guidance from Brad Johns, an IBM data storage business veteran and data storage consultant.

Research by IDC (PDF, 2019) indicates that 60% of all data would be a good candidate for magnetic tape storage – in other words, it would usually be classified as cold data. However, only an estimated 15% of all data is kept on tape. Two key advantages of tape over HDD are its much longer lifespan and much lower energy consumption.

The need for data storage is increasing rapidly, with organizations and people commonly generating more, and bigger, files every year. However, research indicates that the files stored by governments, academia, and businesses, as well as busy YouTubers, TikTokers, and Instagrammers, are not necessarily stored in the most efficient way.

We like to look in awe at ultra-modern, new-fangled tech solutions for geeky thrills, but sometimes the oldies are goodies. A case in point is storage technology. We have recently covered breakthroughs in DNA storage, Nanofiche, glass, 5D data cubes, and others, but HDDs remain, and optical discs keep re-appearing. Now the idea that magnetic tape should be more prevalent is gaining momentum. HDDs aren’t going extinct despite some companies’ wishful thinking, and tape storage is rolling along quite confidently in the 2020s.

(Image credit: Shutterstock)

According to the IEEE report’s sources, HDDs have a working life of about five years, and produce about 2.55 kg of CO2 per terabyte per year. Compare those figures to the same metrics for tape storage: tape is claimed to have a 30 year lifespan, and produce just 0.07 kg of CO2 per terabyte per year.

The lifespan and CO2 figures show a wide gulf, which organizations can use to their advantage, but it doesn’t end there. Device costs and eWaste statistics also help make the case for cold storage tape systems. In the IEEE report an example is put forward where a data center needs to store 100 petabytes of data for 10 years. Using HDDs would result in 7.4 metric tons of eWaste, but tape storage tech could cut that by more than half, to 3.6 metric tons of eWaste.

Some total cost of ownership (TCO) figures are also put forward to help make the argument for cold storage on tape. Johns calculates that in the 100 PB example, the HDD based data center would have a TCO of $17.7 million over the decade. Meanwhile, the tape storage-based facility could reduce that to about $9.5 million. Money talks.
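To put the cited per-terabyte emission figures on the same 100 PB, 10-year footing, here’s our own back-of-the-envelope arithmetic (not a calculation from the report):

```python
# Back-of-the-envelope CO2 totals for the report's 100 PB, 10-year example,
# using the cited per-TB-per-year emission figures.
CAPACITY_TB = 100_000  # 100 PB
YEARS = 10

HDD_KG_PER_TB_YEAR = 2.55
TAPE_KG_PER_TB_YEAR = 0.07

hdd_t = CAPACITY_TB * YEARS * HDD_KG_PER_TB_YEAR / 1000   # kg -> metric tons
tape_t = CAPACITY_TB * YEARS * TAPE_KG_PER_TB_YEAR / 1000

print(f"HDD:  {hdd_t:,.0f} metric tons of CO2")   # 2,550 t
print(f"Tape: {tape_t:,.0f} metric tons of CO2")  # 70 t
```

On those numbers, the emissions gap works out to roughly 36x in tape’s favor, dwarfing even the eWaste and TCO differences.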

Companies haven’t been blind to the enduring economic appeal of tape, the source report suggests. However, bigger organizations with more resources have been better able to make the transition, as correctly classifying and sorting cold data takes “time, money, and effort,” it observes.

With momentum seemingly behind its adoption for cold storage, and obvious advantages over HDDs for this data storage model, tape is set to retain a compelling advantage “probably for the next decade,” reckons Johns. That is, unless one of the aforementioned exciting storage breakthroughs ever makes it out of R&D labs.

Razer BlackWidow V4 75% Review: It’s What’s Inside That Counts

It seems like 2023 is the year gaming companies have decided to get serious about keyboard customizability — physical customizability, that is. The Razer BlackWidow V4 75% may not look like anything out of the ordinary (in fact, it looks almost exactly like a smaller version of the BlackWidow V4 Pro), but this is actually Razer’s first hot-swappable gaming keyboard.

The BlackWidow V4 75% is a wired mechanical gaming keyboard with a compact 75-percent layout and a detachable, padded leatherette wrist rest. For keyboard enthusiasts, it has an aluminum top plate, a hot-swappable PCB, and a gasket-mounted design with two layers of sound-dampening foam; for gamers, it has N-key rollover, polling rates of up to 8,000 Hz, and bright, customizable per-key RGB (with side underglow).

The BlackWidow V4 75% is available now in black, with Razer’s orange tactile switches, for $189.99. A white version will be released in mid-September, and will cost an extra $10 ($199.99). Razer is also selling standalone 36-packs of its mechanical switches (orange/tactile, green/clicky, yellow/linear) for $24.99 each, but the keyboard will only come with orange switches installed. 

 Design and Construction of the BlackWidow V4 75% 

(Image credit: Tom’s Hardware)

At a glance, the BlackWidow V4 75% looks like a smaller version of the BlackWidow V4 Pro — wrist rest and all. It doesn’t have the V4 Pro’s triple-side underglow, nor does its underglow extend to the wrist rest when connected, but otherwise the new V4 75% shares a very similar overall design, housed in a sturdy black ABS plastic chassis with a matte black aluminum alloy top plate, and machined metal media keys/roller.

The BlackWidow V4 75% has a compact 75-percent layout, which is slightly smaller than a TKL layout, with a single column of four navigation keys (versus the TKL’s three-column, two-row cluster of six). While many 75-percent keyboards leave some space between the main keys, arrow keys, and navigation keys, the V4 75% does not — the navigation keys and arrow keys are right next to the main keys. This shaves a few millimeters off the keyboard’s overall length: the BlackWidow V4 75% measures 12.6 inches (321mm) long, versus the Asus ROG Azoth’s 12.83 inches (326mm). It’s a bit wider than other keyboards — 6.1 inches (155.5mm) — thanks to a slope at the bottom that allows the wrist rest to nestle up against it.

It’s sturdily constructed but not overly heavy, weighing 1.8lbs (815g) — not nearly as heavy as the Azoth (2.61lbs / 1186g), but a little heavier than the SteelSeries Apex Pro TKL Wireless (1.65lbs / 747g). On the bottom, you’ll find four rubber non-slip grips and two sets of flip-out feet, which add an extra 6 or 9 degrees of tilt to the back of the keyboard.

Razer BlackWidow V4 Pro (Image credit: Tom’s Hardware)

The BlackWidow V4 75% comes with Razer’s doubleshot ABS keycaps, which are full-height, lightly textured, and have shine-through primary legends and printed (white) secondary legends. While I had no issues with the keycaps in my testing, I’ve found that Razer’s ABS keycaps tend to start showing wear within the first few months of use. 

I’ve been using the BlackWidow V4 Pro intermittently since it launched six months ago, and several of the keycaps are already shiny from wear. While shininess is something that tends to come pretty quickly on ABS keycaps, it seems to come quicker on Razer’s.

In the upper right corner, the BlackWidow V4 75% features a volume roller and two media keys, all made of machined metal. Interestingly, the right media key, which has a mute symbol etched into it, can be reprogrammed using Razer’s Synapse 3 software. But the left media key, which has a generic circle etched into it, cannot be reprogrammed — this media key is set to play/pause on a single tap, skip to the next track on a double-tap, and go back to the previous track on a triple-tap.

The BlackWidow V4 75% comes with a detachable magnetic leatherette wrist rest, which is padded and has a woven-textured surface. The wrist rest looks like a smaller version of the one that comes with the BlackWidow V4 Pro, but simpler — it doesn’t have a connection point to transfer the keyboard’s underglow, because the V4 75% doesn’t have full underglow (it has underglow on either side, but not along the bottom).

(Image credit: Tom’s Hardware)

Also in the box: a 6.5-foot (2m) detachable USB-C to USB-A cable, and a combination keycap/switch puller.

Specs

Switches: Razer Orange (Tactile)
Lighting: Per-key RGB, underglow (sides)
Onboard Storage: Yes (5 profiles)
Media Keys: Yes, volume roller
Game Mode: Yes
Connectivity: USB-C
Additional Ports: 0
Keycaps: Doubleshot ABS
Construction: Aluminum top plate
Software: Synapse 3
Dimensions (LxWxH): 12.6 x 6.1 x 0.94 inches / 321 x 155.5 x 24mm
Weight: 1.8lbs / 815g
MSRP / Price at Time of Review: $189.99 / $189.99
Release Date: Aug. 17, 2023

Typing and Gaming Experience on the BlackWidow V4 75%

The BlackWidow V4 75% is a wired 75-percent gaming keyboard with N-key rollover, a polling rate of up to 8,000 Hz, and Razer’s third-gen orange tactile mechanical switches. In a bid to appeal to the keyboard enthusiast side of gamers, the V4 75% also features a tape-enhanced hot-swappable PCB (which accepts both 3- and 5-pin switches) and a gasket-mounted design with two layers of dampening foam. 

The BlackWidow V4 75% comes with Razer’s third-gen orange tactile mechanical switches, which have an actuation force of 50g, an actuation point of 2mm, and a full travel distance of 3.5mm. These switches have a mild tactile bump but are relatively quiet otherwise, and feel similar to Cherry MX Brown switches — albeit slightly heavier, as the MX Browns have an actuation force of 45g. Razer’s switches are rated for 100 million keystrokes and have a box-style cross stem that’s compatible with most third-party keycaps. 

Typing on the BlackWidow V4 75% felt and sounded better than expected — especially if you’re coming from a mainstream gaming keyboard (such as any of Razer’s previous keyboards). The tactile switches combined with the gasket-mounted FR4 plate made for a springy-but-quiet typing experience, even if Razer’s orange switches (and all of Razer’s switches) are slightly stiffer than I prefer. The V4 75% and the V4 Pro sound vastly different — and the V4 Pro doesn’t sound terrible, but you can absolutely hear the tape-modding and sound-dampening foam at work in the V4 75%. 

(Image credit: Tom’s Hardware)

That said, is it the best-sounding keyboard I’ve used recently? Not exactly. I definitely noticed some stabilizer rattle — not in the spacebar, so much, but in the enter and backspace keys — despite the V4 75%’s factory-lubed, plate-mounted stabilizers. But that’s pretty nit-picky; overall, the V4 75% sounds better than 99% of mainstream gaming keyboards on the market. 

As for gaming, the BlackWidow V4 75% performed — as expected — very well, with no latency or lag, though the orange switches’ tactile bump and slightly-heavier actuation force may start to fatigue your fingers if you’re used to smooth, lightweight linear optical switches. 

The keyboard has a default polling rate of 1,000 Hz, but you can bump this up to 8,000 Hz in Razer’s Synapse 3 software. This is a wired keyboard so there’s no concern about a higher polling rate eating up battery life, but it does use more processing power and Razer warns that an 8,000 Hz polling rate “may result in reduced frame rate when playing CPU bound games.” I didn’t notice any reduced frame rates in my testing, and I’ve mostly been playing Baldur’s Gate 3, which is CPU-heavy.

Features and Software of the BlackWidow V4 75%

The BlackWidow V4 75% works well out of the box, but can be configured using Razer’s Synapse 3 software. While anyone who knows me knows that I hate Synapse 3, I didn’t hate it quite as much this time — probably because there’s not quite as much to customize on this keyboard.

You can use Synapse 3 to remap keys (two layers, thanks to Razer’s “HyperShift” duplication tech), adjust the polling rate, and configure the keyboard’s per-key RGB and side underglow, either with preset quick effects or using Razer’s unnecessarily advanced Chroma Studio. As I mentioned earlier, the media keys are sort of programmable — you can program the roller (up/down) and the mute key (single press only), but you cannot program the play/pause key. 

It’s a little strange that the programmable mute key doesn’t get at least as many tap layers as the play/pause key has, but perhaps Razer will address that in one of its future hourly Synapse 3 updates.

 The Bottom Line 

The BlackWidow V4 75% is a pretty solid offering for Razer’s first hot-swappable keyboard, but it’s far from perfect. I’m a little disappointed that it looks so… boring, considering it’s Razer’s foray into a new category — the chunky black on black on black just isn’t doing it for me. Also, while it is fairly (physically) customizable once you get your hands on it, it’s disappointing that it’s not offered with different switch or keycap options (Razer doesn’t make its PBT keycaps in a 75-percent layout and doesn’t plan to at the moment), and that the white version is more expensive (I know it’s only $10, but still).

But for $189.99, it’s one of the better-sounding gaming keyboards you’ll find. I’m personally a bigger fan of the sound of the Asus ROG Strix Scope II 96 Wireless ($179.99), but if you’re looking for something smaller than an almost-full-size keyboard, the BlackWidow V4 75% is a good choice.

MORE: Best Gaming Keyboards

MORE: How to Pick Keycaps for Your Mechanical Keyboard

MORE: How to Build a Custom Mechanical Keyboard

Magewell’s New M.2 Capture Cards Are Fit for Mini-ITX Streaming PCs

Capture card manufacturer Magewell has unveiled two highly capable 4K capture cards sporting a most unusual M.2 form factor. The new cards are known as the Eco Capture HDMI 4K Plus M.2 and the Eco Capture 12G SDI 4K Plus M.2. As the names suggest, one model is designed to work with HDMI connectors, while the other works with SDI, a video connector used by the professional video production industry.

M.2 capture cards are not something you often hear about, but the ultra-compact form factor has many advantages. One of the biggest is integration with newer motherboards that sacrifice most of their smaller PCIe x8, x4, and x1 slots for M.2 slots. In these cases, having an M.2 capture card instead of a traditional half-height or full-height PCIe card can be really effective, especially if a system is already using its remaining one or two standard PCIe slots for graphics, audio, and/or Ethernet cards.

Another use is with Mini-ITX motherboards, which have even fewer PCIe slots than ATX and micro-ATX motherboards. In cases where the sole PCIe x16 slot is already occupied, M.2 is the only way to add additional PCIe devices to the system. Lots of Mini-ITX boards have more than one M.2 slot as well, allowing users to build a full Mini-ITX system without sacrificing M.2 storage.

With these capture cards, streamers, video enthusiasts, and professionals can offload all their video encoding and processing to the capture card, freeing up CPU and GPU resources and improving image quality in some cases (depending on how slow or old the CPU or GPU is). Most people will find the built-in encoders in the best GPUs and best CPUs more than adequate, but a dedicated capture card can still be beneficial for highly demanding setups that require more processing power than a built-in CPU/GPU encoder can provide.

The M.2 cards themselves come in a 22 x 80 mm form factor, the same footprint as 2280-sized M.2 SSDs. Both cards come with a green PCB and feature a large black cooling solution on top, with a very tiny fan actively cooling the encoding chip underneath. Since the M.2 standard does not feature any external ports, the cards need to be used with a special adapter that connects the card to a full-sized HDMI or SDI connector.

According to Magewell, these two new cards offer double the frame rate of their predecessors, capturing at 60 FPS in resolutions of up to 4096×2160 (i.e., 4K). Both cards are compatible with Windows and Linux and support native video APIs like DirectShow, DirectKS, V4L2, and ALSA. Plus, they also support high-quality upscaling, downscaling, cross-scaling, and color space conversion.
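Because the cards expose standard APIs like V4L2 on Linux, grabbing frames shouldn’t require vendor-specific code. As a rough illustration (untested against these particular cards, and the device index will vary per system), OpenCV’s V4L2 backend would look something like this:

```python
import cv2

# Open the capture card as a V4L2 device; index 0 is a guess and will
# vary depending on what else is attached to the system.
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 3840)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 2160)
cap.set(cv2.CAP_PROP_FPS, 60)

ok, frame = cap.read()  # grab a single frame from the card
if ok:
    cv2.imwrite("frame.png", frame)
cap.release()
```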

Pricing has not been unveiled just yet, but the Eco Capture HDMI 4K Plus M.2 is reportedly now shipping, and the Eco Capture 12G SDI 4K Plus M.2 will be available in the next two months.

Samsung to Produce 300-Layer V-NAND in 2024: Report

Samsung Electronics is poised to employ a double-stack architecture for its 9th Generation 3D NAND memory when it starts production next year, according to a DigiTimes report that cites Seoul Economic Daily. This sets Samsung apart from SK Hynix, which will use three stacks of NAND to build its 321-layer 3D NAND devices when they enter mass production in the first half of 2025.

Samsung’s 9th Generation V-NAND, with over 300 layers, is set to hinge on the double-stack technique, which Samsung first embraced in 2020 with its 7th generation 176-layer 3D NAND chips. This method involves producing one 3D NAND stack on a 300 mm wafer and then building another stack on top of the first. Samsung’s 300-layer 3D NAND will increase the storage density produced per wafer, enabling makers to build lower-cost SSDs or make the best SSDs cheaper.

By contrast, rival SK Hynix has revealed its intent to kick off production of its 321-layer 3D NAND in 2025 using a triple-stack approach. This procedure, distinct from Samsung’s, involves creating three distinct sets of 3D NAND layers, which increases the number of steps and the usage of raw materials, but is meant to maximize yields, as it is easier to produce 3D NAND stacks with fewer layers.

Industry speculation based on leaked roadmaps suggests that, after its 9th Generation 3D NAND, Samsung might adopt a triple-stack methodology for its 10th generation 430-layer 3D NAND. Some experts told Seoul Economic Daily that surpassing 400 layers in 3D NAND would necessitate three separate stacks, possibly due to yield concerns; this will naturally increase raw material usage and cost per 3D NAND wafer.

Samsung’s long-term vision, as presented at the Samsung Tech Day 2022 last October, aspires to reach up to 1,000 layers by 2030. 

Nvidia Makes 1,000% Profit on H100 GPUs: Report

Nvidia is raking in up to 1,000% in profit for each H100 GPU accelerator it sells, according to estimates made in a recent social media post from Barron’s senior writer Tae Kim. In dollar terms, that means Nvidia’s street price of around $25,000 to $30,000 for each of these High Performance Computing (HPC) accelerators (for the least-expensive PCIe version) more than covers the estimated $3,320 cost of the chip and peripheral (on-board) components. As surfers will tell you, there’s nothing quite like riding a wave with no other boards in sight.

Kim cites the $3,320 estimated cost for each H100 chip as coming from financial consulting firm Raymond James. It’s unclear how deep that cost analysis goes, however: if it’s a matter of pure manufacturing cost (averaging the price-per-wafer and other components while taking yields into account), then there’s still a significant expense margin for Nvidia to cover with each of its sales. 

Product development takes time and resources, and considering the engineers and other participants in the development lifecycle of a product such as the H100, Nvidia’s R&D costs also have to be taken into account before a final, average product development cost can be reached. According to Glassdoor, Nvidia’s average salary for an electronics hardware engineer sits at around $202,000 per year. And that’s for a single engineer; development of chips such as the H100 likely requires thousands of hours from a number of specialized workers. All of that, too, has to be taken into account.
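Taking the report’s round numbers at face value, the raw markup arithmetic (hardware cost only, ignoring the R&D overhead discussed above) works out as follows:

```python
# Markup math on the reported figures: hardware cost only, ignoring R&D,
# software, and support. Illustrative round numbers, not Nvidia's books.
COST = 3_320  # Raymond James' estimated per-unit cost, in USD

for price in (25_000, 30_000):
    markup = (price - COST) / COST * 100
    print(f"${price:,}: ~{markup:.0f}% markup over unit cost")

# Prints ~653% and ~804%. A flat 1,000% markup would require a street
# price of roughly $36,500 on the same $3,320 cost basis.
```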

Even so, it can’t be denied that there are advantages to being the poster company for AI acceleration workloads. By all accounts, Nvidia GPUs are flying off the shelves without spending much time on the racks they’re piled on, and the report is a useful primer for anyone trying to understand the logistics behind the AI HPC boom. In practice, it seems that orders for Nvidia’s AI-accelerating products are already sold through until 2024. And with expectations of the AI accelerator market being worth around $150 billion by 2027, there’s seemingly nothing in Nvidia’s future but green.

And of course, that boom is better for some than others: due to the exploding interest in AI servers compared to more traditional HPC installations, DDR5 manufacturers have had to revise their expectations of how quickly the new memory products will penetrate the market. DDR5 adoption is now expected to reach parity with DDR4 only by Q3 2024.

The company is reaping the benefits of having built its infrastructure and product stack on the back of its tooling and bets on AI as the next big thing.

But budgets everywhere tend to have limits, and there’s also the matter of opportunity cost: investing in AI acceleration at its current prices will close some players’ doors to investments in other spaces, or limit how much risk they can take in pursuing less-safe research and development avenues.

When all is said and done, Nvidia’s bottom line, as buoyed by its AI chip sales, could now cover insane amounts of leather jackets. At least from that perspective, Nvidia CEO Jensen Huang has to be beaming left and right.

Maingear MG-1 Silver (Shroud Edition) Review: Strong Starter Rig

Entry-level builds, even budget builds, have crept up in price. Oftentimes, despite paying $1,000 or more, you can still wind up with something that feels cheap. That’s not the case with the Maingear MG-1 Silver ($1,149 to start, $1,249 as tested), a desktop that feels premium even if it has the performance of a starter rig.

The Intel Core i5-13400F and Nvidia GeForce RTX 4060 don’t have the power of some of the best gaming PCs, but the system plays games well enough, especially if you’re willing to adjust settings. We only found one component – the SSD – to be subpar. But that’s replaceable, or at least can be augmented with another one down the line.

But most importantly, you still get a fairly premium chassis, meticulous cable management, and a sense that this is a computer you can grow with. When you’re ready to spend more or learn to upgrade yourself, the case has nothing proprietary, so it will be able to grow with you.

Design of the Maingear MG-1 Silver

Maingear’s MG-1 Silver is, in many ways, a classic mid-tower, though it’s compact enough to fit on most desks. The chassis is a black aluminum box with a window on the left side panel to let you see your parts and RGB lighting. The blue power and LED buttons on top serve as small accents that you won’t see much.

At 19 x 16.88 x 8.13 inches, the Maingear MG-1 is fairly compact. The Alienware Aurora R16, newly redesigned to be a more conventional size, is 18.05 x 16.5 x 7.76 inches, which is slightly smaller.

The faceplate on the front of the Maingear system attaches magnetically and can easily be pulled off and reattached. A small connector on the bottom of the faceplate provides power to RGB lighting behind the faceplate (which creates a glowing effect around it) and in the Maingear logo. Our review unit is Maingear’s “Shroud Edition,” named after the pro esports player and streamer, so it included a black front panel with Shroud’s logo on it, as well as a limited-time “Into the Wild” faceplate based on his latest fashion line. But you can also design your own faceplate ($99 each), and Maingear tossed one it made into our box, featuring a magical space theme and the Tom’s Hardware logo.

That front panel sits in front of three 120 mm intake fans, while another fan in the back exhausts heat from the case. Our unit featured a Cooler Master Hyper 212 air cooler, but the top of the chassis has room for up to a 360 mm radiator for the all-in-one coolers in more expensive configurations. Even without fans up top, I wish there were some sort of dust filter there; in fact, I wouldn’t mind seeing dust filters by the front intake fans, either. (The only one I spotted was underneath the power supply.)

Maingear’s RGB is powered by a controller mounted on the right side of the case and can be adjusted with a remote control. While the remote let me change colors, brightness, and patterns, it only adjusted the rear fan and a light strip; the other RGB-equipped parts, like the RAM, didn’t respond to it. That led me to install MSI Center with its Mystic Light add-on for lighting control. While our motherboard doesn’t have lighting of its own, the software can control the RAM, letting me match it to the rest of the system. (And it takes a lot to make me want to use MSI Center!)

While our review unit came with a micro-ATX motherboard, there’s room in the system for a larger, full-sized ATX board.

Maingear MG-1 Silver (Shroud Edition) Specifications

Processor: Intel Core i5-13400F
Motherboard: MSI PRO B660M-A CEC WiFi DDR4
Memory: TeamGroup T-Force Delta 16GB DDR4-3600
Graphics: PNY GeForce RTX 4060 Verto (8GB GDDR6, 2,460 MHz boost clock)
Storage: 512GB Solidigm P41 Plus M.2 NVMe SSD
Networking: Intel Wi-Fi 6E AX211, Bluetooth 5.2
Front Ports: 2x USB 3.2 Gen 1 Type-A, USB 3.2 Gen 2 Type-C, 3.5 mm headphone jack, RGB LED button
Rear Ports (Motherboard): USB 3.2 Gen 2 Type-A, USB 3.2 Gen 2 Type-C, 4x USB 3.2 Gen 1 Type-A, 2x USB 2.0, Ethernet, HDMI, audio connectors
Video Output (GPU): HDMI, 3x DisplayPort
Power Supply: 600W EVGA B
Cooling: Cooler Master Hyper 212 air cooler, 4x 120mm case fans
Operating System: Windows 11 Home
Dimensions: 19 x 16.88 x 8.13 inches
Other: Shroud front panel, “Into the Wild” front panel
Price as Configured: $1,249

Ports and Upgradeability on the Maingear MG-1 Silver

On the front panel, there’s a pair of USB 3.2 Gen 1 Type-A ports and a USB 3.2 Gen 2 Type-C port, as well as a 3.5 mm headphone/mic jack. There’s also a dedicated button for switching between RGB states next to the power button.

(Image credit: Tom’s Hardware)

The back panel will depend on which motherboard you get. On the rear of the MSI PRO B660M-A CEC WiFi DDR4 in our review unit, that meant USB 3.2 Gen 2 Type-A, USB 3.2 Gen 2 Type-C, four USB 3.2 Gen 1 Type-A ports, two USB 2.0 ports, Ethernet, HDMI, and audio connectors.

(Image credit: Tom’s Hardware)

For video, that also depends on the GPU. The RTX 4060 in the Silver we tested has a single HDMI port and three DisplayPort connectors.

Opening the MG-1 is simple. The left side panel, with the glass window, is held on by two thumb screws. With those removed, it just slides off. It’s the same for the right side panel, but the entire side comes off there. Cable management is all neat and tidy; whoever built this had more patience than I do. Some of that has to do with the RGB and fan controller in the back of the system, with much of the cable mess tied and connected there, rather than in front of the motherboard.

(Image credit: Tom’s Hardware)

When you’re in, upgrading the MG-1 is much like changing parts on anything you’ve built yourself. There’s nothing proprietary here. That means you have full access to the motherboard, cooler, fans, RAM, CPU, GPU, power supply, or anything else you’d want to change within the case. Some quick upgrades you could make in our review unit include adding a second SSD (there’s another slot on our motherboard and mounts on the back for SATA drives, though with the way the power and data cables are tied up, you may have a hard time getting them hooked up) or adding more RAM. Eventually, you could replace the entire motherboard if you wanted to switch CPU platforms.

Gaming and Graphics on the Maingear MG-1 Silver

Most companies that lend gaming PCs for review don’t send the low-end configuration (presumably, they want to be affiliated only with the most powerful options). So when Maingear sent the MG-1 Silver with an Intel Core i5-13400F, 16GB of RAM, a GeForce RTX 4060, and a 512GB M.2 NVMe SSD, we had a limited test pool for comparison. Here, we’re comparing it to the recently reviewed Alienware Aurora R16, with a Core i7-13700F, 32GB of DDR5 RAM, and an RTX 4070. It’s not exactly competition, but it shows what you can get if you have an extra $600 to spend.

I used the Maingear to play Returnal, and while the game suggested low settings, it did decently on the high preset, even at 4K. With that combination of graphics settings and resolution, it still ran between 48 and 59 frames per second in early areas of the game, hitting the lower ranges in combat with aliens.

If you have a high refresh rate monitor, you may want to turn down some settings, but the MG-1 Silver performed decently in our benchmarks.

On Shadow of the Tomb Raider (highest settings), the Maingear hit 104 fps at 1080p and eked out 30 fps at 4K.

When it came to Grand Theft Auto V (very high), the game hit 95 fps at 1080p and 28 fps at 4K.

Far Cry 6 (ultra) was the Maingear’s best attempt at 4K, hitting 38 fps, but you might be more comfortable with the 91 fps at 1080p.

Red Dead Redemption 2, at medium settings, is still a challenge for many computers. The Maingear reached 71 fps at 1080p, but an unplayable 21 fps at 4K.

On Borderlands 3 (badass preset), the Maingear reached 104 fps at 1080p and a flat 30 fps at 4K.

To stress test systems, we run the Metro Exodus benchmark at RTX settings at 1080p for 15 runs, simulating about half an hour of game play. The Maingear achieved an average frame rate of 57 fps and was largely consistent within a few decimal points throughout the tests. The CPU’s six performance cores averaged 3.33 GHz, while the four efficiency cores ran at 2.62 GHz. The CPU package measured 43.1 degrees Celsius. The GPU ran at 1.9 GHz and measured 53.55 degrees Celsius.

Productivity Performance

Many people use their gaming PCs as productivity machines as well. The Intel Core i5-13400F and RTX 4060 in the Maingear rig should be plenty for most everyday tasks, though the SSD leaves something to be desired. It’s unsurprising that the Maingear lost out to the Alienware Aurora in these tests, with its more powerful 13th Gen Core i7, but it’s also a more expensive machine.

On Geekbench 6, a CPU-heavy synthetic benchmark, the MG-1 Silver earned a single-core score of 2,330 and a multi-core score of 9,743. The single-core gap between the MG-1 and the Alienware was small, but the i7, with more cores, reached a multi-core score of 16,687.

The big negative for the Maingear rig was our file transfer test. The MG-1’s 512GB Solidigm P41 Plus copied 25GB of files at 489.65 MBps, slower than we’d hope for from any PC these days. When we reviewed that drive, we said it had “largely mediocre performance.”

When we used Handbrake to transcode a 4K video to 1080p, it took the MG-1 and its Core i5 5 minutes and 24 seconds to complete the task. The more powerful, pricier Alienware did it in 3:52.

Software and Warranty on the Maingear MG-1 Silver

For better or worse, our Maingear MG-1 Silver review unit came without much software preinstalled, other than what you need for the Nvidia GPU. I do wish the RGB controller had some software. If I wanted something for any of the components, I’d need to add it myself.

That being said, there’s really no bloatware. The only real junk comes from the Windows 11 side of things, including Start menu links to install apps like WhatsApp, Netflix, Prime Video, Messenger, and Instagram.

By default, Maingear includes a 1-year warranty and “lifetime support.” You can upgrade to a 2- or 3-year warranty for $99 or $199, respectively.

Maingear MG-1 Silver Configurations

Maingear offers its MG-1 with a wide variety of configuration options, as well as a series of pre-built, ready-to-ship configs.

We tested the Maingear MG-1 Silver Shroud Edition, with an Intel Core i5-13400F, an MSI PRO B660M-A CEC WiFi DDR4 motherboard, 16GB of RAM, a GeForce RTX 4060, and a 512GB Solidigm P41 Plus M.2 NVMe SSD. The Shroud Edition moniker adds the eponymous esports player and streamer’s logo to the front panel and also includes a secondary front panel matching his latest clothing collection. All of that will cost you $1,249.

If you ditch the Shroud Edition for the regular MG-1 Silver, you’ll get the exact same PC, but without Shroud’s logo or the extra faceplate. That will save you $100 at $1,149, making it the one I would get.

There are numerous other options, including Gold (bumps up to an RTX 4060 Ti, $1,299 for standard, $1,399 for Shroud edition) up through the most expensive Ultimate edition (a Core i9-13900K, RTX 4090, 32GB DDR5 RAM, 2TB SSD, a 1,000W PSU, and a 360mm AIO liquid cooler) which starts at $4,599 (or $4,699 for the Shroud edition).

Maingear also sent an extra faceplate featuring the Tom’s Hardware logo with our loaner. A customized front panel like that costs $99, should you want to add your own look.

Bottom Line

For those who are just getting into PC gaming, or who don’t have the budget to splurge on the top of the line, the Maingear MG-1 Silver is impressive. Sure, it doesn’t have all of the power, bells, and whistles that PC gaming can offer, but it’s also less than $1,500. And because the design has no proprietary parts like some computer makers use, you’ll be able to upgrade everything you want in this case in the future. You could bump up, either through Maingear’s own options or to something like the Alienware Aurora R16, but those upgrades raise the price pretty quickly.

One note on price: I don’t know who the Shroud Edition is for. Unless you’re a superfan of Shroud, as either a streamer or esports pro, I don’t know why you’d want special-edition Shroud faceplates that add $100 to the cost, especially when you can submit your own design for the same price. Me? I’d rather put that money toward games, or a faster, roomier SSD.

The cable management is meticulous, perhaps too much so (you’ll want to get a power cable out eventually), and Maingear could add a few dust filters to the system in order to keep things cleaner in the long run. But despite being a fairly cheap gaming system, outside of a sluggish SSD, the Maingear MG-1 Silver never feels like one.

Intel To Make Further Workforce Cuts in U.S.: Report

When Intel announced major workforce cuts to certain divisions in October 2022, it was considered a drastic move. But apparently it was not enough, which is why in May 2023 the company initiated another round of layoffs in its client computing group (CCG) and its data center group (DCG). Now the company is cutting research and development personnel in California, according to a Sacramento Inno report.

Intel is laying off 140 employees in California: 89 are being let go at Intel’s Folsom campus and 51 in San Jose. There are 37 job classifications affected at the Folsom site, and the primary titles among the affected positions are ‘engineer’ and ‘architect,’ the report claims. Specifically, Intel is laying off 10 GPU software development engineers, eight system software development engineers, six cloud software engineers, six product marketing engineers, and six system-on-chip design engineers.

This latest reduction marks nearly 500 positions removed from the Folsom R&D campus in the current year, following previous cuts in January, March, and May. As of early 2022, Intel had 5,300 employees in Folsom.

Intel’s Folsom campus has been used for a variety of R&D activities, including development of SSDs, graphics processors, software, and even chipsets. Intel sold off its 3D NAND and SSD division in 2021, and by now it has either transferred the relevant specialists to Solidigm or let them go. Now it is laying off GPU specialists, which is a bit surprising, as the company’s GPU software is far from perfect. Perhaps the company wants to move certain positions to locations with a cheaper workforce, but Intel has yet to comment on the matter.

In correspondence with state officials, Intel mentioned the possibility of relocating affected employees within the company. A spokesman for Intel also noted that Intel retains over 13,000 staff in California and remains committed to investing in fundamental areas of its business, notably its manufacturing operations within the U.S. Meanwhile, the majority of Intel’s U.S. production takes place in Arizona, Oregon, and New Mexico.


Save $50 on This Elgato Capture Card: Real Deals

Whether you want to record some footage off of a game console or stream from a dedicated streaming PC, the Elgato 4K60 Pro could be your answer. Currently, you can save a whopping $50 on this capture card and pick up the Elgato 4K60 Pro for $199 on Amazon. This is the Mk.2 variant, and it features low-latency 240Hz passthrough to make your streams and recordings as smooth as possible.

A great price for a QHD 32-inch monitor sees the LG UltraGear 32GN600-B gaming monitor on sale at Walmart for $209. A fast 165Hz refresh rate, good color representation, HDR10, and AMD FreeSync support make this monitor a decent option for a well-priced main gaming display.

Not too large at just 28 inches, the 4K Gigabyte M28U is reduced by $100 to only $429 at Newegg. With great pixel density thanks to its size and a speedy 144Hz refresh rate, this monitor also sports HDMI 2.1 support and a KVM switch for easily connecting to different devices.


Sabrent External SSD Delivers 1TB At 2.7 GB/s For $199

Sabrent has unleashed the company’s new Rocket Nano XTRM (SB-XTMN-1TB) to rival the best external SSDs. Designed to take advantage of Thunderbolt 3, the external SSD offers consumers 1TB of storage with transfer speeds of up to 2,700 MB/s. The Rocket Nano XTRM 1TB retails for $199.99 on Amazon. 

The Rocket Nano XTRM measures 2.7 x 1.1 x 0.5 inches and weighs 2.2 ounces. It’s slightly smaller than your typical bank card. The aluminum body helps keep the weight down while providing heat dissipation benefits. The lightweight and compact design lets you take the external SSD anywhere, including inside your shirt pocket. A handy optional silicone sleeve is available to keep the external SSD safe from accidental drops and harsh environments.

The Rocket Nano XTRM uses a USB Type-C connection, allowing it to connect to your desktop PC, laptop, smartphone, tablet, or any other device with a USB-C port. It’s the preferred interface where the Thunderbolt 3-certified external SSD can hit its advertised 2,700 MB/s transfer speeds. The quoted performance shows that the Rocket Nano XTRM is a worthy rival for the SanDisk Pro-G40, which we consider the best portable SSD.

The Rocket Nano XTRM doesn’t strictly require a Thunderbolt 3 port. It is backward compatible with slower USB ports, such as USB 3.2 Gen 2x1, though that caps transfer speeds at around 900 MB/s. Sabrent includes two cables with the Rocket Nano XTRM: a USB Type-C to USB Type-C cable and a USB Type-C to USB Type-A cable. Regarding compatibility, the external SSD works right out of the box on Windows and macOS operating systems without additional drivers or power cables.
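
As a back-of-the-envelope illustration of what that interface difference means in practice (the 100GB file size below is an arbitrary example, not a figure from Sabrent), here is a minimal sketch of the transfer times at the two quoted speeds:

```python
# Rough transfer-time comparison at the two interface speeds quoted above.
# The 100 GB file size is an arbitrary example for illustration.
FILE_SIZE_GB = 100
SPEEDS_MBPS = {
    "Thunderbolt 3 (quoted peak)": 2700,  # MB/s
    "USB 3.2 Gen 2x1 fallback":     900,  # MB/s
}

for interface, speed in SPEEDS_MBPS.items():
    seconds = FILE_SIZE_GB * 1000 / speed  # treating 1 GB as 1000 MB
    print(f"{interface}: ~{seconds:.0f} seconds for {FILE_SIZE_GB} GB")
```

At the quoted peak, that hypothetical 100GB transfer takes roughly 37 seconds; over the USB fallback, it stretches to nearly two minutes.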

While the Rocket Nano XTRM 1TB retails for $199.99 on Amazon, Sabrent is running a special introductory sale where it can be yours for $169.99 at the company’s store. The drive comes with a one-year warranty, and registering your product with Sabrent extends the warranty period to three years.

AMD Ryzen 5 7500F Now Available in the U.S. and Europe

When AMD launched its six-core Ryzen 5 7500F last month, it released it at retail only in China and said the chip would come to other countries through system integrators, meaning end-users would only be able to purchase it inside pre-built desktop PCs. It seems those limitations didn’t hold, and the cheapest Zen 4-based CPU is now available in both Europe and North America.

ShopBLT, a store known for selling new hardware ahead of others, is now offering the Ryzen 5 7500F for $176.46 or $181.83, which is very close to AMD’s $179 MSRP. AMD’s Ryzen 5 7500F processor is designed to bring the benefits of the AM5 platform to budget-conscious gamers anticipating future upgrades, so it is nice to see it available at nearly MSRP.

Unfortunately, the CPU is not in stock and is ‘ordered as needed,’ so for now it is only possible to order it and receive it within 15 business days. Over time the processor will likely appear at other retailers, such as Amazon or Newegg, but for now it is only available for pre-order from ShopBLT.

The situation with AMD’s Ryzen 5 7500F availability is slightly better in Europe, where the chip can be had from multiple retailers, including MindFactory in Germany and Future-X in Austria and Germany, both at €202, as revealed by Geizhals.eu and VideoCardz.

Processor | Street (MSRP) | Cores / Threads | Base / Boost Clock (GHz) | TDP / PPT
Ryzen 5 7600X | $249 ($299) | 6 / 12 | 4.7 / 5.3 | 105W / 142W
Ryzen 5 7600 | $229 | 6 / 12 | 3.8 / 5.1 | 65W / 88W
Ryzen 5 7500F | $179 | 6 / 12 | 3.7 / 5.0 | 65W / 88W

For now, AMD’s six-core Ryzen 5 7500F is the company’s cheapest desktop Zen 4-based offering. The processor runs at a 3.70 GHz base clock and boosts up to 5.0 GHz, slightly below the clock speeds of the more expensive Ryzen 5 7600. Detailing further, the Ryzen 5 7500F comes with 6MB of L2 cache and 32MB of L3 cache, supports dual-channel DDR5 memory, and has 24 usable PCIe Gen5 lanes. Notably, the ‘F’ in its model number denotes the absence of an integrated GPU, so a discrete graphics card is required. With a 65W TDP and a locked multiplier, this processor is also not easily overclockable.

U.S. Injects $112M into Supercomputing to Enable Fusion Future

They say that good things come in threes, and the U.S. is definitely banking on the Lawrence Livermore National Laboratory (LLNL) to deliver just that when it comes to nuclear fusion. Having achieved its second successful fusion ignition with an energy surplus (meaning that more energy was produced than was required to trigger the fusion reaction itself) on July 30th, the U.S. now aims to spur research and facilitate a successful third ignition, and beyond. To do that, the country is ready to invest a further $112M into a dozen supercomputing projects.

Fusion (short for nuclear fusion) refers to fusing two light atoms into a single, heavier one: a process that, when successful, releases massive amounts of energy, mostly carried by the fast neutrons and helium nuclei it produces. Unlike fission (which works by breaking down heavy elements such as uranium or plutonium), nuclear fusion is expected to be a safe, nearly unlimited source of energy. Done right, fusing two light atoms (such as deuterium and tritium, hydrogen isotopes that carry one and two extra neutrons, respectively, compared to “plain” hydrogen) yields more than four times the energy of fission per unit of fuel, and roughly four million times the energy released from coal burning on a per-kilogram basis. Its merits are obvious.
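
Those per-kilogram figures are easy to sanity-check with textbook constants. Below is a minimal sketch; the pure D-T fuel mix and the ~29 MJ/kg coal figure are assumptions on our part, and different baselines are presumably why commonly quoted multipliers (like the ~4 million above) come out lower, but the order of magnitude tells the story:

```python
# Order-of-magnitude check of fusion vs. coal energy per kilogram.
# Assumptions: pure D-T fuel, hard coal at ~29 MJ/kg.
MEV_TO_J = 1.602e-13    # joules per MeV
AMU_TO_KG = 1.661e-27   # kilograms per atomic mass unit

# One D-T reaction: deuterium (2 u) + tritium (3 u) -> He-4 + neutron,
# releasing about 17.6 MeV.
energy_per_reaction_j = 17.6 * MEV_TO_J
fuel_mass_kg = 5 * AMU_TO_KG

fusion_j_per_kg = energy_per_reaction_j / fuel_mass_kg
coal_j_per_kg = 2.9e7

print(f"D-T fusion: ~{fusion_j_per_kg:.1e} J/kg")
print(f"Coal:       ~{coal_j_per_kg:.1e} J/kg")
print(f"Ratio:      ~{fusion_j_per_kg / coal_j_per_kg:.1e}x")
```

Under these assumptions the ratio lands around 10^7: millions of times more energy per kilogram than coal, whichever baseline you prefer.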

It’s on the back of that promise that the newly instated Scientific Discovery through Advanced Computing (SciDAC) partnership combines two pre-existing Department of Energy programs, with the aim of streamlining efforts to solve complex fusion energy problems using supercomputing resources, including exascale systems.

“The modeling and simulation work of these partnerships will offer insight into the multitude of physical processes that plasmas experience under extreme conditions and will also guide the design of fusion pilot plants,” said Jean Paul Allain, DoE Associate Director of Science for Fusion Energy Sciences (FES).

There’s still a lot of work left, however, before a sustainable, surplus-energy fusion ignition can rocket humanity into a clean, energy-conscious and abundant future. The July 30th fusion ignition did produce more energy than was delivered into the light-atom fuel capsule (although it’s unknown how much better it did than the 2.05-megajoules-in, 3.15-megajoules-out result achieved in December of last year), but that only accounts for the energy transmitted to the pellet itself. Unfortunately, the way that energy is delivered into the pellet (via 192 lasers) is still extremely inefficient: LLNL needed a staggering 322 megajoules to fire the lasers themselves, which still left the process at an overall energy deficit.
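
To make that distinction concrete, here is a minimal sketch of the two different “gains” at play, using the December 2022 shot’s public figures (since, as noted above, the July 30th numbers aren’t fully known):

```python
# The two "gains" in inertial confinement fusion, per the December 2022 shot.
laser_energy_on_target_mj = 2.05  # energy delivered into the fuel capsule
fusion_yield_mj = 3.15            # energy released by the ignition
wall_plug_energy_mj = 322         # energy drawn to fire the 192 lasers

target_gain = fusion_yield_mj / laser_energy_on_target_mj
facility_gain = fusion_yield_mj / wall_plug_energy_mj

print(f"Target gain:   {target_gain:.2f}  (>1, hence 'ignition')")
print(f"Facility gain: {facility_gain:.3f} (<1, overall energy deficit)")
```

The target gain of about 1.54 is the headline-making surplus; the facility gain of roughly 0.01 is why the process as a whole still consumes far more energy than it produces.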

But the way forward is to better understand the quantum processes surrounding fusion. Until quantum computers can provide a viable computing platform to crack that code (and there’s no telling how long that will take, though it’s likely a decade or more away), supercomputers based on classical computing are the best tool we have to look into the ordered chaos that unfolds when the lasers strike the pellet.

The $112M will certainly be a boon there, but it definitely won’t be enough. Yet we humans have this strange way of looking farther ahead, of chasing the carrot, rather than simply focusing on what is right in front of us. This grant is part of that, and a healthy injection into the High-Performance Computing (HPC) landscape, however small a slice of the total it ultimately turns out to be.
