This Device Fast Charges and Automatically Backs Up Data Simultaneously

ExtremeTech

Data has become a commodity, and it's most valuable to those who create it. Devices like your smartphone contain a huge amount of data, from sensitive information like passwords, contacts, and credit card numbers to the files created day to day: images, videos, emails, and documents. If you're not properly backing up that information, you risk losing it. Who hasn't accidentally damaged an iPhone and lost months' worth of messages, photos, and more, never to be seen again for lack of a backup? The information on our smartphones has become more valuable than what's on our computers, and backing up your data is more important than ever. Now there's a solution that's quick and easy, with no expensive ongoing subscription fees and no bulky desktop hardware to buy.

Right now, you can get the AnyBackup: Offline AutoBackup & 100W FastCharge device for the reduced price of $59.95, 14 percent off the regular $70 price. Charge your device and back it up at the same time with this convenient little gadget.

AnyBackup works entirely offline to automatically back up the data on your device as well as data from third-party platforms. It connects to your device and supports fast charging while it performs its automatic backups. The device can read, back up, transfer, and restore your documents, contacts, photos, videos, and data from popular social media channels, all while fast charging a connected device in as little as 30 minutes. It supports MicroSD cards, flash drives, and portable hard drives up to 2TB, and provides multi-layer encryption that secures your data with Face ID, fingerprints, and passwords. The Kickstarter- and Indiegogo-funded backup device is compatible with iOS, Android, macOS, and Windows, and is controlled through an easy-to-use companion app.

The AnyBackup: Offline AutoBackup & 100W FastCharge is on sale for $59.95, so you can conveniently back up your devices and worry less about losing your valuable data.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

July 23rd 2021, 4:40 pm

41 Percent of Those Planning to Buy a Car May Go Electric, Study Finds

ExtremeTech

(Photo: Ernest Ojeh/Unsplash)

A new study from Ernst & Young has revealed that 41 percent of those looking to buy their next car are interested in going electric. EY surveyed consumers in 13 major international markets (Australia, Canada, China, Germany, India, Italy, Japan, New Zealand, Singapore, South Korea, Sweden, the UK, and the US) and found that the degree of interest varies by country: only 17 percent of Australian consumers are thinking of going electric, compared with 48 percent in China and 63 percent in Italy. Overall, the shift marks a significant 11-point jump in EV interest from EY's study last year.

“This represents a breakthrough moment in consumer attitudes that could hugely accelerate demand for EVs and alternative powertrain vehicles,” the study concluded. “Previously, many consumers expressed generalized concerns over sustainability, but those concerns did not translate into action when it came to buying their next car.”

Powertrain preference of consumers planning to buy a car (Graph: Ernst & Young)

The study comes at what many hope is the tail end of the COVID-19 pandemic, during which consumers spent more time in their own neighborhoods. For many, quarantine offered a bittersweet opportunity to appreciate cleaner, less-polluted air and take stock of the ways an individual can affect the environment. According to EY's data, this pandemic-induced epiphany is behind 78 percent of potential EV buyers' willingness to ditch the carbon-burning engine and seek out a better solution.

Buying an EV is now easier than ever, but some consumers remain wary. Out of the consumers EY surveyed, 50 percent indicated cost as a potential barrier to purchasing their first EV, and 32 percent were concerned about charging infrastructure. Although many electric vehicles have a higher up-front cost than their carbon-burning predecessors, their total cost of ownership (TCO) is often lower thanks to fuel and maintenance savings. And while anything new can at first feel intimidating, implementing charging infrastructure isn’t nearly as complicated as it sounds.

In the US and UK, for example, you can generally install an at-home EV charger if you own your property or if you obtain permission from your landlord. Home EV chargers can go for as little as $199 for a lesser-known brand on Amazon or $700+ for the luxury brand ChargePoint. (Surprisingly, a Tesla Wall Connector sits in the middle at $500.) According to the US Department of Energy's Alternative Fuels Data Center, an EV with a 200-mile/66kWh battery costs only about $9 to fully charge at $0.13 per kWh. Compare that with a $3.50/gallon fuel cost and an average 25mpg carbon-burning vehicle with a 16-gallon tank, which costs $56 to fill but covers roughly twice the distance of a single charge. Mile for mile, the EV costs about a third as much to run, leaving roughly $18 in your pocket for every 200 miles driven. If you drive enough to recharge even just twice a month, a mid-level charger will pay for itself in roughly a year.
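
To make that arithmetic explicit, here's a minimal sketch in Python using the figures cited above (US-average prices are an assumption; the roughly $18 figure depends on rounding):

```python
# A rough sketch of the per-mile fuel-cost comparison above.
# Figures are the ones cited in the text; real prices vary by region.
EV_BATTERY_KWH = 66        # ~200-mile EV battery
PRICE_PER_KWH = 0.13       # USD per kWh
EV_RANGE_MILES = 200

GAS_MPG = 25               # average carbon-burning vehicle
PRICE_PER_GALLON = 3.50    # USD

ev_full_charge = EV_BATTERY_KWH * PRICE_PER_KWH     # $8.58, the "~$9" above
ev_per_mile = ev_full_charge / EV_RANGE_MILES       # ~$0.043/mile
gas_per_mile = PRICE_PER_GALLON / GAS_MPG           # $0.14/mile

# Savings over one 200-mile charge cycle:
savings = (gas_per_mile - ev_per_mile) * EV_RANGE_MILES
print(f"Charge cost: ${ev_full_charge:.2f}; saved per 200 miles: ${savings:.2f}")
# Charge cost: $8.58; saved per 200 miles: $19.42
```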

Of course, this only works if you drive a smaller vehicle. Delivery professions and trades that require hauling of larger goods or materials have far fewer options, as electric vans and pickups are only just entering the mainstream and carry hefty price tags, and long-haul electric trucks seem further away than ever. And if you live in a place like San Francisco that requires you to park out on the street, owning an electric vehicle may be more work than it’s worth. As such, EVs aren’t a blanket solution to transportation-related pollution, but rather a supplement to other environmental efforts. 

EY (and likely many consumers) is hoping this marks a transformation in the way consumers approach personal transportation and the environment. Of those surveyed, 66 percent indicated they'd be making their purchase within the next year, which means both the auto industry and the planet could begin seeing changes very soon.

July 23rd 2021, 2:53 pm

Why You Should (Or Shouldn’t) Root Your Android Device

ExtremeTech

Modern smartphones are marvels of technology. With more processing power than the desktop PCs of yesteryear, you can find any piece of information in the world, watch the latest episode of Ted Lasso, and snap photos that are worthy of framing. But that's just the start — there's a lot more power under the hood of Android if you're willing to root your phone. In the first few years of Android's existence, this was a fairly simple procedure on most devices. There were even apps and tools that could root almost any Android phone or tablet with a tap, and you'd be ready to truly master your device in mere minutes. As Android has become more capable, the allure of rooting has diminished somewhat — it's also much harder than it used to be and comes with more drawbacks.

The Advantages of Rooting

Gaining root access on Android is akin to running Windows as an administrator. You have full access to the system directory and can make changes to the way the OS operates. As part of rooting, you install a management client like Magisk — SuperSU used to be the top option but has fallen into disrepair. These tools are basically the gatekeepers of root access on your phone: when an app requests root, you have to approve it using the root manager.

In the case of Magisk, you can also use the client to make other changes to the phone via numerous community-developed modules. Let's say you don't like the system theme on your phone. With root, you can change that. You can also manually back up app data so you never lose it again. Want to change your device's CPU characteristics, like its clock speeds or governor? That's also possible with root.

If you’ve ever looked at your phone and thought, “I wish I could do [some very specific thing],” rooting might make it happen. Modern tools like Magisk are also “systemless” root managers. That means the changes are stored in the boot partition rather than modifying the system. That makes it easier to go back to an unrooted system (or make apps think you’re unrooted) than it used to be.

The Risks of Rooting

Rooting your phone or tablet gives you complete control over the system, but honestly, the advantages are much smaller than they used to be. Google has expanded Android's feature set over the years to cover many of the things we once needed root to do. With that in mind, there are real risks to rooting, and you should only do it if you know what you're getting into. Android is designed in such a way that it's hard to break things with a limited user profile. A superuser, however, can really trash the system by installing the wrong app or changing system files. Android's security model is also compromised when you have root: some malware specifically looks for root access, which allows it to really run amok.

For this reason, most Android phones are not designed to be rooted. There’s even an API called SafetyNet that apps can call on to make sure a device has not been tampered with or compromised by hackers. Banking apps, Google Pay, and others that handle sensitive data will do this check and refuse to run on rooted devices. Magisk supports hiding root, but that won’t always work. It’s a constant game of cat and mouse with Google. If losing access to high-security apps is a big deal, you might not want to mess around with rooting.

Root methods are sometimes messy and dangerous in their own right. You might brick your device simply trying to root it, and you've probably (technically) voided your warranty by doing so. Rooting also makes it harder (or impossible) to install official updates, and ROMs like LineageOS can be difficult to install and buggy once you get them running. If having root access is really important to you, you might be left waiting on flawed software while you beg for a new root method or a modded OS update.

Should You Do It?

If you’ve been using Android for a while, you’ve probably noticed gaining root access on most devices is much harder than it once was. There were exploits years back that could root almost any Android device in a few minutes, but that’s much less common now. The last essentially universal exploit was Towelroot in mid-2014, but Google patched that rather quickly. Google patches these flaws often before we even know they exist because having active exploits in the system is a very bad thing for most users. These are security holes that can be utilized by malware to take over a device and steal data. There are monthly security updates to patch these holes, but on a rooted phone, you are responsible for security. If you’re going to root, you have to accept that your device will require more frequent attention, and you need to be careful what you install. The security safety net offered by Google and the device maker won’t be there to save you.

If you’re not familiar with Android’s tools and how to fix issues with a command line, you probably shouldn’t dive into rooting your phone. Root can be a lot of fun to play around with, but it can also lead to plenty of frustration as you try to fix errors caused by overzealous modding. If you bought your phone with the intention of tinkering, by all means, go nuts.

When something does go wrong (and it will at some point), it's all on you to fix it. You might be left scouring old forum posts and begging for help in chatrooms to fix your phone. You have to be willing to tackle some vexing issues if you're going to live the rooted lifestyle. You also have to look at what you're getting: Android in its unmodified state is much better than it used to be. A decade ago, people rooted phones to get features like forcing low-power sleep for apps, managing permissions, and taking screenshots. Unmodified Android can do all of that now. Most people just don't have a good reason to root their phones anymore.

July 23rd 2021, 2:06 pm

Why You Can’t Future-Proof Your Gaming PC

ExtremeTech

Talk to anyone about building a new PC, and the question of longevity is going to pop up sooner rather than later. Any time someone is dropping serious cash on a hardware upgrade, they're going to have questions about how long it will last them, especially if they've been burned before. But how much additional value can you actually squeeze out of the market by trying to future-proof a build, and does doing so actually benefit the end user?

Before I dive in on this, let me establish a few ground rules. I’m drawing a line between buying a little more hardware than you need today because you know you’ll have a use for it in the future and attempting to choose components for specific capabilities that you hope will become useful in the future. Let me give an example:

If you buy a GPU suitable for 4K gaming because you intend to upgrade your 1080p monitor to 4K within the next three months, that’s not future-proofing. If you bought a Pascal GPU over a Maxwell card in 2016 (or an AMD card over an NV GPU) specifically because you expected DirectX 12 to be the Next Big Thing and were attempting to position yourself as ideally as possible, that’s future-proofing. In the first case, you made a decision based on the already-known performance of the GPU at various resolutions and your own self-determined buying plans. In the second, you bet that an API with largely unknown performance characteristics would deliver a decisive advantage without having much evidence as to whether or not this would be true.

Note: While this article makes frequent reference to Nvidia GPUs, this is not to imply Nvidia is responsible for the failure of future-proofing as a strategy. GPUs have advanced more rapidly than CPUs over the past decade, with a much higher number of introduced features for improving graphics fidelity or game performance. Nvidia has been responsible for more of these introductions, in absolute terms, than AMD has.

Let’s whack some sacred cows:

DirectX 12

In the beginning, there were hopes that Maxwell would eventually perform well with DX12, or that Pascal would prove to use it effectively, or that games would adopt it overwhelmingly and quickly. None of these has come to pass. Pascal runs fine with DX12, but gains in the API are few and far between. AMD still sometimes picks up more than NV does, but DX12 hasn't won wide enough adoption to change the overall landscape. If you bought into AMD hardware in 2013 because you thought the one-two punch of Mantle and console wins was going to open up an unbeatable Team Red advantage (and this line of argument was commonly expressed), it didn't happen. If you bought Pascal because you thought it would be the architecture to show off DX12 (as opposed to Maxwell), that didn't happen either.

Now, to be fair, Nvidia's marketing didn't push DX12 as a reason to buy the card. In fact, Nvidia ignored inquiries about its support for async compute to the maximum extent allowable by law. But that doesn't change the fact that DX12's lackluster adoption to date and limited performance-uplift scenarios (low-latency APIs improve weak CPU performance more than GPU performance, in many cases) weren't a great reason to upgrade back in 2016.

DirectX 11

Remember when tessellation was the Next Big Thing that would transform gaming? Instead, it alternated between subtly improving game visuals (with a mild performance hit) and serving as a way to make AMD GPUs look really bad by stuffing unnecessary tessellated detail into flat surfaces. If you bought an Nvidia GPU because you thought its enormous synthetic tessellation performance was going to yield actual performance improvements in shipping titles that hadn't been skewed by insane triangle counts, you didn't get what you paid for.

DirectX 10

Anybody remember how awesome DX10 performance was?

Anybody?

If you snapped up a GeForce 8xxx GPU because you thought it was going to deliver great DX10 performance, you ended up disappointed. The only reason we can't say the same of AMD is that everyone who bought an HD 2000-series GPU ended up disappointed. The first generation of DX10-capable GPUs often proved incapable of using the API in practice; consumers who'd tried to future-proof by buying into a generation of very fast DX9 cards that promised future compatibility instead found themselves with hardware that would never deliver acceptable frame rates in what had been a headline feature.

This is where the meme “Can it play Crysis?” came from. Image by CrysisWiki.

This list doesn’t just apply to APIs, though APIs are an easy example. If you bought into first-generation VR because you expected your hardware would carry you into a new era of amazing gaming, well, that hasn’t happened yet. By the time it does, if it does, you’ll have upgraded your VR sets and the graphics cards that power them at least once. If you grabbed a new Nvidia GPU because you thought PhysX was going to be the wave of the future for gaming experiences, sure, you got some use out of the feature — just not nearly the experience the hype train promised, way back when. I liked PhysX — still do — but it wound up being a mild improvement, not a major must-have.

This issue is not confined to GPUs. If you purchased an AMD APU because you thought HSA (Heterogeneous System Architecture) was going to introduce a new paradigm of CPU-GPU problem solving and combined processing, five years later, you're still waiting. Capabilities like Intel's TSX (Transactional Synchronization Extensions) were billed as eventually offering performance improvements in commercial software, though this was expected to take time to evolve. Five years later, however, it's as if the feature vanished into thin air. I can find just one recent mention of TSX being used in a consumer product: it turns out TSX is incredibly useful for boosting the performance of the PS3 emulator RPCS3. Great! But not a reason to buy a CPU for most people. Intel also added support for Rasterizer Ordered Views years ago, but if a game ever took advantage of them, I'm not aware of it (game optimizations for Intel GPUs aren't exactly a huge topic of discussion, generally speaking).

You might think this is an artifact of the general slowdown in new architectural improvements, but if anything, the opposite is true. Back in the days when Nvidia was launching a new GPU architecture every 12 months, the chances of squeezing support for a just-demonstrated feature into a brand-new GPU were even worse. GPU performance often nearly doubled every year, which made buying a GPU in 2003 for a game that wouldn't ship until 2004 a really stupid move. In fact, Nvidia ran into exactly this problem with Half-Life 2. When Gabe Newell stood on stage and demonstrated HL2 back in 2003, the GeForce FX crumpled like a beer can.

I’d wager this graph sold more ATI GPUs than most ad campaigns. The FX 5900 Ultra was NV’s top GPU. The Radeon 9600 was a midrange card.

Newell lied, told everyone the game would ship in the next few months, and people rushed out to buy ATI cards. Turns out the game didn’t actually ship for a year and by the time it did, Nvidia’s GeForce 6xxx family offered far more competitive performance. An entire new generation of ATI cards had also shipped, with support for PCI Express. In this case, everyone who tried to future-proof got screwed.

There’s one arguable exception to this trend that I’ll address directly: DirectX 12 and asynchronous compute. If you bought an AMD Hawaii GPU in 2012 – 2013, the advent of async compute and DX12 did deliver some performance uplift to these solutions. In this case, you could argue that the relative value of the older GPUs increased as a result.

But as refutations go, this is a weak one. First, the gains were limited to only those titles that implemented both DX12 and async compute. Second, they weren't uniformly distributed across AMD's entire GPU stack, and higher-end cards tended to pick up more performance than lower-end models. Third, part of the reason this happened is that AMD's DX11 driver wasn't multi-threaded. And fourth, the modest uptick in performance that some 28nm AMD GPUs enjoyed was neither enough to move the needle on those GPUs' collective performance across the game industry nor sufficient to argue for their continued deployment relative to newer cards built on 14/16nm. (The question of how quickly a component ages, relative to the market, is related to but distinct from whether you can future-proof a system in general.)

Now, is it a great thing that AMD’s 28nm GPU customers got some love from DirectX 12 and Vulkan? Absolutely. But we can acknowledge some welcome improvements in specific titles while simultaneously recognizing the fact that only a relative handful of games have shipped with DirectX 12 or Vulkan support in the past three years. These APIs could still become the dominant method of playing games, but it won’t happen within the high-end lifespan of a 2016 GPU.

Optimizing Purchases

If you want to maximize your extracted value per dollar, don't focus on trying to predict how performance will evolve over the next 24-48 months. Instead, focus on available performance today, in shipping software. When it comes to features and capabilities, prioritize what you're using today over what you hope to use tomorrow. Software roadmaps get delayed. Features are pushed out. Because we never know how much impact a feature will have or how much it'll actually improve performance, base your buying decision solely on what you can test and evaluate at the moment. If you aren't happy with the amount of performance you'll get from an upgrade today, don't buy the product until you are.

Second, understand how companies price and which features are the expensive ones. This obviously varies from company to company and market to market, but there's no substitute for doing the homework. In the low-end and midrange GPU space, both AMD and Nvidia tend to increase pricing linearly alongside performance. A GPU that offers 10 percent more performance is typically 10 percent more expensive. At the high end, this changes, and a 10 percent performance improvement might cost 20 percent more money. As new generations appear and the previous generation's premium performance becomes the current generation's midrange, the cost of that performance drops. The GTX 1060 and GTX 980 are an excellent example of how a midrange GPU can hit the performance target of the previous high-end card for significantly less money, less than two years later.

Third, watch product cycles and time your purchasing accordingly. Sometimes, the newly inexpensive last generation product is the best deal in town. Sometimes, it’s worth stepping up to the newer hardware at the same or slightly higher price. Even the two-step upgrade process I explicitly declared wasn’t future-proofing can run into trouble if you don’t pay close attention to market trends. Anybody who paid $1,700 for a Core i7-6950X in February 2017 probably wasn’t thrilled when the Core i9-7900X dropped with higher performance and the same 10 cores a few months later for just $999, to say nothing of the hole Threadripper blew in Intel’s HEDT product family by offering 16 cores instead of 10 at the same price.

Finally, remember this fact: It is the literal job of a company’s marketing department to convince you that new features are both overwhelmingly awesome and incredibly important for you to own right now. In real life, these things are messier and they tend to take longer. Given the relatively slow pace of hardware replacement these days, it’s not unusual for it to take 3-5 years before new capabilities are widespread enough for developers to treat them as the default option. You can avoid that disappointment by buying the performance and features you need and can get today, not what you want and hope for tomorrow.

Update (7/21/2021):

It’s now been over two years since we first wrote this piece, although we’ve updated it in the interim, and since it’s an article about future-proofing, it makes thematic sense to return to our own conclusions and survey whether things have changed.

They have not. That's before we take into consideration the ruinous impact of high prices on the retail PC industry. Right now, chances are good that you'll pay far more for any GPU than you would have a year ago, and future-proofing under these conditions is impossible. The best way to future-proof yourself at the moment is to buy as little hardware as you can get away with, at least as far as graphics are concerned, and wait for prices to come down.

Nvidia’s RTX 3000 series, which launched in the fall of 2020, dramatically improved ray tracing performance and performance per dollar compared with Nvidia’s previous generation of Turing cards. When Nvidia launched its Turing architecture, it argued that buying into the GPU family now would unlock a gorgeous future of ray-traced games. In reality, only a modest handful of games shipped with ray tracing support during Turing’s life, and Ampere’s strongest gains over Turing are often in ray-traced games.

There is nothing wrong with buying an expensive GPU because you want the best card, or with buying in at the top of the market because you want the best performance possible and are willing to sacrifice performance per dollar to reach a given performance target. But anyone who bought an RTX 2080 Ti to make certain they'd be able to play ray-traced titles as long as possible would have done better to hold on to a Pascal-era GPU in 2018 and then buy Ampere in 2021 or 2022, assuming availability improves by then.

July 23rd 2021, 10:23 am

EA Announces Dead Space Remake

ExtremeTech

EA announced it’s remaking the original Dead Space. I’ve actually been playing Dead Space 2 (again) in the last few weeks, so this unveil feels quite timely. The game is being developed by Motive Games, though information on the project is currently limited.

EA’s Dead Space trilogy — particularly the first two — are some of the finest single-player games ever made. They blend horror and science fiction in an alternate universe where mankind developed the ability to crack open planets and harvest their resources to feed its own ever-growing need for raw materials. You play as Isaac Clarke, an engineer sent to investigate the fate of the oldest planet-cracker in the fleet, the USG Ishimura. Isaac is named for two of science fiction’s great authors, and the fact that he’s an engineer instead of a soldier is woven through the game. Many of Isaac’s weapons are converted mining equipment and you spend a fair bit of time trying to keep the ship from falling apart.

All is not well aboard the Ishimura, and the first game sees you work to uncover what happened to the ship and its crew. Isaac travels the length of the ship in his efforts to repair the stricken vessel. We learn that the planet-cracker is one of several ships and that humanity now suffers from critical resource shortages that planet-cracking helps to alleviate, despite its tremendous cost.

Here’s how EA describes it: “The sci-fi survival horror classic Dead Space returns, completely rebuilt from the ground up by Motive Studios to offer a deeper and more immersive experience. Harnessing the power of the Frostbite game engine and next generation consoles, this remake brings jaw-dropping visual fidelity and improvements to gameplay while staying true to the original.”

There are a lot of places where one could improve on the original. The original Dead Space runs at a locked 30fps and feels like it. The second game runs at 30fps but feels markedly smoother. The third game’s controls and camera feel downright fluid compared with the first, and I hope any effort to remaster the game doesn’t include faithfully replicating its control scheme, especially not on PC.

Hopefully, at some point, we’ll see a true Dead Space sequel. But a remake of the first game is a pretty great alternative. If it sells well, the second and third games will likely follow. Dead Space 2 takes everything great about the first game and polishes it to a mirror shine, so as exciting as it is to see Dead Space announced, the true gem will be its sequel — at least, assuming the remastering process goes well.

One question: Will Isaac be the silent protagonist (as he was in the original Dead Space), or will Motive Studios add dialog? I'm guessing the former, but I wouldn't mind seeing companies take a few chances on changing old games when they head to the remastering well. While it would change the flow of the game, giving the player the option to switch between voiced and unvoiced modes (and accepting somewhat altered dialog, since Isaac's ability to respond would obviously change conversations) would be an interesting way to tune the experience of the original game while remaining true to the series.

July 23rd 2021, 9:53 am

Astronomers Detect Moon-Forming Disc Around Another Planet

ExtremeTech

The sky is full of stars, and for the first time in human history, we know what's orbiting some of them. There are more than 4,000 confirmed exoplanets, but just a few have been directly imaged. A young gas giant called PDS 70c is one of them, and now astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) have spotted something new: a moon-forming disc around the exoplanet.

Most of the known exoplanets were discovered via the transit method, which is the basis for NASA's Transiting Exoplanet Survey Satellite (TESS) and the dearly departed Kepler Space Telescope. This way of spotting exoplanets watches for dips in luminance as planets transit in front of their star. The PDS 70 system is not positioned in such a way that we can use the transit method, but that turns out to be lucky: its orientation allowed the European Southern Observatory's Very Large Telescope (VLT) to look down on the planetary disk and pick out PDS 70c and 70b in a high level of detail.

Both of the exoplanets in this system are very young. The star itself is only a few million years old. The opportunity to watch these gas giants form is extremely valuable to science, which is why ALMA was scanning the system in the first place. There were hints of a moon-forming region around PDS 70c, but we didn't have confirmation until now.

As you can see in the image above, the bright point inside the protoplanetary disc (zoomed on the right) is PDS 70c. That haze around it is material that has become caught in its gravity, of which it has plenty. PDS 70c is still forming and it’s already about four times as massive as Jupiter. 

We don’t know how much of this disc will eventually fall into the planet, but the ALMA data shows there’s a great deal of matter floating around. The disc has a diameter of about 1 AU (the distance between Earth and the sun), and there’s enough mass to form three satellites the size of Earth’s moon. Our understanding of how planets and moons form is still rudimentary, but further observations of the PDS 70 system could answer a lot of questions. 

PDS 70 is only 370 light-years away, which is right next door in galactic terms. This makes it an ideal observational target for upcoming instruments like the ESO’s Extremely Large Telescope (ELT), which is being built in Chile near ALMA. NASA also hopes to have the James Webb Space Telescope launched by the end of the year. Both these instruments will have the ability to look deeper into the infrared to probe the moon-forming ring of dust.

July 23rd 2021, 8:53 am

Laptop Integrated Graphics Are Still Marginal for Modern Gaming

ExtremeTech

Red Dead Redemption 2 benchmark on the Ryzen 7-based Asus ZenBook 13 (Photo: Molly Flores)

Laptops have fewer upgrade opportunities than desktops, so whether your laptop's built-in GPU can handle gaming makes a big difference in how useful the system is.

Our colleagues at PCMag have tested multiple laptops representing several generations of Intel and AMD iGPUs (the Skylake-era Intel UHD Graphics 620, Intel Xe, and two different AMD APUs, the Ryzen 7 5700U and 5800U) in a number of games spanning AAA titles, esports, and simulation and strategy games.

What they found, in aggregate, is that Tiger Lake and Intel Xe-equipped systems are generally the fastest for gaming, with AMD’s Ryzen 7 5800U and 5700U not far behind. The two solutions trade blows in several games, but the TGL-equipped Xe is faster than the integrated Radeon solutions more often than not. There’s some interesting data in their findings, including the impact of your system chassis and thermal dissipation on system performance.

In this case, the HP Envy x360 15, with the Ryzen 7 5700U, is often faster than the Asus ZenBook 13, which is equipped with a Ryzen 7 5800U. In Borderlands 3, the 5700U-equipped Envy is 1.21x faster than the ZenBook 13. It's 1.17x faster in Strange Brigade at 1080p Medium detail. There are multiple tests where the two chips tie, but the 5800U only wins a single test: Civilization VI.

Buyers who are considering 13-inch laptops versus 15-inch laptops should be aware that a smaller chassis can effectively cancel out the point of buying a high-end chip. It is not unusual for the midrange version of a laptop to offer better sustained performance than the high-end model.

1080p Isn’t a Realistic Resolution for AAA Gaming

None of the laptops with integrated graphics currently on the market can handle 1080p AAA titles very well. Games such as Far Cry 5, Borderlands 3, Strange Brigade, Assassin's Creed: Valhalla, and Red Dead Redemption 2 range from completely unplayable on all tested systems to squeaking out a playable frame rate at 1080p Low. Strange Brigade was the single exception: both Ryzen systems and the Xe-powered Tiger Lake turned in average frame rates of 39fps or higher.

AMD has talked up 1080p as a target resolution for integrated gaming, but that's not happening with AAA titles at the moment. We think of 720p and 1080p as similar standards because they debuted at the same time, but 1920 by 1080 is 2.073 megapixels, while 1280 by 720 is just 0.922 megapixels. At 1080p, the GPU has to process 2.25x as many pixels per frame to maintain the same frame rate.
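
The pixel math is easy to check (a quick sketch):

```python
# Pixels per frame at each resolution; 1080p is 2.25x the work of 720p.
pixels_1080p = 1920 * 1080  # 2,073,600 (~2.07MP)
pixels_720p = 1280 * 720    # 921,600 (~0.92MP)
print(f"{pixels_1080p / pixels_720p:.2f}x")  # 2.25x
```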

PCMag doesn’t make this recommendation in its own article, but I’m going to: If you are questioning whether you can game on any laptop without a discrete GPU, don’t be afraid to target 720p instead of 1080p as your baseline resolution. Gaming on an integrated GPU means sacrificing detail for performance, and dropping from 1080p to 720p is a meaningful step downwards for your GPU. There’s a reason both the Switch and the Steam Deck target this resolution. The old Intel Skylake solutions may still not offer enough horsepower to run games, but Xe and Radeon would both benefit.

This problem will improve somewhat once DDR5 arrives. If AMD and Intel support DDR5-6400 when they launch iGPUs for the standard, they'll be able to deliver 102.4GB/s of (shared) memory bandwidth. That's not much by GPU standards; the Radeon HD 4870 was the last top-end GPU to offer this kind of bandwidth, and even it was a bit faster, at 115GB/s of GDDR5 bandwidth. Modern GPUs are vastly more efficient than the old Radeon HD 4000 family, however, which is part of how they've kept improving despite relatively low bandwidth.
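
That 102.4GB/s figure falls straight out of the spec; here's a minimal sketch assuming a standard dual-channel laptop configuration with 64 bits per channel:

```python
# Peak bandwidth for dual-channel DDR5-6400 (64-bit channels assumed).
transfers_per_second = 6400e6  # DDR5-6400 = 6,400 mega-transfers/sec
bytes_per_transfer = 8         # 64-bit channel = 8 bytes per transfer
channels = 2                   # typical laptop dual-channel setup

bandwidth_gbs = transfers_per_second * bytes_per_transfer * channels / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 102.4 GB/s
```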

The situation outside of AAA gaming looks better. The Skylake-era Intel integrated systems are still outclassed; anyone attempting to game on one might need to target a resolution like 960x540. But the Xe and Ryzen APUs put on a good show. The EliteBook PCMag tested, based on the Intel Core i7-10810U, managed playable frame rates in Civilization VI at low detail, but nowhere else. World of Warcraft Classic would probably run, though it's not included in the results.

The big takeaway from these results is that 1080p is a marginal resolution for integrated laptop gaming. If you play older games and non-demanding games, 1080p is realistic. If you want to play anything modern or more than mildly demanding, it isn't. This will improve somewhat when AMD deploys RDNA2 on-silicon (especially if it comes with a new integrated L3 cache), or if either Intel or AMD starts integrating memory like HBM2 on-package. For now, anyone who wants to do any integrated gaming should pick up something in the Ryzen 4000, 5000, or Intel Tiger Lake family. Older chips are at a disadvantage compared with these.

July 22nd 2021, 6:46 pm

How USB Charging Works, or How to Avoid Blowing Up Your Phone

ExtremeTech

The tech world has finally coalesced around a charging standard after years of proprietary adapters and ugly wall-wart power supplies, and USB-C is in the process of replacing them. But, in a brilliant example of getting exactly what we asked for and not at all what we wanted, USB-C has solved the wall-wart problem by transplanting it. Instead of running around with a collection of manufacturer-specific wall plugs, customers with multiple USB-C devices have to run around with manufacturer-specific or verified-compatible third-party cables.

Ten to 15 years ago, you always had to make sure you had the correct power supply for each of your gadgets. Now you have to make sure you have the right cable for your gadget. Part of the reason for this problem is another feature that was buffed in USB-C to make it more effective and flexible: USB Power Delivery.

Not all USB chargers, connectors, and cables are born equal. You’ve probably noticed that some wall chargers are stronger than others. Sometimes, one USB socket on a laptop is seemingly more powerful than the other. On some desktop PCs, even when they’re turned off, you can charge your smartphone via a USB socket. It turns out there’s a method to all this madness — but first, we have to explain how USB power actually works.

New Specifications

There are now six USB specifications: USB 1.0, 2.0, 3.0, 3.1, 3.2, and USB4. USB 1.0 is old enough that you'll virtually never see it, so USB 2.0 through USB4 covers the spectrum for most people. Separate from the specs is USB-C, which is a physical connector standard that devices can use. In a USB network, there is one host and one device. In almost every case, your PC is the host, and your smartphone, tablet, or camera is the device. Power always flows from the host to the device, although data can flow in both directions, such as when you copy files back and forth between your computer and your phone. The descriptions below apply to all versions of USB currently in use, including USB4, 3.0, 3.1, 3.2, and their various sub-standards (1×2, 2×2, etc.).

What’s New About USB4?

USB4 requires USB-C, while previous standards merely made it an option. Support for USB Power Delivery is now mandatory, and the standard allows for DisplayPort and PCI Express tunneling. Nominal data rates on USB4 are 20Gbps and 40Gbps. Up to DisplayPort 2.0 is supported via alt mode, while DP 1.4 is supported via tunneling.

As for how USB4 performance compares against other USB standards, we’re genuinely sorry, but that’s a bit confusing at the moment.

I really don’t have a defense for this, so let’s accept that bad things happen to good people and move on.

Unlike previous generations of USB, USB4 is based directly on the Thunderbolt 3 specification. Because Intel contributed Thunderbolt 3 to the USB Implementers Forum royalty-free, this doesn't present a problem for AMD or other companies that adopt USB4. To the end-user, the switch to Thunderbolt underpinnings should be seamless and unnoticeable.

USB4 doesn’t always add maximum top-end bandwidth, but it should make it easier to hit higher performance figures. It moves us towards USB-C as a universal standard and it offers some additional protocol support and tunneling features that weren’t available earlier. It’s not going to be revolutionary, but USB4 Gen 3×2 devices will be significantly faster than what has shipped before.

USB 3.2 and Earlier

A regular USB 1.0 or 2.0 socket has four pins, and a USB cable has four wires. The inside pins carry data (D+ and D-), and the outside pins provide a 5-volt power supply. USB 3.0 ports add an additional row of five pins, so USB 3.0-compatible cables have nine wires. In terms of actual current (milliamps or mA), there are three kinds of USB ports dictated by the current specs: a standard downstream port, a charging downstream port, and a dedicated charging port. The first two can be found on your computer (and should be labeled as such), and the third kind applies to “dumb” wall chargers.

In the USB 1.0 and 2.0 specs, a standard downstream port is capable of delivering up to 500mA (0.5A); with USB 3.0, that moves up to 900mA (0.9A). Charging downstream and dedicated charging ports provide up to 1,500mA (1.5A). USB 3.1 bumps throughput to 10Gbps in what's called SuperSpeed+ mode, making it roughly equivalent to first-generation Thunderbolt. It also supports power draws of 1.5A and 3A over the 5V bus. USB 3.2 does not change these aspects of the standard.
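
As a rough illustration of why the port type matters, this sketch estimates idealized charge times from those current limits; the 3,000mAh battery is a hypothetical, and real charging is slower due to conversion losses and tapering near full:

```python
# Idealized charge-time estimates from USB current limits (5V bus).
# Hypothetical 3,000mAh battery; real charging is slower and tapers off.
PORT_LIMITS_MA = {
    "USB 2.0 standard downstream": 500,
    "USB 3.0 standard downstream": 900,
    "Charging / dedicated charging port": 1500,
    "USB 3.1 high-current (5V/3A)": 3000,
}
BATTERY_MAH = 3000

for port, ma in PORT_LIMITS_MA.items():
    print(f"{port}: ~{BATTERY_MAH / ma:.1f} hours")
# 6.0, 3.3, 2.0, and 1.0 hours respectively
```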

USB-C is a different connector entirely. It's reversible: you can plug it in either way and it will work, unlike older USB connectors and like Apple's Lightning connector. USB-C is also capable of twice the theoretical throughput of USB 3.0 and can output more power. Apple paired USB-C with USB 3.1 back in 2015 with its 12-inch MacBook and new MacBook Pros, and phones soon followed. But there can also be older-style USB ports that support the 3.1 standard.

The USB spec also allows for a “sleep-and-charge” port, which is where the USB ports on a powered-down computer remain active. You may have noticed this on your desktop PC, where there’s always some power flowing through the motherboard, but some laptops are also capable of sleep-and-charge.

Now, this is what the spec dictates. But there are plenty of USB chargers that don’t conform to these specs — mostly of the wall-wart variety. Apple’s iPad charger, for example, provides 2.1A at 5V; Amazon’s Kindle Fire charger outputs 1.8A; and many car chargers can output anything from 1A to 2.1A.

Can I Blow Up My USB Device?

There is a huge variance, then, between normal USB ports rated at 500mA and dedicated charging ports that range all the way up to 3,000mA. This leads to an important question: If you take a phone that came with a 900mA wall charger and plug it into a 2,100mA iPad charger, will it blow up?

In short, no: You can plug any USB device into any USB cable and into any USB port, and nothing will explode — and in fact, using a more powerful charger should speed up battery charging.

The longer answer is that the age of your device plays an important role, dictating both how fast it can be charged, and whether it can be charged using a wall charger at all. Way back in 2007, the USB Implementers Forum released the Battery Charging Specification, which standardized faster ways of charging USB devices, either by pumping more amps through your PC’s USB ports or by using a wall charger. Shortly thereafter, USB devices that implemented this spec started to arrive.

If you have a modern USB device, you should be able to plug into a high-amperage USB port and enjoy faster charging. If you have an older product, however, it probably won’t work with USB ports that employ the Battery Charging Specification. It might only work with old-school, original (500mA) USB 1.0 and 2.0 PC ports. In some (much older) cases, USB devices can only be charged by computers with specific drivers installed, but this is now going back nearly two decades.

What About USB-C?

USB-C is a special case. While you won’t blow up your device from plugging in the wrong charger, you can blow up your phone, Nintendo Switch, or other device by using the wrong USB-C cable. How do you know what the right USB-C cable is? Sometimes — and this is the ugly truth — you can’t. In the past, there’ve been spreadsheets dedicated to recording good versus bad cables, but the projects seem to have fallen by the wayside and are now outdated. If you are buying a replacement USB-C cable for your manufacturer-provided cable, we recommend buying from the OEM or an authorized, third-party manufacturer. We covered the initial issues with USB-C in more detail in this article.

What if you already have a third-party cable or don't have any option but to use one? Even if it works, slow charging could be a problem. That's the preferable failure mode, in the sense that it won't actually damage any of your hardware. In many cases, your device will only charge at or near minimum speed, as opposed to the fast-charging rates now supported by many devices. This chart from Android Authority shows how charge speed varies depending on which device you use.

Keep in mind, however, that the consequences can be worse than just slow charging. There are cases of people blowing up devices with the wrong type of cable, and other instances where one device may flatly refuse to work with another device’s USB-C cable for unknown reasons. The assumption, in this case, is that the device is refusing to connect through a given cable to protect itself.

Quick Charging and Other Notes

Many new phones offer some kind of quick-charge capability, often under varying names; the most common is Qualcomm's Quick Charge, which works with Snapdragon-powered phones. The standard has evolved quickly and is moving to version 5 at the time of this writing. For quick charging to work, you'll usually need to use either the power adapter that came with the phone or an appropriately labeled third-party adapter. Otherwise, it will still take several hours to juice up. Quick Charge seems to be fading from the market with the advent of USB Power Delivery (USB PD), but both capabilities are still supported.

There are a few other things to be aware of. While PCs can have two kinds of USB ports — standard downstream or charging downstream — OEMs haven’t always labeled them as such. As a result, you might have a device that charges from one port on your laptop, but not from the other. This is a trait of older computers, as there doesn’t seem to be a reason why standard downstream ports would be used when high-amperage charging ports are available. Most vendors now put a small lightning icon above the proper charging port on laptops, and in some cases, those ports can even stay on when the lid is closed. You may have to check your motherboard or system documentation to determine which USB ports provide the full capabilities you want. A system might only support charging, video playback, or Ethernet over a single USB-C port, even if it has two to four USB-C ports in total.

Now Read:

Check out our ExtremeTech Explains series for more in-depth coverage of today’s hottest tech topics.

Sebastian Anthony wrote the original version of this article. It has since been updated several times with new information.

July 22nd 2021, 6:46 pm

ET Deals: $150 Off Apple MacBook Air M1 Chip, Nearly $600 Off Dell Vostro 15 7500 Core i7 Laptop

ExtremeTech

Apple’s MacBook Air laptops equipped with the company’s revolutionary M1 chip offer excellent performance for a wide range of tasks, and you can get one now with an $150 discount.

Apple MacBook Air M1 Chip 13.3-Inch Laptop w/ 8GB RAM and 512GB SSD ($1,099.99)

Apple’s latest MacBook Air comes equipped with Apple’s new M1 SoC, which contains an 8-core processor that’s reportedly 3.5 times faster than the hardware inside of the preceding model. Apple said the system can also last for up to 18 hours on a single charge, giving you all the power you need to work from sunrise to sunset. Now for a limited time you can get one of these systems from Amazon marked down from $1,249.00 to just $1,099.99.

Dell Vostro 15 7500 Intel Core i7-10750H 15.6-Inch 1080p Laptop w/ Nvidia GeForce GTX 1650 Ti GPU, 8GB DDR4 RAM and 256GB NVMe SSD ($1,349.00)

The new Vostro 15 7500 laptop is a true jack-of-all-trades. Dell’s Vostro systems are oriented as business solutions, and this system is no different, but it also has fairly strong gaming capabilities. Its 100 percent sRGB display is also well suited for editing images. No matter what you need a laptop for, this system should fit the bill. Currently, you can get this system from Dell with a hefty discount that drops the price from $1,927.15 to just $1,349.00.

Western Digital My Passport 1TB External SSD ($145.26)

This external drive utilizes SSD technology to enable relatively fast data transfer speeds of up to 1,050MB/s over a USB 3.2 Gen 2 connection. It can also hold up to 1TB of data and it has built-in security features including the ability to set a password on the drive and 256-bit AES hardware encryption. The drive also has a durable metal enclosure as well as a small physical footprint and can be picked up from Amazon marked down from $199.99 to just $145.26.

Netgear Nighthawk R6700 802.11ac AC1750 Smart WiFi Router ($79.99)

The Nighthawk R6700 is one of the most popular Wi-Fi routers on the market. It offers reliable performance with speeds of up to 1,750Mbps across two bands. It also has built-in USB ports for adding network resources. Right now it's marked down at Newegg from $99.99 to $79.99.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

July 22nd 2021, 6:46 pm

How to Boost Your Wi-Fi Speed by Choosing the Right Channel

ExtremeTech

Wireless networks have come a long way in the past couple of decades. And yet, sustained Wi-Fi speeds are still a vexing problem in a lot of situations. A number of subtle factors can come into play, including the way your router is set up, the presence of nearby interference, whether you live in an apartment building or a stand-alone house, where your microwave sits in relation to the rest of your network, and how far your devices are from the router. Fortunately, there’s almost always a way to fix slow transfer speeds.

If you’ve ever messed around with your Wi-Fi router’s settings, you’ve probably seen the word “channel.” Most routers have the channel set to Auto. But many of us have looked through that list of a dozen or so channels and wondered what they are, and more importantly, which of the channels are faster than the others. Well, some channels are indeed much faster — but that doesn’t mean you should go ahead and change them just yet.

The fastest version of Wi-Fi currently available is branded as "Wi-Fi 6E," aka 802.11ax-2021. If you're wondering why we moved to branded naming as opposed to the standard number-plus-letter combination, it's because there are a lot more low-level updates and specification changes to 802.11 than there used to be. After 802.11ac in 2013, we've had .ad, .af, 802.11-2016, .ah, .ai, .aj, and .aq. Rather than asking people to continue playing "Guess the relevant Wi-Fi standard," somebody decided it would be easier to just call the current major consumer version "Wi-Fi 6." The newer Wi-Fi 6E extends that standard into the 6GHz band, but the Wi-Fi 6E hardware in-market is a rather poor deal at present.

Channels 1, 6, and 11

First of all, let’s talk about 2.4GHz, because even in 2021, the majority of Wi-Fi installations still use the 2.4GHz band in some way. 802.11ac, which debuted in 2013, is driving adoption of 5GHz, helped along by adoption of 2020’s 802.11ax / Wi-Fi 6 — but thanks to backwards compatibility, dual-radio routers and devices, and lower-cost peripherals with less expensive chipsets, 2.4GHz will continue to reign for a while.

All versions of Wi-Fi up to and including 802.11n (a, b, g, n) operate between the frequencies of 2400 and 2500MHz. These 100MHz are separated into 14 channels of 20MHz each. As you've probably worked out, 14 lots of 20MHz is a lot more than 100MHz — and as a result, every 2.4GHz channel overlaps with at least two, if not four, other channels. And as you can probably imagine, using overlapping channels is bad — in fact, it's the primary reason for poor throughput on your wireless network.
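
You can see the overlap directly from the channel plan. Here's a small sketch (assuming 20MHz-wide channels) that shows why 1, 6, and 11 are the safe picks:

```python
# 2.4GHz Wi-Fi channel spacing: centers are 5MHz apart, channels ~20MHz wide.
def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4GHz Wi-Fi channel in MHz."""
    if channel == 14:
        return 2484  # channel 14 is a special case (Japan, 802.11b only)
    return 2407 + 5 * channel  # channels 1-13

def overlaps(a: int, b: int, width_mhz: int = 20) -> bool:
    """Channels overlap when their centers are closer than one channel width."""
    return abs(center_mhz(a) - center_mhz(b)) < width_mhz

for pair in [(1, 6), (6, 11), (1, 11), (1, 3), (6, 9)]:
    print(pair, "overlap" if overlaps(*pair) else "clear")
# Channels 1, 6, and 11 sit 25MHz apart, so all three pairings print "clear".
```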

Fortunately, channels 1, 6, and 11 are spaced far enough apart that they don’t overlap. On a non-MIMO setup (i.e. 802.11 a, b, or g) you should always try to use channel 1, 6, or 11. If you use 802.11n with 20MHz channels, stick to channels 1, 6, and 11 — if you want to use 40MHz channels, be aware that the airwaves might be congested, unless you live in a detached house in the middle of nowhere.

What channel should you use in a crowded area?

If you want maximum throughput and minimal interference, channels 1, 6, and 11 are your best choices. But depending on other wireless networks in your vicinity, one of those channels might be a better option than the others.

For example, if you’re using channel 1, but someone next door is annoyingly using channel 2, then your throughput will plummet. In that situation, you would have to change to channel 11 to completely avoid the interference (though 6 would be pretty good as well). It might be tempting to use a channel other than 1, 6, or 11 — but remember that you will then be the cause of interference (and everyone on 1, 6, and 11 will stomp on your throughput, anyway).

In an ideal world, you would talk to your neighbors and get every router to use channels 1, 6, or 11. Bear in mind that interior walls do a pretty good job of attenuating (weakening) a signal. If there’s a brick wall between you and a neighbor, you could probably both use channel 1 without interfering with each other. But if it’s a thin wall (or there’s lots of windows), you should use different channels.

There are tools that can help you find the clearest channel, such as Vistumbler. But it’s probably faster to just switch between channels 1, 6, and 11 until you find one that works well. (If you have two laptops, you can copy a file between them to test the throughput of each channel.)

But what about 5GHz?

Get ready for lots of antennas.

The great thing about 5GHz (802.11n, 802.11ac, and Wi-Fi 6) is that because there’s much more free space at the higher frequencies, it offers 23 non-overlapping 20MHz channels. 6GHz should continue this trend, with even more frequency space (although with slightly worse propagation characteristics).

Starting with 802.11n and continuing with 802.11ac, wireless technology in general became much more advanced than the prehistoric days of 802.11b and g. If you own at least a decent 802.11n or 802.11ac router (i.e. if you bought a router in the last several years), it likely has some hardware inside that chooses the right channel automatically and modifies the output power to maximize throughput and minimize interference.

If you’re using the 5GHz band, and your walls aren’t paper-thin, then attenuation and the general lack of 5GHz devices should mean there’s little interference in your apartment — possibly even allowing you to use the fatter 40, 80, and 160MHz channels if you feel like it.

Eventually, as everyone upgrades to newer hardware and moves towards 5GHz, picking the right channel will mostly become a thing of the past. There may still be some cases where it makes sense to fine-tune your router’s channel selection. But when you’re dealing with MIMO setups (up to eight in 802.11ac), it’s generally a better idea to let your router do its own thing. Eventually, of course, 5GHz will fill up as well — but hopefully by then, we’ll have worked out how to use even higher frequencies (60GHz WiGig) or entirely new antenna designs (pCells, infinite capacity vortex beams) to cope with our wireless networking demands.

Wi-Fi 6E will eventually change some of this guidance, but right now the only Wi-Fi 6E devices on the market are poor deals. Once Wi-Fi 6E hardware is widely available, the 6GHz band should create opportunities for clearer networking, since that spectrum is largely unused at the moment. Total throughput of Wi-Fi 6E is no higher than Wi-Fi 6, but it should be easier to hit those speeds in real-world scenarios.

Sebastian Anthony wrote the original version of this article. It has since been updated with new information.

July 22nd 2021, 6:46 pm

Mysterious Record-Breaking Comet Has Now Grown a Tail

ExtremeTech

Astronomers announced earlier this year that an enormous 62-mile (100-kilometer) object was creeping inward from the fringes of the solar system. We didn't know what this mysterious visitor was at the time, but a recent development has shed light on the subject. The object, now officially designated C/2014 UN271, began forming a coma, the hazy envelope of vaporized material that precedes a tail, in recent weeks. That confirms it's a comet, and it may be the largest one ever discovered.

UN271 first appeared in astronomical observations in 2014, but it wasn’t until June of this year that scientists were able to identify it in the old data. Because it was (and still is) so far away, detecting the comet was a computationally expensive problem. It’s a bit more visible now, and it will become more so as it slides closer to the sun. 

The data released earlier this summer came from observations made in 2020, when the object was about 29 AU away. That put it around the orbit of Neptune. Following the announcement, astronomers rushed to get updated images of UN271. The latest data, from a Las Cumbres Observatory telescope in South Africa, shows a coma around C/2014 UN271 at its current location 19 AU away.

In the image above from the Las Cumbres Observatory, you can see UN271 at the center: the bright point of light is the comet’s record-breaking nucleus. The haze around that point is the coma, composed of material vaporized by exposure to sunlight. Compare the fuzzy appearance of UN271 with the star visible just above it, which has sharp edges and no haze. That fuzziness is a dead giveaway for a comet.

Further observations have also helped to nail down the comet’s orbit. Astronomers now estimate that UN271’s orbit takes it as much as 40,000 AU away from Earth. That’s almost 6 trillion kilometers or 0.63 light-years, which is well outside the pocket of space we think of as our solar system and smack in the middle of the Oort Cloud. This theoretical cloud of frozen material is believed to be left over from the solar system’s formation. When something gets knocked out of orbit in the Oort Cloud, it becomes what we know as a comet. 
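Those figures are easy to sanity-check. A couple of lines of Python, using standard values for the astronomical unit and the light-year, reproduce the numbers quoted above:

```python
# Sanity check on the distances quoted above.
AU_KM = 1.495978707e8          # kilometers per astronomical unit
LIGHT_YEAR_KM = 9.4607e12      # kilometers per light-year

aphelion_km = 40_000 * AU_KM
print(f"{aphelion_km:.2e} km")                   # ~5.98e12 km, "almost 6 trillion"
print(f"{aphelion_km / LIGHT_YEAR_KM:.2f} ly")   # ~0.63 light-years
```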

We also know that C/2014 UN271 will continue getting closer until 2031, at which time it will start drifting back out into deep space. At its closest approach, it will be 10.9 AU distant — about at the orbit of Saturn. The coma will make it much brighter, but still not bright enough to be seen by the unaided eye. However, you can bet telescopes around the globe will be trained on this gargantuan comet.

Now read:

July 22nd 2021, 6:46 pm

Report: Blizzard Knew Warcraft 3 Was a Mess and Launched It Anyway

ExtremeTech

Warcraft 3 was hailed as a smashing success when it launched way back in 2002. It was one of the first truly modern real-time strategy titles, and it made meaningful improvements over Warcraft 1 and 2. It was also easy to mod and play online — the entire MOBA genre grew out of modded Warcraft 3. Even with all that going for it, the game was looking old and busted when Blizzard announced a remastered version in 2018. Fans were hyped, but the launch did not live up to expectations. According to a newly leaked document, Blizzard knew the game was a mess, but it released it anyway. 

According to a report from Bloomberg, Warcraft 3 was doomed by mismanagement and financial pressure from Blizzard’s parent company, Activision. Warcraft 3: Reforged launched in January 2020, and it was clear out of the gate that much of the promised content was missing — the updated cutscenes were nowhere to be seen, and Blizzard dropped its plans to record new dialog. Even the online ranking ladder was gone. Fans demanded refunds with such frequency that Blizzard began offering them instantly. There are a lot of parallels to the Cyberpunk 2077 fiasco, but this was a remake that was never expected to make a boatload of money. That, as it turns out, was part of the reason the project was so hobbled.

The leaked documents from an internal “postmortem” of the launch called out Blizzard’s decision to keep accepting pre-orders as the project unraveled. “We took pre-orders when we knew the game wasn’t ready yet,” they said. The files also recommend that, in the future, Blizzard resist the urge to ship games before they are done. Gee, you think?

Apparently, a big part of the push to get Warcraft 3: Reforged out the door came from Activision management. They didn’t see Warcraft 3 as something that would become a billion-dollar product, which is probably true. However, Activision pressured Blizzard to focus on flagship franchises that were making (or were expected to make) tons of money, like Overwatch and Diablo IV. The Warcraft project reportedly never got the funding it needed, leading developers to cope with “exhaustion, anxiety, depression and more for a year now.” Again, this is a direct quote from Blizzard’s analysis of the project.

Years back, this all would have come as a surprise. Blizzard was seen as one of the most reliable and customer-focused developers. However, times change and Blizzard has been accused of more than making bad games. It’s currently the target of a lawsuit in California that accuses the company of fostering a “frat boy” culture in which female employees are harassed and underpaid. The Classic Games team that worked on Warcraft 3: Reforged has since been disbanded, but Blizzard promises ongoing support. It’s hard to trust that Blizzard will get it right this time, though.

Now read:

July 22nd 2021, 6:46 pm

Save Up to 61 Percent On These HD and 4K Camera Drones

ExtremeTech

The perspective you can get from using a drone to explore the world and take photos is incredible. Aerial photography has expanded rapidly since the invention of personal drones; here’s a selection of camera-equipped drones so you can see the world from another perspective.

Micro Drone 3.0 – Combo Pack, on sale for $129.97 (39 percent off)

This clever little drone can reach speeds of 45mph and is capable of flips and upside-down flight. It’s equipped with a camera that captures 720 x 1280 HD video, which can be stored on a microSD card (not included).

SHIFT RED: HI-LITE Package, on sale for $185.95 when you use coupon code SHIFTRED15

This portable, compact quadcopter drone has vision recognition and sensors for optimal flight experiences – both for experienced drone pilots and beginners. It has a 4.1-star rating out of five on Amazon and features eight different auto-pilot modes.

Global Drone 4K Platinum Version, on sale for $109.95 (8 percent off)

Use this drone’s three-level flight speed to explore the air and capture better images on its 4K HD camera. It also has an altitude-hold mode and 360-degree roll and flip technology.

DIY Building Block STEM Drone, on sale for $49.99 (61 percent off)

Help your kids learn about aerodynamics and load balancing with this build-your-own drone kit. The completed device can do 360-degree stunt flips and lets you create your own design based on instructions that are easy to follow.

M9 Mini Foldable Drone, on sale for $74.95 (58 percent off)

This mini drone folds up to fit in your hand, and is equipped with a 1080P ultra HD camera so you can catch the best aerial photography. It has an altitude-hold mode, a powerful 500mAh intelligent battery, and four sturdy protective guards.

Ninja Dragon Vortex 9 RC Quadcopter Drone with 4K HD Camera, on sale for $79.99 (46 percent off)

See real-time images from this quadcopter drone’s built-in camera in 4K HD quality. It also comes with a companion Wi-Fi app and a six-axis gyroscope for smoother flying and easier control.

Copernicus Mini Drone, on sale for $74.95 (17 percent off)

The Copernicus mini drone helps you capture beautiful imagery from above. It can take off and land with the touch of one button and has a 120-degree wide-angle camera built in.

Ninja Dragon Dual 4K Wide Angle 3D Flip Quadcopter, on sale for $99 (41 percent off)

This drone features two HD wide-angle cameras for incredibly sharp images. The quadcopter is also equipped with Wi-Fi FPV, waypoint flight, headless mode, and easy one-key return.

Ninja Dragon J10X Wi-Fi RC Quadcopter Drone with 1080p Wide-Angle HD Camera, on sale for $99 (50 percent off)

Instantly view HD video from this quadcopter drone, which has Wi-Fi built in for easy connection. Its 1080p camera captures wide-angle HD pictures, and it features a powerful 1,800mAh battery, good for up to 20 minutes of flying at once.

Stealth Dragon 240 Headless FPV Drone with 4K HD Camera, on sale for $119.99 (40 percent off)

Control this smart quadcopter drone through your smartphone, and see the world from above. The built-in camera shoots 4K HD video, and the drone features one-key automatic return, so it can find its way back to you.

AIR PIX: Pocket-Sized Flying Camera, on sale for $138.95 (13 percent off)

Use this pocket drone to explore the skies and capture great imagery. It features a 12MP camera for high-resolution photos and has several flight modes: auto-fly, gesture mode, and manual mode.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 22nd 2021, 6:46 pm

Valve Will Replace Big Picture Mode With New Steam Deck UI

ExtremeTech

Valve’s Steam Deck announcement earlier this week has made waves, but underneath lies a second story: the company plans to replace Steam’s Big Picture UI with the Steam Deck interface.

In response to a Steam forum post calling Valve’s Big Picture UI “outdated,” a Valve employee said big things may be coming for gamers familiar with the company’s large-screen interface. The employee indicated the handheld’s UI would soon be taking Big Picture’s place, although they didn’t give a release date.

PC gamers know Big Picture as Steam’s way of bringing PC titles to the (arguably comfier) couch by casting games to a larger monitor or TV. Setup is simple and just requires an HDMI cable between the PC and screen. Once connected, Big Picture’s UI takes over and provides a TV-friendly experience—one that doesn’t punish gamers for using a controller versus a mouse and offers a unique “Daisywheel,” Steam’s D-pad alternative to a QWERTY keyboard. Designed to be viewed from 10 feet away, Big Picture’s fonts and overall interface help to make PC gaming accessible and comfortable for those who get tired of their desk setup.

But Big Picture has been around for nearly a decade, having first been introduced in late 2012. Though useful, the interface hasn’t received a major facelift since its launch. It’s time for a refresh, and Valve apparently sees the Steam Deck as an opportunity to kill two birds with one stone: release a competitor to the ever-popular Nintendo Switch and update the client’s tired big-screen UI. It’s a welcome refresh to those who prefer couch gaming but find the interface a little clunky.

Some are already finding the Steam Deck UI to be more “approachable” than Big Picture. According to our colleague Tom Marks, who received a sneak peek of the Deck, it’s “easier to both search out specific games and navigate more generally,” and he said it feels more like a modern console UI than what came before it.

Valve has its work cut out for it in blending a full-fledged PC with a handheld console. But breathe easy: whether or not you’ve pledged to spend $400+ come December (or later), there are already some exciting developments on the horizon.

Now Read:

July 22nd 2021, 6:46 pm

Mars Rover Prepares to Collect First Sample for Return to Earth

ExtremeTech

NASA’s Perseverance rover has been playing second fiddle to the Ingenuity helicopter in recent months. But in fairness, it’s a helicopter on Mars. Not to be outdone, Perseverance is gearing up for the next big step in its mission on the red planet. In the coming weeks, NASA will harvest the first of many samples that could one day return to Earth for study. First, the rover has to find just the right bit of rock to carve up. 

The robot has an impressive suite of tools that can analyze the geology of Mars, but there’s only so much you can do on-site. Not all instruments are compact enough to make the trip bolted to a rover, and scientists might think of additional experiments long after the train (or rocket) has left the station. That’s why bringing samples back to Earth is so desirable. However, it was no simple matter to build a system to collect those samples. 

The Sample Caching System includes 43 metal tubes that slide into the rover’s coring bit. The sample tubes are the cleanest things ever sent into space. That was essential to ensure the samples were not contaminated with materials from Earth. NASA even had to back off from its original decontamination protocols because they stripped the natural layer of hydrocarbons from the metal. That made the material sticky, causing the tubes to become stuck on the drill bit. The rotary percussive drill is capable of carving out small samples of rock about the size of a piece of chalk, which are transferred directly into the tube.

The current goal is to find a geologically interesting rock on the floor of Jezero Crater. Actually, the team needs two identical rocks. One rock will be cleaned and subjected to scans with the SHERLOC, PIXL, and WATSON instruments. The Mastcam-Z will also get involved to snap high-resolution photos, and the SuperCam will use its laser to vaporize small bits of the sample and analyze the molecules.

NASA techs loading the sample return tubes into Perseverance.

The second rock won’t be hammered by lasers or scraped clean on Mars; it’s coming to Earth. Before collecting the core destined for a return trip, NASA will let Perseverance sit for a full Martian sol so its batteries can recharge. Then, it will turn the sample caching drill on the “twin” rock. That sample tube will remain in storage along with all the others Perseverance collects.

It will take at least two more missions to retrieve those samples — one to pick them up and get to Mars orbit, and one that takes the payload from Mars back to Earth. The planning process is just getting underway, but the tubes could be home as early as 2031. The first sample collection should begin in about two weeks.

Now read:

July 22nd 2021, 6:46 pm

ET Deals: $80 Off Apple Watch Series 6, Dell 27-Inch 1080p 144Hz Curved Gaming Monitor for $229

ExtremeTech

Today you can pick up an Apple Watch Series 6 smartwatch with an $80 discount. There’s also a deal on a Dell gaming monitor with a fast 144Hz refresh rate for just $229.99.

Apple Watch Series 6 40mm GPS Smartwatch ($319.00)

Apple’s Series 6 smartwatch has built-in hardware for tracking your blood oxygen level and heart rate. These features as well as a built-in fitness tracker make the Watch Series 6 an excellent accessory for any exercise routine. This model is also up to 20 percent faster than its predecessor, the Watch Series 5. You can now get one of these watches from Amazon marked down from $399.00 to $319.00.

Dell S2721HGF 27-Inch 1080p 144Hz VA Curved Gaming Monitor ($229.99)

Dell’s S2721HGF gaming monitor was built with an ultra-fast 144Hz 1080p display panel that provides a highly responsive gaming experience. The monitor is also curved, which helps make games feel more immersive. Currently Dell is selling these monitors marked down from $299.99 to $229.99.

Eufy Anker RoboVac 11S Robot Vacuum ($149.99)

Eufy designed this slim Robovac with a vacuum capable of 1,300Pa of suction. This gives it the power it needs to help keep your home clean, and it can also last for up to 100 minutes on a single charge. Right now you can get one from Amazon marked down from $229.99 to just $149.99.

Roku Streaming Stick+ 4K HDR Media Player ($39.00)

Roku’s Streaming Stick+ media player features 4K and HDR video support and competes directly with Amazon’s Fire TV 4K. It’s able to stream content from numerous sources including Netflix, Hulu, Vudu, and Sling TV. Right now you can get it from Amazon marked down from $49.99 to $39.00.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 21st 2021, 5:01 pm

Record Better Gaming Videos with a Lifetime of Action! Recorder for $9.99

ExtremeTech

While many gaming platforms come with recording technology built into their hardware or software, you’ll rarely get the control you need from these out-of-the-box solutions. If you’re keen to record or live-stream your performance in video games, or a range of other on-screen activities, installing a professional-grade recorder is far more satisfactory, giving you more options, better output, and ultimate control. All you have to do is figure out which one you need among the many options out there.

Right now, you can get a lifetime subscription to Action! Screen & Gameplay Recorder for $9.99, a discount of 66 percent off the full purchase price of $29, allowing you to record your on-screen activity with higher precision and more options – and for a wonderfully low price. Upgrade your recording setup for this special low price and witness the difference.

Action! is among the most popular screen and gameplay recording applications. It’s an all-in-one package that allows streaming and real-time recording on a Windows desktop PC, offering superb high-definition video quality while maintaining low CPU usage with no time limits. Action! can record and stream the highest-quality video webinars and gameplay – you only need to add your webcam. The software also allows for additions such as a watermark, logo, or other graphics and streaming overlays to make your recording unique. It even offers 4K video capture with audio commentary and has a drawing panel function to ensure your tutorials are clear and instructive to your viewers or students.

The highly rated software (with a rating of 4.1 stars out of five on TrustPilot) lets you connect to Twitch, Facebook, YouTube, and RTMP destinations, and also has a range of added features like green screen effects or background removal.

Lifetime subscriptions to Action! Screen & Gameplay Recorder are on sale for $9.99, meaning you can more easily record and share your gaming sessions, tutorials, and more.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 21st 2021, 5:01 pm

NASA Releases Stunning First Images From Repaired Hubble Telescope

ExtremeTech

NASA’s Hubble Space Telescope had a rough start to the summer. After more than 30 years of observing the cosmos, the telescope suffered a major computer failure in June. NASA worked on the issue over the next few weeks, arriving at a solution last week. Now, the fully restored Hubble has started doing science again. NASA has now released the first new images since the failure. 

The issues began on June 13th when the iconic spacecraft dropped into safe mode after the payload computer stopped communicating with the main computer. This system is supposed to manage all the scientific instruments, so Hubble could not continue operating without a fix. Initially, the team believed a simple memory module swap would do the trick. Hubble’s payload computer has three spares for just such an eventuality. However, the issue proved to be more difficult to track down. 

Last week, NASA finally determined the culprit was the Power Control Unit (PCU), which lives on the Science Instrument Command & Data Handling (SI C&DH) unit along with the payload computer. It was an all-hands-on-deck scenario as retired Hubble engineers came back to help devise a fix. Thanks to these efforts, NASA was able to get Hubble operational again late last week. Luckily, this SI C&DH is not the one Hubble launched with in 1990. NASA encountered a similar failure of the original SI C&DH in 2008 and had to switch to backup hardware. That module was replaced in 2009 during Hubble’s last servicing mission.

ARP-MADORE2115-273 (left) and ARP-MADORE0002-503 (right) as seen by the revived Hubble.

Until the James Webb Space Telescope launches, Hubble is the only general-purpose space observatory we have. So, scientists wasted no time getting back to work. NASA released some of the first new Hubble images (above) to show everything is working normally once again. These images come from a program led by Julianne Dalcanton of the University of Washington. 

On the left is ARP-MADORE2115-273, a rare example of two galaxies in the process of merging. On the right is ARP-MADORE0002-503, a single galaxy with an unusual structure. It’s a spiral galaxy, but it’s got three arms. Most other spirals (like the Milky Way) have an even number of arms. Since these images have only just been captured, there are no color versions available at this time. It’s remarkable to see this mission still producing amazing science after three decades. At this rate, it’s looking like Hubble will share the sky with Webb, at least for a while.

Now read:

July 21st 2021, 5:01 pm

Deep Space Nine Project Update: Why MakeMKV-Derived Files Don’t Work

ExtremeTech

(Photo: mlange_b/Flickr, CC BY-SA 2.0)
The most common question I’ve been asked about my DS9 upscaling work is whether there’s a way to make it work with files created via MakeMKV. MakeMKV has long been a popular application for ripping DVDs. I started with MakeMKV files and spent about six months trying to make them work, without success. One of the most important pieces of advice I received in this project came from the Doom9 forum, where I was advised to drop this approach and switch to DVD Decrypter. Once I did, I made more progress in six weeks than I had in six months.

Note: There is a difference between having an MKV file with media content in it and specifically using MakeMKV to create a file. We are discussing the latter scenario; the DS9 Upscale Project’s own default output is in MKV. Also, as a reminder, the DS9 Upscale Project is a fan effort, unofficial, and not affiliated with Paramount or ViacomCBS.

I returned to the question this weekend after several readers raised it in email. Instead of attempting another round of fruitless filter passes, I decided to take a MakeMKV file apart and examine the underlying frames. What I found explains a few things about why MakeMKV source turns to hash no matter how you treat it.

Much is typically made of the fact that MakeMKV creates an exact duplicate of the frames inside a DVD VOB file. On the one hand, this is true. In addition to extracting all of the frames from a MakeMKV-created file, I also extracted the frames from a VOB file directly. The frame-by-frame output between a VOB file and a MakeMKV file is identical.

This is actually part of the problem.

To be clear: Part of the reason why you can’t use a MakeMKV file in this process without hosing your motion is that MakeMKV creates variable frame rate files when used to archive an episode of Deep Space Nine, and AviSynth is not designed to edit variable frame rate files. Thus, AviSynth forces MakeMKV-derived output into a non-standard 24.6 fps rate in an attempt to harmonize the two. This destroys any chance of synchronizing audio. But the problem goes a little deeper than that.

DVDs: What You See Isn’t What You’ve Got

There are aspects to this situation that I am still piecing together, so I apologize if I’m a little vague in places. The first point I want to make is that there is a difference between what you see in a DVD stream when you play the file back — even if you go frame by frame — and the actual frame sequences stored on disk.

All of the content written to an NTSC DVD intended for playback in a DVD player is “conceptually interlaced”, to borrow a term from HomeTheaterHiFi. To quote:

It’s important to understand at the outset that DVDs are designed for interlaced displays. There’s a persistent myth that DVDs are inherently progressive, and all a DVD player needs to do to display a progressive signal is to grab the progressive frames off the disc and show them. This is not exactly true…

DVDs are based on MPEG-2 encoding, which allows for either progressive or interlaced sequences. However, very few discs use progressive sequences, because the players are specifically designed for interlaced output. Interestingly, while the sequences (i.e., the films and videos) are seldom stored progressive, there’s nothing wrong with using individual progressive frames in an interlaced sequence…

The encoder can mix and match interlaced fields and progressive frames as long as each second of MPEG-2 data contains 60 fields, no more, no less (or 50 fields per second for PAL discs)… In short, the content on a DVD is interlaced conceptually. (Emphasis added)

Recreating perfect 23.976 fps progressive output from 29.97 fps interlaced input requires an inverse telecine. Telecine is the process of converting 24 fps film for display as 29.97 fps video. One typical telecine pattern in the NTSC universe is 3:2 pulldown, in which interlaced frames are created in a specific sequence to convert ~24 progressive frames per second into 60 fields per second for broadcast.

It is possible, in some cases, to unwind this type of content back into perfectly progressive frames if the DVD is mastered properly, which is where the idea that DVD footage is itself progressive comes from. DVD footage can display as fully progressive, but the standard assumes 59.94 fields per second, which pair up into 29.97 frames. Those 29.97 frames can be displayed as 23.976 fps progressive thanks to flags telling the player which fields to show during playback.

The 3:2 process. Unwinding this allows a 29.97 fps interlaced file to be turned back into 23.976 fps content.
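To make that cadence concrete, here is a toy Python model of 3:2 pulldown and its inverse. It works on labeled stand-in frames rather than real video, but it reproduces the pattern described in this article: each cycle of four film frames becomes five video frames, three progressive and two interlaced.

```python
def telecine(frames):
    """Apply the 3:2 cadence: every four film frames (A, B, C, D) become
    five video frames (ten fields), three progressive and two interlaced.
    Each tuple is a (top_field, bottom_field) pair."""
    out = []
    for i in range(0, len(frames), 4):
        a, b, c, d = frames[i:i + 4]
        out += [(a, a), (b, b), (b, c), (c, d), (d, d)]
    return out

def inverse_telecine(video):
    """Undo the cadence: keep each source frame's fields once, in order."""
    recovered = []
    for top, bottom in video:
        for field in (top, bottom):
            if field not in recovered:
                recovered.append(field)
    return recovered

video = telecine(["A", "B", "C", "D"])
print(video)                    # [('A','A'), ('B','B'), ('B','C'), ('C','D'), ('D','D')]
print(inverse_telecine(video))  # ['A', 'B', 'C', 'D']
```

Real inverse telecine (what filters like TFM and TDecimate do) has to match fields by their content rather than by convenient labels, but the bookkeeping is the same idea.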

Here’s where we get to the critical difference between using DVD Decrypter and DGIndex versus attempting to edit a MakeMKV-derived source file. When I dumped the contents of both types of sources to disk, I discovered a significant difference between them. The D2V-derived source contained the familiar 3:2 pattern we’d expect from progressive content that’s been put through telecine: Three progressive frames, followed by two interlaced frames. When you create a D2V file, it will specify a 29.97 fps frame rate unless your content is 100 percent film (meaning a perfectly progressive stream of 23.976 fps data encoded “conceptually interlaced” in 29.97 fps fields).

The MakeMKV-created file contained a very different frame sequence. Instead of 3:2, it contained entirely progressive frames encoded in the following pattern: 1-2-3-4-4 (frame 4 repeats) or 1-2-3-4-5-5 (frame 5 repeats). Internally, the frame rate of the MakeMKV-encoded file remains 29.97 fps. MakeMKV-created footage only returns to the standard 3:2 format during scenes that were expected to play back at 29.97 fps. Within the source, there are two types of content patterns encoded two different ways. 23.976 fps is one pattern, 29.97 fps is another.
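If you want to check for that repeat pattern in your own rips, one rough approach is to fingerprint every decoded frame and look for back-to-back duplicates. The Python sketch below assumes you have already decoded the frames to raw buffers with something like ffmpeg or VapourSynth; the sample data here is just stand-in bytes.

```python
import hashlib

def frame_digest(raw_frame: bytes) -> str:
    """Fingerprint a decoded frame so exact duplicates hash identically."""
    return hashlib.sha256(raw_frame).hexdigest()

def find_repeats(frames) -> list:
    """Return indices where a frame exactly duplicates its predecessor,
    e.g. the trailing frame in a 1-2-3-4-4 or 1-2-3-4-5-5 run."""
    digests = [frame_digest(f) for f in frames]
    return [i for i in range(1, len(digests)) if digests[i] == digests[i - 1]]

# Stand-ins for decoded frames (real frames would be raw pixel buffers):
frames = [b"f1", b"f2", b"f3", b"f4", b"f4", b"f5", b"f6", b"f7", b"f8", b"f8"]
print(find_repeats(frames))  # [4, 9]: a duplicate every fifth frame
```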

Here’s how I know that’s true: By default, the output from a D2V-derived file is 29.97 fps, unless you change it in some way. A MakeMKV file and a D2V-created file at 29.97 fps track each other frame for frame, meaning Frame 500 from MakeMKV source contains the same moment in time as Frame 500 from the D2V-derived source. That doesn’t mean frame output between the two is always identical, however. Here’s Frame 327 from a 29.97 fps D2V-derived file broken down into its constituent fields, versus Frame 327 from a MakeMKV-derived source.

I can’t embed the slider here, so here’s a screenshot instead:

MakeMKV frame on the left, DVD Decrypter + DGIndex frame on the right. Believe it or not, the frame on the right is the frame we want to process to unwind the frame rate properly. Please ignore Sisko’s forehead tumor, he’s sensitive about it.

In the MakeMKV file, Frame 327 is a shot of Sisko. In the D2V-derived file, Frame 327 is an interlaced frame containing data from the frame before (the two admirals sitting) and the frame to come (Sisko talking). Sisko’s mouth position in Frame 328 of the MakeMKV source is different than his mouth position in Frame 328 of the D2V source. This continues through Frame 329, but the output synchronizes again at Frame 330.

TDecimate might theoretically be able to scan the video stream and remove the duplicate images, but AviSynth doesn’t understand what the frame rate of the MakeMKV-derived source is supposed to be in the first place. The constant 29.97 fps D2V file DGIndex creates, in contrast, presents AviSynth with a data stream in a format it understands and can work with. There are ways to use a timecodes file to work with VFR in AviSynth, but the point is moot as soon as you get to Topaz Video Enhance AI, which does not support VFR input.

The reason why MakeMKV files don’t work in this project is that variable frame rate MakeMKV files do not represent the video data contained within the VOB in a manner that AviSynth can work with. The progressive 1-2-3-4-4 and 1-2-3-4-5-5 pattern encoded into MakeMKV files plays back smoothly because the file stream contains a plethora of hints telling the player software how to play it and which fields/frames should be displayed. Turning that stream into something that other applications can process without breaking motion is not easy. AviSynth can convert the file to a constant frame rate and keep audio synchronized, but not without breaking motion in the 29.97 fps sections.

I cannot speak to the exact reason why DGIndex can look at a VOB and create a D2V project file that outputs 3:2 pulldown, while frames dumped from the VOB directly or from a MakeMKV-created source do not show it. But it’s this difference in output and frame rate that causes problems. And given that Topaz doesn’t support VFR file ingestion, the point is largely moot. The DVD set and DVD Decrypter are, as of this writing, both required for this process.

Now Read:

July 21st 2021, 5:01 pm

Event Horizon Telescope Captures Never-Before-Seen Detail of Black Hole Jets

ExtremeTech

Two years ago, scientists working with the Event Horizon Telescope (EHT) made history by releasing the first-ever photo of a black hole. The image confirmed many of our ideas about these massive objects, but the team didn’t stop there. The latest data release focuses on a different target, an active galaxy known as Centaurus A. You’ve probably seen images of Centaurus A in the past — it’s one of the brightest galaxies in the sky. You’ve never seen it like this, though.

The Event Horizon Telescope is not a single instrument. It is composed of radio telescopes scattered around the globe. By observing the same object with each telescope, scientists are able to combine the data using atomic clock timestamps and lots of supercomputer cycles. It takes months to assemble EHT images, but the results are staggering. We could have never seen the black hole in M87 with a single telescope, and we’ve never seen Centaurus A in such detail, either. 
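The core trick is cross-correlation: each station records the same sky signal plus its own noise, stamped by an atomic clock, and correlating the streams recovers their relative delay (and, ultimately, the interference fringes that become an image). The NumPy toy below demonstrates only that single idea, at a vastly smaller scale than a real EHT correlator.

```python
import numpy as np

rng = np.random.default_rng(0)
sky = rng.normal(size=4096)            # signal common to both stations
true_delay = 37                        # geometric delay, in samples

# Each station sees the sky signal plus independent receiver noise.
station_a = sky + 0.5 * rng.normal(size=4096)
station_b = np.roll(sky, true_delay) + 0.5 * rng.normal(size=4096)

# Correlate at every trial lag and pick the peak.
lags = np.arange(-100, 101)
corr = [np.dot(station_a, np.roll(station_b, -lag)) for lag in lags]
print("recovered delay:", lags[int(np.argmax(corr))])  # prints 37
```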

Centaurus A is of note because it has an active galactic nucleus. The black hole in the center is busy chowing down on matter while blasting a relativistic jet out into deep space. That makes Centaurus A the closest galaxy that shines brightly at radio frequencies. It’s the fifth-brightest galaxy in the sky overall, making it a common target for amateur astronomers.

The image above is one of many taken of Centaurus A in the past. The EHT images aren’t as visually arresting until you understand what these blobs are. As seen below, the EHT was able to zoom in on the jet of plasma escaping from the galactic nucleus. Despite being 10 million light-years away, the team was able to examine the structure of the jet in amazing detail. 

These images of the jet are 16 times sharper than any previous observation, and the data appears to show that only the edges of the stream are glowing when viewed up close. Some past analyses of other black hole jets have suggested this tube-like structure, but now we have more confirmation. The reason is unclear, but it may be due to the edges interacting with stationary gas clouds, which causes them to heat up and glow.

Observations closer to the black hole show the tube narrowing into a cone, but the base is still very wide. That could mean the black hole’s accretion disk is the source, but some scientists believe these jets must be utilizing the rotational energy of the black hole to explode outward with such force. The EHT hasn’t answered all the questions scientists have about black holes, but it might point us in the right direction.

Now read:

July 21st 2021, 5:01 pm

The Amazon MMO New World Beta Is Frying RTX 3090 GPUs

ExtremeTech

The beta for Amazon’s MMO, New World, dropped yesterday. It was quite popular, with reports of up to 187,000 simultaneous players at times. But if New World was in the news for its popularity yesterday, it’s making waves today for an entirely different reason: It’s frying graphics cards.

Strange New World

It’s impossible to over-emphasize how odd this is. About 10 years ago, applications like Furmark were capable of damaging GPUs because they behaved more like thermal viruses than actual software. AMD and Nvidia both took steps to limit these kinds of workloads and built algorithms into their drivers to detect them. Graphics cards also have their own throttle limits and power regulation circuitry.

End users can hose an operating system installation by deleting the wrong files, but typically the only way to break a PC hardware component with software is to flash the UEFI/BIOS incorrectly. Even that’s a stretch, because we call UEFI/BIOS updates firmware, not software, and they come with extra risks.

There have been several tweets from EVGA users about RTX 3090s dying after playing New World.

It is not clear if this issue is hitting any other company or graphics card, but there are multiple unhappy EVGA users reporting the problem in the company forums. EVGA is one company that acknowledged sending out some RTX 3090 GPUs with insufficient power circuitry last fall, but the company claimed these parts were only ever sent to reviewers and that none of them made it into retail.

It’s possible that’s true. This could be evidence of a different problem, unrelated to the initial issues. We don’t know yet if the issue is hitting cards from other companies. It’s been reported that the problem is the game’s menu screens, which run at uncapped frame rates, but most games — including games one runs with V-Sync disabled — don’t appear to hit frame rates that cause this kind of problem. Plenty of online games kick you back to the title screen if you spend too much time idle.

One could argue that New World isn’t “killing” graphics cards so much as demonstrating that some GPUs may not be adequately cooled, but that’s going to be cold comfort to anyone left trying to get a replacement GPU during a historic shortage. Hopefully, Amazon patches the game to remove this possibility and EVGA (as well as any other affected manufacturer) replaces cards as quickly as possible. Everyone who has purchased an RTX 3090 should still be covered under warranty, so that, at least, is not an issue.

A similar issue affected StarCraft 2 in the past, so this problem isn’t completely unknown. But it’s still unusual, especially given that it seems to be hitting cards from a single manufacturer.

Now Read:

July 21st 2021, 5:01 pm

Robot Collision Sets London Warehouse Ablaze, Delays Orders

ExtremeTech

Credit: Ocado

Online-only grocer Ocado was forced to cancel some customers’ orders last week following a fire at its London warehouse. This marks the second fire Ocado has experienced at one of its warehouses; this time, a three-robot collision is to blame.

The company announced via Twitter Friday that it would be canceling some orders due to a “major incident.” Soon after, the company released a statement disclosing that a fire had erupted at its Erith Customer Fulfillment Centre following the collision of three robots on Ocado’s automated grid. Thankfully, no employees were hurt.

East of London, Ocado’s Erith CFC houses the company’s largest robotic grocery fulfillment grid. A 2018 feature from The Verge describes the AI-powered grid as a “huge chessboard, populated entirely by robots,” each about the size of a washing machine. The robots perform basic tasks like lifting, sorting, and moving groceries and bringing them to human “pickers,” who stand on the sidelines and pack individual customers’ orders. Despite (or perhaps because of) the robots’ simplicity, they’re able to fulfill anywhere from 65,000 to 150,000 orders each week. They’re even able to self-diagnose when they experience a bug and take themselves to “the pits” for a fix if the issue is severe enough.

Ocado’s technology, however, requires its robots to run on extremely thin margins—literally. Due to the construction of the grid, robots may pass within 0.2 inches (5 millimeters) of each other as they buzz by to fulfill their respective orders. This is where last week’s fire appears to have found its origin; when three of the robots crashed, they burst into flames, resulting in the evacuation of over 800 employees and a fall of about 3 percent in the company’s share price.

“The damage was limited to a small section of less than 1 percent of the grid having been successfully contained by our fire attenuation measures, many of which were implemented following our thorough fire safety review in 2019,” Ocado said in a statement. “While the incident has caused some short-term disruption to operations, the vast majority of customer orders are being fulfilled in other parts of the Ocado network.” The company promised via Twitter to contact customers whose orders would be impacted as soon as possible.

Ocado implemented those thorough safety measures in 2019 after its Andover warehouse burned down that same year. The company has since revealed that the fire was started by an electrical failure and that the building’s sprinkler system had been disabled.

Now Read:

July 21st 2021, 5:01 pm

Prep for CompTIA Certification with this $30 Training Course Bundle

ExtremeTech

In the information technology industry, you simply can’t let your knowledge go out of date and still expect to grow and succeed in the role you have, or land interviews for the role you want. Maintaining your industry and practical knowledge is a must, and achieving certifications offered by the relevant industry authorities is a necessary fact of working in IT. Being proactive about earning these certifications and keeping your knowledge up to date is an easy way to keep a competitive edge – it shows you care about your career and your industry, priming you for promotions and new roles, and may even help you enjoy the work you do a little more.

Right now, you can get the ITProTV: The Core CompTIA Certification Prep Bundle for the reduced price of $29.99, a discount of 96 percent off the full purchase price of $825. Bolster your knowledge, work towards essential certifications, and get on the path to success by completing this essential course bundle.

The bundle, which contains three vital courses divided into 387 easily digestible lessons, spans 168 hours of essential content preparing you for CompTIA certifications. Master fundamental IT skills on a range of common devices and learn skills core to IT technician and expert roles – including installing, configuring, optimizing, troubleshooting, repairing, and upgrading devices, as well as performing preventative maintenance on PCs and operating systems. The course bundle includes the required preparation for CompTIA A+ Core 1 and Core 2, CompTIA Network+, and CompTIA Security+ SY0-601.

The courses are taught by highly rated instructors from ITProTV, rated 4.8 stars out of five on Trustpilot, providing a range of e-learning resources aimed at addressing the global IT skills gap.

The ITProTV: The Core CompTIA Certification Prep Bundle is on sale for $29.99, helping you reach your professional goals and improve your knowledge in preparation for CompTIA certification.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 20th 2021, 6:03 pm

Blue Origin Successfully Launches Humans Into Space

ExtremeTech

Blue Origin has successfully launched humans into space today on its New Shepard rocket. Wally Funk, Mark Bezos, Jeff Bezos, and Oliver Daemen all traveled above the 100km boundary that marks the internationally accepted von Karman line, the point at which the atmosphere of Earth is said to give way to space. Richard Branson’s flight a few days ago reached 85km above sea level, which is high enough to qualify as space under NASA and the US Air Force’s definition of the term. Depending on how you choose to count, Bezos is either the first billionaire in space or the second. Funk and Daemen are, respectively, the oldest (82) and youngest (18) people to travel to space.

The first Blue Origin human flight was a success in two ways. First and most importantly, the capsule touched down safely. Secondly, the Blue Origin rocket again landed successfully and can be re-used in the future. In and of itself, catapulting two billionaires into space (or near-space) may not seem like much of an achievement, but this is a watershed moment in the history of spaceflight. For the first time in the history of our species, space is available to more than a bare handful of individuals who undertake months of rigorous training to prepare for the flight. The video below is queued up to the beginning of Blue Origin’s post-flight press conference.

Here, the goals of Blue Origin and Virgin Galactic diverge. Virgin Galactic is focused on space tourism. Blue Origin has built the New Shepard explicitly for the suborbital space tourism market, but the company has another rocket, New Glenn, intended for government and commercial satellite launches. New Glenn has had multiple development slips, and the rocket is currently scheduled for launch no earlier than Q4 2022. Successfully launching the Bezoses, Funk, and Daemen into space distracts from this kind of problem by notching a genuine win in Blue Origin’s belt.

The significance of Bezos and Branson’s flights is not that we tossed a couple of billionaires into space for a few minutes. Since the dawn of the space age, virtually all access to space has been controlled and funded by various governments. Now, manned suborbital flights are available to (and from) the private sector. It’s hard to argue this is a bad thing, considering how completely Congress has sabotaged the SLS. Forced by law to use archaic Space Shuttle components, NASA has committed to paying $146 million per RS-25 engine. The SLS uses four. To add insult to injury, the RS-25, which was a reusable engine when originally built by NASA and deployed on the Space Shuttle, will drop into the ocean during the SLS’ launch.

The Congressional decision to treat NASA more like a jobs program while mandating the use of outdated and incredibly expensive components has created an opportunity for private industry to pick up the slack. The work done by companies like Blue Origin, SpaceX, and Virgin Galactic shows how the industry is rapidly evolving.

For now, space tourism remains incredibly expensive and available to only a handful of people. This may well change in the future if suborbital flights prove popular and more companies experiment with different vehicle designs. The fact that commercial space travel is currently limited to a few minutes in suborbital flight doesn’t change how momentous this occasion is. Granted, the pool of potential tourists who can afford to pony up this kind of cash is small, but Blue Origin and Virgin Galactic have widened the frontiers of space flight by proving the possibility of private-sector crewed launches.

Now Read:

July 20th 2021, 6:03 pm

ET Deals: Dell XPS 8940 SE Intel 11th Gen Core i7 and Nvidia RTX 2060 Super Gaming Desktop for $1,099

ExtremeTech

Today you can save $600 on a gaming desktop from Dell that comes equipped with an 11th Gen Intel Core i7 processor and an Nvidia GeForce RTX 2060 Super graphics card.

Dell XPS 8940 SE Intel Core i7-11700 Gaming Desktop w/ Nvidia GeForce RTX 2060 Super GPU, 16GB DDR4 RAM, 512GB NVMe SSD and 1TB HDD ($1,099.99)

Dell’s new XPS 8940 SE features an updated design, and it comes loaded with powerful processing hardware that’s able to tackle just about any task you throw at it. The high-end Intel Core i7-11700 with its eight CPU cores is well suited for running numerous applications at the same time. The system also has an RTX 2060 Super graphics card from Nvidia that lets it run games exceptionally well. The GPU can also help with editing images and videos when you’re not gaming. Currently, you can get one of these systems from Dell marked down from $1,699.99 to just $1,099.99 with promo code DTXPSSEAFF79.

Apple iPad Air 256GB 10.9-Inch Wi-Fi Tablet ($639.00)

Apple’s new iPad Air features a 10.9-inch display and is powered by Apple’s A14 Bionic chip, which makes it exceptionally fast. The tablet is also compatible with Apple’s Magic Keyboard, which can essentially turn it into a compact laptop, making it a highly versatile device. Currently you can get one from Amazon marked down from $749.00 to just $639.00.

Dell S2721QS 27-Inch 4K IPS Monitor ($299.99)

Dell’s S2721QS is a 27-inch monitor that sports a 4K IPS panel with HDR and FreeSync support. The monitor can also be used for detailed video editing work as it covers 99 percent of the sRGB color gamut, and it also has a built-in pair of 3W speakers. Currently Dell is selling these monitors marked down from $489.99 to $299.99.

Acer Chromebook Spin 311 Intel Celeron N4020 HD 11.6-Inch Convertible Laptop ($269.99)

Acer designed this Intel Celeron-based Chromebook as a convertible laptop, which means the display can rotate a full 360 degrees for use as a tablet. The system also has 4GB of LPDDR4 RAM to help keep it from slowing down, and it can reportedly last up to 10 hours on a single charge. Though it isn’t the fastest laptop around, it should work well for someone who just needs to surf the web or write text documents. You can buy one of these systems now from Amazon marked down from $499.00 to just $269.99.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 20th 2021, 6:03 pm

GlobalFoundries CEO Dismisses Intel Buyout Rumors, IPO Moves Ahead

ExtremeTech

The CEO of GlobalFoundries, Thomas Caulfield, says that GF is sticking to its plans for an IPO next year, and no Intel buyout offer or discussion is currently happening.

“There’s nothing there in that discussion,” Caulfield said to Bloomberg. Speculation around the idea of an Intel buyout, he says, was driven by the fact that GlobalFoundries’ IPO is happening in the not-too-distant future, at a time when semiconductor foundries are hot properties.

Supposedly, Intel has studied the idea of buying GF but has not formally begun any discussions or negotiations to do so. Intel and the Mubadala Investment Company are not in talks. Caulfield thinks it’s unlikely Mubadala would want to come to such an agreement. “I think they are interested in holding onto GF,” Caulfield said. “They think this asset has become a star in their portfolio.”

That may be true. When GF attempted to compete on the leading edge, the foundry lost a great deal of money. Retrenching on older nodes has undoubtedly improved profitability, especially given the recent demand for older semiconductors. GF has announced that it will invest $1B towards improving its own fab output and that the company will also build a new facility in Malta, near its existing Fab 8. Details on the type of fab and what process node or customers it will target have not been disclosed.

The company is not expected to return to the leading edge, but it would not be unusual for GF to add 7nm manufacturing capabilities over time. The leading edge is now on 5nm, with 3nm in risk production. If GF adds 7nm, it’ll indicate that enough manufacturers eventually shifted over to the node to make it worth installing as a second-source option. 7nm is the last node to have a DUV lithography variant — everything beyond 7nm requires EUV from every manufacturer — so it likely marks the end of the line for second-tier foundries, until they and their customers either opt for EUV or pivot to alternatives. GlobalFoundries has had some success with its 22FDX and the company claims 12FDX is not canceled, just delayed for reasons of ecosystem maturity and customer demand.

GlobalFoundries has committed $2.4 billion in total for fab expansion worldwide, not counting the price of the upcoming Malta facility. While that’s not large compared with Intel, TSMC, or Samsung, it’s a significant amount for a second-tier foundry. The idea that Intel would buy the company was always a bit far-fetched, but it isn’t crazy. For all the difficulties of meshing corporate cultures, the fastest way to acquire a great deal of experience in semiconductor manufacturing is to buy an existing facility.

No deal between Intel and GF means that Intel will build out its reputation for client foundry work using its existing factories, plus the new facilities under construction in Arizona (and, in the future, possibly in Europe). The shockwaves from the pandemic and the shake-up it created across the semiconductor industry will be felt for years to come, as semiconductor companies pick up new deals and expand capacity.

One caveat to keep in mind? Companies occasionally quash true rumors. Back in 2017, an Intel spokesperson told journalists, “The recent rumors that Intel has licensed AMD’s graphics technology are untrue.” When Hades Canyon showed up, it put the lie to that statement. Whether the company was dodging around the conventional meaning of “license” is unclear, but it isn’t uncommon for companies to insist major changes are not happening right up until they do.

Now Read:

July 20th 2021, 6:03 pm

GPU Prices Are Stuck Well Above MSRP

ExtremeTech

Earlier this month, it seemed GPU prices might fall back to normal more quickly than anticipated. Once China cracked down on cryptocurrency mining, the price of GPUs plunged worldwide. Unfortunately, the decline has stalled out, at least in Europe. Things in the United States don’t look much different.

Thanks go to 3DCenter.de, which has been keeping an eye on this problem for the last few months and published these results.

Image by 3DCenter.de

Nvidia GPU prices have dropped 3 percent but AMD prices have risen 3 percent, so the net change is zero relative to July 4. Other regions haven’t shown the same declines that Germany has, according to 3DCenter, but the drops are stalled out everywhere. There are several potential reasons why we haven’t seen declines continue.

First, keep in mind that demand for GPUs likely remains elevated well above baseline with respect to seasonality and typical market conditions. There are a lot of consumers who were never going to pull the trigger on an Ampere or RDNA2 GPU at 2x – 3x MSRP. As GPU prices start falling, more buyers will re-enter the market. Given how over-stretched supply already is, this could prop prices up for the foreseeable future.

One of the major problems of the pandemic that we haven’t really fixed yet is the impact on shipping times. At this point, nearly 18 months into the pandemic, you’ve probably had at least a package or three take much longer to arrive than it was supposed to. Dwell times in Yantian, China have improved, but were still at 12.9 days as of July 7 (versus 25 days in June). Erratic order deliveries and stockpiling have hit the global economy, and we haven’t sorted the problems out completely yet. Reports from last week suggest delays continue to bedevil US facilities to one degree or another.

Another factor likely dragging on price drops is that various middlemen and resellers bought GPUs at inflated prices to sell at even more inflated prices. AMD and Nvidia don’t make more money per GPU when the retail sales price goes up, because both companies negotiate contracts with their ODMs in advance. The ODMs raised prices to cover their own increased costs for DRAM and Ajinomoto build-up film, and resellers then raised prices in turn, both to secure additional profits for themselves and to cover increased manufacturing costs.

A reseller who bought a tranche of GPUs at 150 percent of MSRP and sold them at 300 percent MSRP isn’t going to immediately cut prices. Instead, they’ll try to clear inventory that they purchased at a higher cost before cutting the price again. Finally, many companies are making a lot more money from semiconductor sales than they did previously. Companies do not necessarily leap to cut prices of their own accord in a situation like this. So long as demand is outstripping supply, there’ll be less market pressure for them to do so. Whether you view that as a matter of prudence or greed will probably depend on how badly you need a new GPU.

Prices in the secondhand market may also stay high for a while longer if demand is strong in that venue. A lot of people who bought GPUs in the last six months paid considerably more for them than MSRP. They’re going to push for resell prices that are as high as possible to recoup as much money as they can. Once demand starts dropping, we should see used card prices come down as well.

But as to when that’ll be? I don’t know. I’m not sure anybody does. Companies continue to warn that shortages could stretch into 2022, but we’ve also heard that there have been improvements. We’re now deep into July, and AMD has previously said it expected its shortages to start improving in the back half of the year. I’d love to say GPU prices will be back to normal by Thanksgiving, Christmas, or early 2022, but it’s hard to predict where the market will be six months from now.

Now Read:

July 20th 2021, 12:17 pm

New Study Predicts Teeny Tiny Mountains on Neutron Stars

ExtremeTech

Physicists are still arguing over whether black holes have “hair,” but we’re pretty sure neutron stars have mountains. These dead stars are extreme in every respect, from their magnetic fields to their gravitational influence, but the only thing extreme about the mountains is how small they are. A new analysis of neutron star physics predicts the “mountains” on the surface are less than a millimeter tall.

A neutron star is what’s left of a medium-large star (10-25 solar masses) after it exhausts its nuclear fuel and explodes in a supernova. Stars that are a bit larger can collapse into black holes after expiring, and smaller ones like the sun will become white dwarfs. That means neutron stars are the densest objects known aside from black holes. They’re more massive than the sun even after blowing their top but are only a few miles in diameter. A spoonful of neutron star matter would weigh billions of tons. 
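That spoonful claim holds up to a back-of-the-envelope check: take a typical 1.4-solar-mass neutron star, squeeze it into a sphere roughly 10 kilometers in radius, and see what a teaspoon of the result weighs.

```python
import math

SOLAR_MASS_KG = 1.989e30
mass = 1.4 * SOLAR_MASS_KG                # a typical neutron star
radius_m = 10_000                         # ~10 km radius
volume = (4 / 3) * math.pi * radius_m ** 3

density = mass / volume                   # ~6.6e17 kg per cubic meter
teaspoon_m3 = 5e-6                        # ~5 milliliters
print(f"{density * teaspoon_m3:.2e} kg")  # ~3e12 kg: billions of metric tons
```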

Neutron stars are compressed into an almost perfect sphere by their incredible gravitational pull — “almost” is the keyword here. In the past, scientists have estimated that mountains rising up on the surface of a neutron star could be up to a centimeter tall. However, that assumption relies on an understanding of neutron star structure that is not borne out by the new research, which was conducted at the University of Southampton by Ph.D. student Fabian Gittins.

Your average neutron star compared to New York City.

The researchers devised a new model that allowed them to subject a realistic virtual neutron star to various forces. It all comes down to how the star’s layers behave — something we cannot know for sure until we see a neutron star up close. However, this work may get us closer to the truth. The previous estimates of centimeter-scale mountains relied on the assumption that the elastic crust of a neutron star was near the breaking point across the entire surface. Instead, the new model predicts that breaks in the crust will cover a larger surface area at a lower altitude. Thus, mountains that are only a fraction of a millimeter tall. 

The extreme physics at work inside a neutron star is still unclear, but it’s possible scientists will be able to measure these mountains in the future. Theoretically, deformations in the surface of a spinning neutron star should be visible in gravitational waves. However, we have yet to detect waves from a single neutron star; so far, we can only detect cataclysmic events like two neutron stars colliding. As more sensitive detectors come online, we might be able to size up these mountains in real life.

Now read:

July 20th 2021, 12:17 pm

Microsoft Won’t Restrict DirectStorage Support to Windows 11

ExtremeTech

Microsoft appears to be bending a bit in terms of which Windows 11 features will actually be supported in Windows 10. When it announced its upcoming operating system, the company told users that Windows 11 would be required if gamers wanted to take advantage of upcoming features like DirectStorage. Microsoft has now clarified that Windows 10 gamers will get many of the advantages of DirectStorage, though certain facets of the technology will still require Windows 11.

There are three principal features of DirectStorage, two of which will be coming to Windows 10. The DirectStorage API provides the option to batch I/O operations in DX12 style, “relieving apps from the need to individually manage thousands of IO requests/completion notifications per second.” The use of the GPU for data decompression is also supported in both operating systems. Using the GPU in this fashion improves load time and streaming performance.
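To illustrate what batching buys you in the abstract, here is a conceptual Python sketch of the submit-many, wait-once pattern. Every name in it is hypothetical; the actual DirectStorage API is a C++ interface with its own types and calls, and this is only meant to show the idea:

    # Conceptual sketch of batched I/O submission, loosely analogous to what
    # the DirectStorage API offers. All names here are invented for
    # illustration; this is not the real API.
    from concurrent.futures import ThreadPoolExecutor

    def read_chunk(path, offset, size):
        # One asset read; a game might otherwise manage thousands of these
        # (and their completion notifications) individually every second.
        with open(path, "rb") as f:
            f.seek(offset)
            return f.read(size)

    def submit_batch(requests):
        # Enqueue many reads at once and wait on a single completion point
        # instead of tracking a notification per individual request.
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(read_chunk, *r) for r in requests]
            return [f.result() for f in futures]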

The last features — the ones reserved for Windows 11 users — are various unspecified storage stack optimizations. According to Microsoft, “On Windows 11, this consists of an upgraded OS storage stack that unlocks the full potential of DirectStorage, and on Windows 10, games will still benefit from the more efficient use of the legacy OS storage stack.”

In order to take advantage of these features, Windows 10 users will need to be running at least version 1909. This isn’t much of a requirement considering the feature hasn’t even been released yet, so the bulk of Windows 10 systems should be able to run it just fine. Microsoft notes that game developers only need to implement DirectStorage once “and all the applicable benefits will be automatically applied and scaled appropriately for gamers.” Microsoft claims that DirectStorage will run on older hardware just as well as current products do, so gamers who are stuck on spinning disks for whatever reason won’t be penalized for that — not, at least, beyond the intrinsic punishment of having to use a hard drive in the Year of Our Lord 2021.

Microsoft has previously said that taking advantage of DirectStorage will require a 1TB NVMe SSD and a GPU compatible with DX12 Ultimate — meaning nothing below Turing for Nvidia or RDNA2 for AMD. It is not clear if either of these requirements will be adjusted. Clearly, there must be a fallback path for machines that cannot enable DirectStorage and there’s no evidence that Microsoft intends to use this as some kind of cudgel to push people to upgrade their storage.

Will PC Users Get Access to the Xbox Series S|X’s Fast Load Speeds?

Right now, the Xbox Series X’s load times are excellent — better than you’d typically see from a PC in an apples-to-apples comparison. This is less true when launching games for the first time, but the Xbox Series X’s Quick Resume puts it in a different league from anything in the PC space. I’ve bounced back into Fallout 4 within five seconds after not having played the game for a month. Not every game supports Quick Resume, but for the titles that do, the feature is game-changing. I find myself reaching for the controller more often when I only have a limited time to play because I can expect to be in-game so quickly.

It’s possible to duplicate this on a PC by keeping multiple titles open simultaneously, but many games don’t run well (or at all) in this mode. I’m not sure exactly how this feature would work on PC, but the option to cache games to disk and then restore them for play would be a great leap for PC gaming. But for Microsoft to bring the feature to PC gaming, it’ll presumably have to make some decisions about what kind of hardware is and isn’t supported. NVMe SSDs are presumably still required, possibly at PCIe 3.0 or 4.0 speeds. We should find out more as Windows 11 moves closer to launch.

As a PC gamer, Quick Resume is a feature I’d like to see on the platform, hopefully with support for at least PCIe 3.0 and 4.0-based NVMe storage. So long as there’s a seamless fallback option for slower devices, it’s a win for everyone.

Now Read:

July 20th 2021, 12:17 pm

Get a Lifetime of VPN Protection and Encrypted Messaging for Only $30

ExtremeTech

Sometimes it feels like our phones ring off the hook – it’s hard to get away from the robocalls, the marketing calls, and all those other unwanted points of contact. At the same time, the internet can be a particularly hairy place where your details can be compromised or stolen if you don’t have the proper protections in place. But it is simple to reduce both kinds of unwanted attention if you know where to look for the right protections and put them in place to preserve your privacy and peace of mind.

Right now, you can get The Lifetime Mobile Privacy & Security Subscription Bundle for the reduced price of $29.99, a great discount of 91 percent off the full purchase price of $349. The bundle contains two key products that will help to keep you safe, secure, and unbothered, both by unwanted phone contact, and while you’re browsing or connected to the internet.

The Hushed private phone line service enables you to easily set up a second, secure phone number, meaning you can keep your primary phone line hidden for purposes like work, family, dating, or even Craigslist. The service includes 1,000 talk minutes and 6,000 SMS messages per year, an allotment that renews automatically, though you can add more minutes or messages at any time. The service, which has a 4.6-star rating out of five on the Apple App Store, can make and receive calls and SMS with numbers in the US or Canada.

Also in the bundle is a lifetime subscription to KeepSolid VPN Unlimited, one of the most popular VPN services, which offers you protection and anonymity online by preventing hackers from stealing personal information like passwords, contact details, or banking information. It also masks your location with access to more than 400 VPN servers in 80 locations around the world, including the US, the UK, Canada, Australia, and Hong Kong.

The Lifetime Mobile Privacy & Security Subscription Bundle is now on sale for $29.99, helping you stay protected and anonymous both online and on your phone.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 19th 2021, 6:30 pm

How to Download and Install Windows 8.1 for Free (Updated)

ExtremeTech

Update 7/19/2021:  Windows 8.1 is long outdated, but technically supported through 2023. If you need to download an ISO to reinstall the full version of the operating system, you can download one from Microsoft here.

If you are still using Microsoft Windows 8.1, we recommend you at least begin considering what OS you will use in the future. It’s mid-2021 and Win 8.1 will shuffle off the mortal coil in January 2023. You can still qualify for a free upgrade to Windows 10 if you own a valid Windows 8.1 license, despite the fact that Microsoft formally ended its upgrade program five years ago. Additionally, Windows 10 has the same system requirements as Windows 8.1, so if you can run the latter, you can also run the former.

Even though Windows 11 has been announced, upgrading to Windows 10 has several advantages. First, Windows 11 may not support your hardware if it’s old enough to still be running Windows 8 or Windows 8.1. Windows 10, on the other hand, was compatible with virtually everything Windows 8.1 would run on. Windows 10 will also be supported through 2025, which gives you an extra two full years of support before needing to figure out an alternative solution.

If you are somehow still stuck on Windows 8.0 and do not want to go to the hassle of a full OS swap, we recommend running Windows Update immediately and downloading all available patches for your system, including the Windows 8.1 update, which will likely be offered to you by default. If you want to download just the Windows 8.1 update files, you can do so here.

Original story below, from 2013:

Windows 8.1 has been released. If you’re using Windows 8, upgrading to Windows 8.1 is both easy and free. If you’re using another operating system (Windows 7, Windows XP, OS X), you can either buy a boxed version ($120 for normal, $200 for Windows 8.1 Pro), or opt for one of the free methods listed below. To download and install Windows 8.1 for free, follow the guide below.

How to download Windows 8.1 for free

If you don’t want to wait for October 17 or 18, there are two options for downloading Windows 8.1: You can obtain a copy (and a license key) from a friend/colleague with an MSDN, TechNet, or DreamSpark (student) subscription, or you can download a Windows 8.1 RTM ISO from your favorite file-sharing website (The Pirate Bay, Mega, etc.)

While we’re not going to write a guide on how to obtain Windows 8.1 RTM from non-official sources, we will at least tell you to check the SHA-1 hash of the ISO that you download to make sure that it’s legitimate. If you hit up the MSDN Subscriber Downloads page, and then click Details under the version that you’ve obtained from elsewhere, you’ll find the SHA-1 hash. If you then use File Checksum Integrity Verifier (FCIV) on the ISO, the hash should match. If it doesn’t, assume the ISO has been compromised and download another. (But do make sure that you’re checking the right SHA-1 hash on the MSDN website; your ISO might be mislabeled).
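If you’d rather not hunt down FCIV, any SHA-1 tool will do. As a minimal sketch, here’s how to compute the hash in Python’s standard library (the ISO filename below is a placeholder):

    # Compute the SHA-1 hash of a downloaded ISO so it can be compared with
    # the value listed on the MSDN page. The filename is a placeholder.
    import hashlib

    def sha1_of_file(path, chunk_size=1 << 20):
        h = hashlib.sha1()
        with open(path, "rb") as f:
            # Read in 1MB chunks so large ISOs don't have to fit in memory.
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    print(sha1_of_file("Windows8.1_RTM.iso"))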

The other easier, and completely legal, option is to download the Windows 8.1 Preview from Microsoft. It’s not as snappy as the final (RTM/GA) build, though, and has quite a few bugs/missing features. Bear in mind that if you go down this road, upgrading to a real version of Windows 8.1 will require a few more steps (discussed in the next section).

How to install Windows 8.1 for free

Once you have the Windows 8.1 ISO on your hard drive, the installation process is painless. Before you begin, you should consider backing up your important files and documents, but it’s not really necessary. You should also ensure that you have plenty of free hard drive space (20GB+).

If you’re already running Windows 8 and you downloaded the RTM ISO from somewhere other than the Windows Store, you can install Windows 8.1 by mounting the downloaded ISO in Explorer by double-clicking it, and then running the installer. If you’re on Windows 7, XP, or (bless your soul) Vista, you’ll need to write the ISO to a USB thumb drive, burn it to a DVD, or mount it using a third-party virtual drive tool, like Magic ISO.

If you already have Windows 8, and you waited for the official release date, installing Windows 8.1 is as simple as visiting the Windows Store and downloading the free update.

In both these cases, the upgrade process should be very smooth, with your apps and settings fully preserved. If you upgrade from Windows 8.1 Preview, however, you will lose your installed apps, unless you first run a cversion.ini removal utility.

Once you’ve installed Windows 8.1, you should check out our extensive collection of Windows 8.1 tips and tricks, and be sure to check our Windows 8.1 review and hands-on impressions to ensure that you’re making the most of all the new features.

Now Read:

Sebastian Anthony wrote the original version of this article. It has since been updated with new information.

July 19th 2021, 6:30 pm

Tesla Rolls Out $200 Monthly Subscription for ‘Full Self-Driving’

ExtremeTech

Tesla has finally rolled out the promised Full Self-Driving (FSD) subscription service. For $199 per month, owners of supported Tesla vehicles will be able to get a taste of the company’s most advanced self-driving features. That might not sound like a great deal, but the only option previously was an up-front $10,000 package. However, not all of the Tesla faithful are happy. Some vague language on Tesla’s part means that vehicles marketed as having full Autopilot capabilities might need an additional $1,500 hardware upgrade to use FSD.

All Tesla vehicles come with basic self-driving Autopilot capabilities, allowing them to remain in the lane, adjust speed based on traffic, and so on. The Full Self-Driving package adds advanced features like navigation on Autopilot, summon, and lane changing on Autopilot. The subscription allows people to try out the more advanced self-driving capabilities without spending so much up-front. However, it could cost more in the long run if you then find you can’t go back to basic Autopilot. 

Despite the “Full Self-Driving” name, this is actually SAE Level 2 autonomous technology. That means the vehicle can drive from point A to point B under normal conditions without human interaction, but the driver must remain aware and be able to take over from the car at any moment. Plenty of other cars could claim the same level of autonomy, but Tesla’s Autopilot does tend to understand road conditions better, and a driver probably won’t have to correct the AI as often. To reach Level 3, however, the car would have to handle all vehicle operations while self-driving is engaged, with the human behind the wheel only stepping in when the computer requests help.

Tesla learned several years ago that the computer hardware in its cars wasn’t powerful enough for FSD mode. It designed the latest “Hardware 3.0” with a custom chip for this purpose. Vehicles sold between 2016 and 2019 may have the older Hardware 2.0 and 2.5, even though they were advertised as having self-driving capabilities. Anyone with an older computer needs to drop $1,500 on a new system before they can even subscribe to the $200 add-on service. 

It’s not like Tesla is a tiny startup anymore — it’s almost 20 years old and has tens of thousands of employees. There’s no reason it should still be confusing customers with esoteric feature names and then hitting them up for more money when it decides to change the definition of words. This is becoming a worrying trend for Tesla. The company recently reneged on solar roof contracts when it decided it wasn’t charging enough for installation. Hey, Elon Musk didn’t become one of the richest people on the planet by giving things away for free.

Now read:

July 19th 2021, 6:30 pm

Nvidia’s ARM-Powered Linux RTX Demo Is a Warning Shot to x86, Microsoft

ExtremeTech

Nvidia has showcased RTX graphics running on ARM CPUs as a way of highlighting the potential performance non-x86 processors can offer to consumers — and to make the point that gaming does not require an x86 microprocessor. There are a few caveats to the presentation (we’ll address them), but make no mistake: This is as much a warning to Intel and AMD as it is an opportunity to crow about Nvidia’s graphics prowess.

Nvidia showed one game, Wolfenstein: Youngblood, and one tech demo (the Bistro, from the Open Research Content Archive, or ORCA). The video Nvidia posted for GDC (embedded below) indicates that the testbed is running an RTX 3060 on a MediaTek Kompanio 1200, which is based on the MT8195 SoC. MediaTek calls these parts “chipsets,” but that word is typically used to refer to motherboard control logic, not an SoC.

MediaTek is generally known as a budget device developer, but the MT8195 looks like it has some heft to it. The SoC is built on TSMC’s 6nm process, which offers higher logic density compared with its 7nm node. TSMC built the 6nm node for companies that wanted to take advantage of its logic density but didn’t want to jump ship for 5nm and its new design rules. The CPU contains four Cortex-A78 cores and four Cortex-A55 cores with an AI processing unit, which MediaTek confusingly refers to as an “APU.” The chip uses quad-channel LPDDR4x 2133, but we don’t know how wide the channels are. This is a modest chip overall, however — the MT8195 only hits a maximum default clock speed of 2.2GHz on the Cortex-A78 cores. The SoC is typically used in Chromebooks.
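We can at least put a rough ceiling on that memory bandwidth if we assume the usual 16-bit LPDDR4x channel width, which MediaTek has not confirmed. A quick sketch:

    # Rough bandwidth ceiling for the MT8195's quad-channel LPDDR4x-2133,
    # assuming the common 16-bit LPDDR4x channel width (an assumption on
    # our part, not a published spec).
    transfers = 2133e6            # transfers per second per channel
    channel_bits = 16             # assumed channel width
    channels = 4
    gb_per_s = transfers * channel_bits * channels / 8 / 1e9
    print(f"~{gb_per_s:.1f} GB/s")   # ~17.1 GB/s if the assumption holds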

We suspect that’s part of Nvidia’s point. The MT8195 is not the kind of CPU we’d expect to see in an actual ARM-powered gaming system. If it can run RTX games well when paired with just an RTX 3060, this implies good things about the products that could appear in the future.

Nvidia ported five specific capabilities to Linux: DLSS (presumably used above to improve performance), RTX Direct Illumination, RTX Global Illumination, Nvidia Real-Time Denoisers, and the RTX Memory Utility. DLSS improves image quality and performance by upscaling lower-resolution output to higher resolutions; Direct Illumination supports dynamic lighting; and Global Illumination calculates ray bounces. Real-time denoising is self-explanatory, and the RTX Memory Utility is said to optimize GPU memory usage.

The fact that the system is running Linux (Arch Linux) throws an interesting wrinkle into all of this. Windows on ARM support exists, but Nvidia chose not to demonstrate a system running Windows. Linux isn’t normally thought of as a gaming OS, but Valve has announced that its upcoming Steam Deck will ship with SteamOS and will run a number of Windows games through Proton, its fork of Wine. The Steam Deck relies on an AMD APU, not an Nvidia solution, but this is still an interesting demonstration of gaming performance.

Nvidia isn’t going for broke with a major product announcement; GDC is a developer conference more than it is a consumer-facing event. The message here is unmistakable, if low-key: Neither x86 nor Windows is intrinsically required for high-performance gaming.

Nvidia has more reason than most companies to hammer that message. If the company’s attempt to buy ARM is approved by regulators, it will have an opportunity to make a play to become the simultaneous provider of leading-edge GPU and CPU solutions for gaming. Intel is making its own play for that title with its Xe HPG lineup of products and AMD already occupies that position if you’re a console gamer. But no single manufacturer has ever claimed a dominant market position in both CPUs and GPUs when it comes to supplying the PC gaming market. Showing off ray tracing in Linux is good for Linux as a gaming OS. Showing it on a MediaTek MT8195 is good for Nvidia. There are no plans to release this version of Youngblood commercially, according to the video, but this probably won’t be the last time we see an ARM CPU + Nvidia GPU sharing space in a laptop or desktop.

Now Read:

July 19th 2021, 6:30 pm

ET Deals: Dell U4320Q UltraSharp 43-Inch 4K Monitor for $869, Fitbit Charge 4 Fitness Tracker for $98

ExtremeTech

Today you can get your hands on a high-end 4K 43-inch Dell monitor with nearly $300 cut off the retail price.

Dell Ultrasharp U4320Q 43-Inch 4K USB-C Monitor ($869.99)

Dell engineered this model to be a large display with tons of desktop real estate. This makes it easier to multitask on multiple windows at the same time. The display also has a high 4K resolution and a USB-C port that doubles as both a video connection and a charging port for compatible notebooks. Right now you can get this display from Dell marked down from $1,159.99 to just $869.99.

Fitbit Charge 4 ($98.95)

The Fitbit Charge 4 features built-in GPS and SpO2 sensors along with Spotify controls and a battery that can last for up to seven days. Currently you can get this helpful fitness device from Amazon marked down from $149.95 to just $98.95.

Dell Vostro 3681 Intel Core i5-10400 Desktop w/ Intel UHD Graphics 630, 8GB DDR4 RAM and 256GB NVMe SSD ($549.00)

This compact desktop PC comes equipped with an Intel Core i5-10400 processor that gives it solid performance. In general it would work well as an office PC or home desktop. The system has a 256GB NVMe SSD as its main storage device that allows it to boot quickly. It also has a DVD-ROM drive and four USB ports on the front to make connecting devices easy. Currently you can get this desktop from Dell marked down from $998.57 to just $549.00.

Intel Core i5-10600K 4.8GHz Six-Core Desktop Processor ($214.99)

Upgrade your desktop with one of Intel’s Core i5-10600K processors. This chip has six CPU cores that can operate at speeds of up to 4.8GHz, giving you exceptional performance. It also supports Hyper-Threading to improve multitasking performance, and it has a low-power iGPU that’s perfect for everyday non-gaming use. Amazon is currently selling these CPUs marked down from $263.00 to just $214.99.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 19th 2021, 6:30 pm

4K vs. UHD: What’s the Difference?

ExtremeTech

Now that 4K displays are mainstream, let’s look at two terms that have become conflated with one another: 4K and UHD, or Ultra HD. TV makers, broadcasters, and tech blogs are using them interchangeably, but they didn’t start as the same thing, and technically still aren’t. From a viewer standpoint, there isn’t a huge difference, and the short answer is that 4K is sticking, and UHD isn’t — though high-quality Blu-ray drives are sometimes marketed as 4K Ultra HD. But there’s a little more to the story.

4K vs. UHD

The simplest way of defining the difference between 4K and UHD is this: 4K is a professional production and cinema standard, while UHD is a consumer display and broadcast standard. To discover how they became so confused, let’s look at the history of the two terms.

The term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4,096 by 2,160 and is exactly four times the previous standard for digital editing and projection (2K, or 2,048 by 1,080). 4K refers to the fact that the horizontal pixel count (4,096) is roughly four thousand. The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG2000, can have a bitrate of up to 250Mbps, and employs 12-bit 4:4:4 color depth. (See: How digital technology is reinventing cinema.)

Ultra High Definition, or UHD for short, is the next step up from what’s called full HD, the official name for the display resolution of 1,920 by 1,080. UHD quadruples that resolution to 3,840 by 2,160. It’s not the same as the 4K resolution described above — and yet almost every TV or monitor you see advertised as 4K is actually UHD. Sure, there are some panels out there that are 4,096 by 2,160, which adds up to an aspect ratio of 1.9:1. But the vast majority are 3,840 by 2,160, for a 1.78:1 aspect ratio.
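The arithmetic behind the competing definitions is easy to verify:

    # Pixel counts and aspect ratios for the resolutions discussed above.
    resolutions = {
        "DCI 4K":  (4096, 2160),   # cinema production/projection standard
        "UHD":     (3840, 2160),   # what consumer "4K" TVs actually use
        "Full HD": (1920, 1080),
    }
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels, {w / h:.2f}:1 aspect ratio")
    # UHD is exactly 4x the pixels of Full HD; DCI 4K is slightly wider at 1.90:1.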

A diagram illustrating the relative image size of 4K vs. 1080p — except that 4K should be labeled UHD or 2160p.

Why Not 2160p?

Now, it’s not as if TV manufacturers aren’t aware of the differences between 4K and UHD. But presumably for marketing reasons, they seem to be sticking with 4K. So as not to conflict with the DCI’s actual 4K standard, some TV makers seem to be using the phrase “4K UHD,” though some are just using “4K.”

To make matters more confusing, UHD is actually split in two. There’s 3,840 by 2,160, and then there’s a big step up, to 7,680 by 4,320, which is also called UHD. It’s reasonable to refer to these two UHD variants as 4K UHD and 8K UHD — but, to be more precise, the 8K UHD spec should probably be renamed QUHD (Quad Ultra HD). (Read: 8K UHDTV: How do you send a 48Gbps TV signal over terrestrial airwaves?)

The real solution would have been to abandon the 4K moniker entirely and instead use the designation 2160p. Display and broadcast resolutions have always referred to resolution in terms of horizontal lines, with the letters “i” and “p” referring to interlacing, which skips every other line, and progressive scan, which doesn’t: 576i (PAL), 480i (NTSC), 576p (DVD), 720p, 1080i, 1080p, and so on.

The reason this didn’t happen is that the number didn’t match the size of the resolution increase. “2160p” implies a doubling of 1080p, while the actual increase in pixel count is a factor of four. The gap between 720p and 1080p is significantly smaller than the gap between 4K and 1080p, though how much you notice the upgrade will depend on the quality of your TV and where you sit. Further complicating matters, there’s the fact that just because a display has a 2160p vertical resolution doesn’t mean it supports a 3,840 or 4,096-pixel horizontal width. You’re only likely to see 2160p listed as a monitor resolution, if at all. Newegg lists three displays as supporting 4K explicitly as opposed to UHD (4,096 by 2,160), but they’ll cost you. Clearly, these sorts of displays are aimed at the professional set.

Now that there are 4K TVs everywhere, it would take a concerted effort from at least one big TV manufacturer to right the ship and abandon the use of 4K in favor of UHD. In all honesty, though, it’s too late. The branding ship has sailed.

Sebastian Anthony wrote the original version of this article. It has since been updated several times with new information.

Now read:

July 19th 2021, 1:44 pm

Steam Deck Ship Dates Slip Into Q3 2022 as Pundits Debate Its Appeal

ExtremeTech

Valve only just opened Steam Deck reservations to the general public over the weekend, and the wait times are already up to six months long in some cases. As of this writing, the Steam Deck 64GB and 256GB models have an estimated availability date of Q2 2022, while the Steam Deck 512GB model is delayed until Q3 2022.

This puts an amusing spin on the debate over whether the Steam Deck will succeed. Valve has clearly tapped into real interest in the PC market. An ordering system loophole spotted by Ars Technica disclosed that roughly 110,000 people pre-ordered one of the two higher-end systems in the first 90 minutes the system was available for pre-order. Figures for the base model were unknown at the time, but tallies suggest at least 10,000 base systems from North America alone.

Just because the calendar currently says Q2 2022 or Q3 2022 doesn’t mean it’ll actually take Valve that long to ship the hardware; there’s a chance the company will place larger component orders in response to this kind of presale interest. It’s also possible that there might be limited opportunity to meet demand, however. The semiconductor shortage has made chips of every sort hard to find. We’re hoping for things to have improved substantially by the end of the year, but long-term forecasts are difficult to come by right now.

Without some absolute figures, we can’t speak to how much demand there really is for the Steam Deck. Some people may have jumped the line to grab a reservation with no intention of actually buying the system. Putting down $5 to make a reservation is not a major commitment, after all.

How Will the Steam Deck Be Perceived?

It’s hard to know how PC gamers will take to a PC handheld. We’ve never had one before — not beyond a handful of niche devices produced by small companies, often as part of a crowdfunded project. Devices like the Onexplayer (Intel Tiger Lake, 1165G7) and the Aya Neo are fundamentally unappealing by comparison. Not only do they cost substantially more, they offer what’s likely to be weaker performance. The SoC inside the Steam Deck will offer four CPU cores and a full eight GPU CUs backed up by LPDDR5-5500 (estimated memory bandwidth is either 44GB/s or 88GB/s depending on bus width).

The Aya Neo comes close to competing with these specs — it offers a 4500U and LPDDR4-4266 — but the 4500U only has six CUs and is based on AMD’s old Vega architecture. If we assume that RDNA2 offers AMD even a 1.15x uplift at the same memory bandwidth, the full eight CUs inside the Steam Deck should be between 1.3x and 1.5x faster than the 4500U. This will ultimately depend on cooling, however — a poorly cooled system won’t be able to maintain full performance no matter what’s under the hood. Interestingly enough, the Aya Neo claims a 47 watt-hour battery and just 144 minutes of maximum playtime. That’s not as good as what Valve has implied. This could represent some marketing shenanigans on Valve’s part, or various power consumption improvements related to AMD’s latest APU.
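For the curious, the numbers behind those estimates work out as follows. The bus width is the open question, and the Aya Neo power figure is simply derived from its own claimed specs:

    # Steam Deck memory bandwidth under the two possible LPDDR5 bus widths,
    # plus the Aya Neo's implied average power draw from its claimed specs.
    mt_per_s = 5500e6                     # LPDDR5-5500 transfer rate
    for bus_bits in (64, 128):
        print(f"{bus_bits}-bit bus: {mt_per_s * bus_bits / 8 / 1e9:.0f} GB/s")
    # -> 44 GB/s or 88 GB/s, matching the estimates above.

    battery_wh, runtime_hours = 47, 144 / 60   # Aya Neo's claimed figures
    print(f"Aya Neo average draw: {battery_wh / runtime_hours:.1f} W")  # ~19.6 W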

The Tiger Lake-powered Onexplayer might be more competitive under the hood, but the developers equipped it with a 2560×1600 native display. That’s going to give the 1165G7’s integrated graphics a headache under the best of circumstances; the Switch targets 720p for a reason and 800p would be a better target.

The Aya Neo starts at $689, though it offers a lot more built-in storage. Steam Deck has just 64GB — a pittance, really — and opens at $400. The lack of storage is a serious drag, but the price is also substantially better. In comparison with what’s already in-market, the lower-end versions of the Steam Deck aren’t badly priced — assuming the console itself is any good.

I don’t think it’s possible to evaluate whether the Steam Deck is going to be a hit with gamers yet, because we don’t know enough about three major factors: ergonomics, battery life (both gaming and not), and overall compatibility between Linux (with some games supported via Proton, Valve’s fork of Wine) and Steam. The Steam Deck supports Windows, however, which may be one way Valve intends to deal with that potential problem.

Some co-workers and friends of mine have opined that the Steam Deck is expensive for what it offers. This, I think, will depend entirely on how well Valve picked its hardware. If AMD’s APU can offer acceptable performance in 1280×800, $400 for a reasonable gaming PC isn’t that expensive at all. Graphically, 1280×800 is ~1 MP, or roughly half the rendering load of 1080p. It shouldn’t be too difficult for a modern APU to offer acceptable performance at that resolution.
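The pixel math bears that out:

    # Rendering load: Steam Deck's 1280x800 panel vs. standard 1080p.
    deck = 1280 * 800      # 1,024,000 pixels, roughly 1 megapixel
    fhd = 1920 * 1080      # 2,073,600 pixels
    print(f"Steam Deck renders {deck / fhd:.0%} of the 1080p pixel count")
    # -> 49%, i.e. roughly half the rendering load of 1080p.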

The big difference between PC gamers and Nintendo customers is that Nintendo customers know the Switch is the only way to play the company’s latest and greatest games, while the Steam Deck is just one method of playing on a PC. The more expensive models are the only reasonable storage options (the 64GB is eMMC, while the others are NVMe), but they also carry much less attractive prices. Valve is also promising that it wants to avoid joystick drift and that it believes customers will be happy with its hardware choices.

My prediction is this: If Valve can offer ergonomics, battery life, and overall compatibility that hit “good enough,” it could have a genuine hit on its hands. I don’t know how many millions of units that would translate into — let’s call it somewhere between 1 and 10 million, probably towards the bottom of that range. No one has ever tried to build a device quite like this at a price point this low, so it’s hard to say how PC gamers will see the value. Anything less than a home run, and we’re looking at a smaller number of sales, though they might still be respectable for a first-gen device.

I don’t have any plans to purchase a Steam Deck in the near future, but there is a version of this device that I could see appealing to people. Under the right circumstances, this system would offer all the performance of a solid laptop with additional storage and peripherals available via a USB-C port extender. The limited amount of expandability offered by a single port means a dock might be necessary to play, charge, and attach peripherals simultaneously, but this is not insurmountable.

I don’t think anyone would want to try to use the Steam Deck as a primary PC, but if what you want is a gaming/handheld optimized PC that can transform into a full productivity system for troubleshooting or emergency use, this thing might be great. I’m not jumping on any hype cycles, but I am curious to see if Valve pulls this off.

Now Read:

July 19th 2021, 11:18 am

Xiaomi Overtakes Apple to Become World’s Number Two Smartphone Maker

ExtremeTech

Chinese smartphone giant Xiaomi has had some ups and downs over the last few years, but it’s been all positive movement recently. The company just announced that the latest Canalys market report shows that it has passed Apple to become the second-largest smartphone maker in the world. That leaves Samsung to battle Xiaomi for the top spot. Based on the company’s impressive growth rate, it may only be a matter of time before it overtakes Samsung as well. 

If you live in the US, you probably don’t spend much time thinking about Xiaomi. It doesn’t sell phones at all in the US or Canada. Even if you import a Xiaomi phone, it won’t have the right cellular bands to work as intended. However, it’s hugely popular in markets like India, Europe, and yes, its native China. 

According to a company-wide message from CEO Lei Jun, Xiaomi has been on a tear. Its sales were up 83 percent year-over-year in the last quarter, which was just enough to edge past Apple, but that’s not because Apple lost ground. On the contrary, Apple’s smartphone business grew one percent. Xiaomi currently sits at 17 percent of the smartphone market to Samsung’s 19 percent. This is roughly where Huawei was a few short years ago. The US trade ban has hobbled that company by cutting off its access to Google services and new chip architectures. 

Xiaomi’s Mi 11 Ultra arguably has the most advanced camera array of any smartphone. There’s even a tiny LCD.

This is even more impressive when you consider the problems the company had as its sales ballooned in the mid-2010s. In 2014, Xiaomi hit the number three spot, but it was overextended and saw its market share drop for the next two years, eventually pushing it out of the top five smartphone OEMs. Since 2016, it has refocused on the products and markets where it does best, and the proof is in the pudding. 

Xiaomi has managed this feat without any presence in the US. While it’s not the biggest smartphone market in the world, it is the most profitable. Even if all Xiaomi cared about was beating Samsung, a modest expansion into the US could push Xiaomi into the top spot, but you only get one chance to do that right. Xiaomi pushes the envelope with hardware, but its take on Android, called MIUI, is more heavily modified than even Samsung’s software. MIUI might turn some US consumers off even if Xiaomi’s hardware is great (it usually is). And with the increasing government scrutiny of Chinese tech firms, there’s no guarantee Xiaomi will want to take the risk. Huawei tried to muscle into the US smartphone market, and look what happened.

Now read:

July 19th 2021, 11:18 am

This Smartphone Gaming Controller is Now on Sale for Over 30 Percent Off

ExtremeTech

Gaming on the go is one of the most effective ways to kill time while you’re traveling, waiting for an appointment, or commuting. Our smartphones are powerful devices capable of running many of the most complex games, but those games can often be hard to control because small screens require miniaturized controls. Attaching a separate controller designed for the job is a no-brainer that makes gaming on the go easier and more fun.

Right now, you can get the Serafim S1: Multi-Platform Gaming Controller On the Go for the reduced price of $45.99, a great discount of 34 percent off the full purchase price of $69, allowing you to have a better gaming experience, wherever you are.

The Serafim S1 is a specially designed gaming device that offers you increased control, comfort, and precision in your gaming endeavors. It physically attaches to your smartphone so that you can use it like the handheld mode of the Nintendo Switch. It connects with Bluetooth for seamless control of your games and apps, and features high adaptability with two joysticks and 12 buttons, including A/B/X/Y and trigger buttons. You’re able to map these buttons to the virtual buttons on your phone’s screen, meaning you can create a totally custom controller for the games you want to play. The Serafim S1 is also compatible with 10 other platforms, including Steam, Windows, Nintendo Switch, and Google Stadia, so you can enjoy using it with your favorite console and desktop games like PUBG Mobile, Injustice 2, and Mario Kart Tour.

The Serafim S1: Multi-Platform Gaming Controller On the Go is now on sale for $45.99, saving you more than $23 off the list price, meaning you can more easily enjoy your gaming, wherever you are.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 16th 2021, 8:32 pm

ET Weekend Deals: Over $400 off Dell Alienware M15 R3 Nvidia RTX 2070 4K OLED Gaming Laptop, Dell Vostro 15 5510 for $699

ExtremeTech

Today you can save over $400 on a new Dell Alienware M15 R3 that ships with a 4K OLED display and has all the hardware you need to run games at blistering speed with high-quality graphics settings.

Dell Alienware M15 R3 Intel Core i7-10750H 15.6-Inch 4K OLED Gaming Laptop w/ Nvidia GeForce RTX 2070, 16GB DDR4 RAM and 2x256GB M.2 PCI-E SSDs in RAID-0 ($1,869.99)

If you want a fast notebook with plenty of performance for running the latest games, you may want to consider Dell’s Alienware M15 R3. This system was literally built for gaming and it features a fast six-core processor, an Nvidia GeForce RTX 2070 GPU, and a high-quality 4K OLED display that has excellent color accuracy and Tobii eye-tracking technology built-in. The system also has two 256GB NVMe SSDs in RAID-0, which gives you plenty of space for everyday use and faster transfer speeds. You can get this system from Dell marked down from $2,279.99 to just $1,869.99.

Dell Vostro 15 5510 Intel Core i7-11370H 15.6-Inch 1080p Laptop w/ Intel Iris Xe Graphics, 8GB DDR4 RAM and 256GB NVMe SSD ($699.00)

This business class system sports an Intel Core i7 processor that excels at running multiple applications at a time. The laptop is also relatively light at just 3.67 pounds. For a limited time you can get this system from Dell marked down from $1,398.57 to just $699.00.

Apple AirPods Pro ($197.00)

Apple’s AirPods Pro utilizes a new design that’s different from the company’s older AirPod earphones. The key new feature that these earphones have is active noise cancellation. Each earphone also uses a custom driver and a high dynamic range amplifier to improve sound quality. You can snag them with a $52 discount from Amazon that drops the price from $249.00 to $197.00.

Roku Streaming Stick+ 4K HDR Media Player ($39.00)

Roku’s Streaming Stick+ media player features 4K and HDR video support and competes directly with Amazon’s Fire TV 4K. It’s able to stream content from numerous sources including Netflix, Hulu, Vudu, and Sling TV. Right now you can get it from Amazon marked down from $49.99 to $39.00.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 16th 2021, 8:32 pm

Steam Deck Faces Compatibility Problems With Major Titles

ExtremeTech

Valve’s new Steam Deck has already attracted a great deal of attention. The company’s servers aren’t responding particularly well today, because they’re getting slammed with preorders. Would-be buyers should keep in mind a caveat that Valve didn’t prominently disclose when it announced the device: Unless something changes, it won’t be compatible with a number of Steam games, including multiple Top 10 titles, that rely on certain kinds of DRM.

Currently, PUBG, Apex Legends, Rust (for online play on standard servers), Destiny 2, and Rainbow Six: Siege are all incompatible with Proton, the Wine fork Valve developed for its own use. Other incompatible games in the Top 100 include DayZ, Dead by Daylight, Smite, Black Desert Online, Paladins, Fall Guys: Ultimate Knockout, Hunt: Showdown, and Conqueror’s Blade. Players should also be aware that while Wine/Proton offers a ranking system of Native, Platinum, Gold, Silver, Bronze, and Borked, the categories aren’t as clear as they could be.

Native titles are games designed for or ported entirely to Linux. These do not require Proton in the first place. Platinum titles are games that are known to work flawlessly and are fully equivalent to their Windows versions. Everything below that comes with some degree of caveat. Rust apparently works fine if the server you are attempting to join has disabled anti-cheat software, but not at all under normal circumstances.

I have not attempted to game in Steam using Linux, but reading through the various bug reports in titles that do work, it’s clear that the experience can still be rough around the edges. Games like Team Fortress 2 don’t support official servers, and play is reported as slower than under Windows.

Valve has stated that the Steam Deck will ship with an updated version of SteamOS and that it is “improving Proton’s game compatibility and support for anti-cheat solutions by working directly with the vendors,” implying that some of these issues may be resolved before launch.

One wonders if these sorts of issues are why Valve will also support Windows on the Steam Deck, even though the device will officially ship with SteamOS. Thanks to the integrated USB-C port, it should be trivial to connect a variety of peripherals to the device, including a thumb drive with a Windows image. Serious gamers will presumably augment their integrated storage with a microSD card, allowing for ample gaming storage even considering Windows 10’s relatively heavy footprint.

There’s reason to think the Steam Deck will work best in the hands of those willing to tinker under the hood. Valve’s battery life numbers specifically refer to the idea of limiting a game to run at 30fps as a means of extending playtime. This is a common trick for boosting battery life, and some of the third-party tools available in Windows (or within Windows drivers) allow gamers to force frame rates in games that don’t support manual adjustment. A willingness to tinker under the hood with specific game settings can also be very useful when attempting to get a title running on lower-end hardware — and an AMD APU is still that, even if it represents significantly more GPU horsepower in a handheld context than we’ve seen any company ship before.

The Steam Deck (a PC Switch, in effect) also seems like it’d have some serious handheld emulator chops, though this doesn’t necessarily require Windows. It wouldn’t surprise us, however, if Valve ultimately leans on the idea that Steam Deck owners can install Windows themselves as part of its overall compatibility discussion. The advantage of a device like this is the long tail it offers for PC gaming, reaching back into the depths of the Steam catalog. The less robust that back catalog is, the less robust the entire system.

Ergonomics, battery life, and compatibility are the three most important areas for Valve to nail in order to deliver a console-like experience. If it can do that, the Steam Deck paired with Windows could be one of the most compatible gaming devices on the planet. It would support Xbox Cloud Gaming, GeForce Now, and Microsoft Game Pass, alongside Sony’s PlayStation Now. Windows would also allow GoG and the Epic Game Store to run on the device.

We’re quite curious to see how SteamOS shapes up in the next 3-4 months and what kind of innovations or improvements Valve can bring to the table for Linux gaming.

Now Read:

July 16th 2021, 5:01 pm

Alder Lake Leak: Intel Core i9-12900K Offers 5.3GHz Boost Clock, Lower PL2

ExtremeTech

Specifications for Intel’s upcoming Alder Lake have leaked, and they imply a CPU that’s at least somewhat more power-efficient than its predecessors without giving up any top-end clock. As always, all leaks should be taken as rumors, not facts.

These leaks, from the Chinese forum Zhihu, suggest the Core i9-12900K will match the 5.3GHz top clock on the Core i9-11900K, with an all-core boost frequency of 5GHz. The CPU’s PL1 is 125W and its PL2 is 228W. The PL1 value is in line with other Intel processors currently in-market, while the PL2 of 228W is down somewhat from the 250W of current Intel chips. This could indicate that Intel has improved top-line power consumption under load compared with its 14nm CPUs, though it’s also possible that Intel tightened the specification for other purposes.


One interesting facet of Intel’s approach to hybrid cores as opposed to what Apple has shown is that Intel appears to be targeting much higher clock speeds for its “little” cores. The 10nm(++) Gracemont cores will top out between 3.4GHz and 3.9GHz. The fact that they clock 15 percent higher on the top-end Intel part could be a clue about where the company expects performance differentiation to show up in its product stack. Pushing up the clock on Gracemont gives Intel a little more headroom on those cores before moving workloads over to Golden Cove.

Apple’s M1 tops out at just 2GHz for its high-efficiency cores. The all-core boost on Golden Cove is 1.28x higher than Gracemont, which means the performance kick from moving a workload to the “big” cores should still be substantial. Golden Cove is expected to be ~1.2x higher IPC than Intel’s previous big-core CPUs, while Gracemont is reportedly targeting Skylake IPC. Actually hitting Skylake IPC would make the performance gap between Gracemont and Golden Cove pretty small. Plenty of people are still using Skylake or earlier CPUs in everyday systems, and Intel’s Kaby Lake, Coffee Lake, and similar CPUs are all effectively Skylake as well, architecturally speaking.
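The clock ratios in that comparison are straightforward to check against the leaked figures:

    # Clock-ratio arithmetic behind the comparison above (leaked figures,
    # so treat the inputs as rumors rather than confirmed specs).
    golden_cove_all_core = 5.0    # GHz, leaked all-core boost
    gracemont_peak = 3.9          # GHz, leaked top-end little-core clock
    m1_efficiency = 2.0           # GHz, Apple's high-efficiency cores
    print(f"Golden Cove / Gracemont: {golden_cove_all_core / gracemont_peak:.2f}x")
    print(f"Gracemont / M1 efficiency: {gracemont_peak / m1_efficiency:.2f}x")
    # -> 1.28x and 1.95x respectively.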

It’s always risky to guess the performance a CPU will offer before launch. Reports have suggested Alder Lake could offer competitive performance with AMD’s Ryzen 5950X. I’m not going to say this cannot happen, but it would require a few things to be true. First, Intel would need to absolutely nail both its IPC improvements and its clock speeds. Alder Lake can’t afford to give back any frequency at all to hit this kind of uplift. Second, it would have to deliver on the idea of Gracemont as a Skylake competitor. Third, it would need to be able to engage both its big and its little cores simultaneously while remaining within its power envelope.

We have seen both Intel and AMD deliver 30-50 percent performance improvements in a single generation before, so I’m not going to call the idea preposterous, but Alder Lake will need to be an exceptional CPU to offer competition against the 16-core Ryzen 5950X. While the two chips are both technically 16-core, Alder Lake’s hybrid model would typically be expected to offer better power efficiency and possibly better responsiveness, but not match AMD in top-end raw performance. Gracemont does not support Hyper-Threading, so its ability to provide an additional kick is likely a bit more limited than what we’d expect from another eight “big” cores.

Alder Lake is expected to debut in late 2021 or early 2022, depending on Intel’s launch schedule. Zen 3 CPUs with large additional pools of L3 cache may arrive in market at roughly the same time. It’s been implied that the additional L3 could offer a ~1.15x performance boost for Zen 3, allowing AMD to deliver roughly one generation’s worth of improvement at the same clock, or potentially offering more power-efficient chips with equivalent performance to the previous generation at lower clocks.

Now Read:

July 16th 2021, 1:59 pm

NASA Identifies Cause of Hubble Failure, Begins Working on a Fix

ExtremeTech

It has now been more than a month since the Hubble Space Telescope reverted to safe mode following an apparent memory failure. NASA attempted all the standard fixes, such as turning it off and on, but the aging observatory has stubbornly refused to come back online. After investigating the issue, NASA now believes the failure stems from a power component on the Science Instrument Command and Data Handling module. The team is working on switching Hubble over to its backup computer system, a process that will take several days.

The issues started on June 13th when Hubble’s payload computer stopped working. This is separate from the spacecraft’s main computer, so NASA has not lost communication. However, the payload computer controls and coordinates the science instruments on Hubble. Without it, Hubble has no reason to continue operations. When the payload computer failed, the main computer put Hubble in safe mode, and little has changed in the last month.

Initially, NASA engineers believed the problem was tied to memory modules on the payload computer. The team tried to swap in one of the four spare memory chips, but that didn’t work. Now, NASA says the issue is most likely at a higher level. The payload computer is part of the Science Instrument Command and Data Handling (SI C&DH) unit, which astronauts replaced during Hubble’s final servicing mission in 2009. The information gathered over the last several weeks has convinced NASA that the issue lies with the Power Control Unit (PCU) on the SI C&DH. Unfortunately, NASA has not been able to reset the PCU via ground commands.

A Hubble SI C&DH unit prior to its installation in the telescope.

There’s still hope, though. Like most expensive hardware destined for space, Hubble was designed with numerous redundancies. The SI C&DH is a two-sided computer, so NASA now plans to switch to the backup side of the system, which has its own components unconnected to the malfunctioning ones. 

The Hubble team has already completed testing and gotten NASA leadership’s approval to switch to the backup system. This process began on July 15th and should be complete in a few days. At that point, we expect Hubble to come back online. NASA has had to rely on backup hardware to keep the telescope operational in recent years, but it’s running out of backups. After 30 years, Hubble is nearing the end of its useful life. It’s already lasted much longer than anyone anticipated, and if we’re lucky, it will share the sky with the James Webb Space Telescope. Webb is scheduled to launch by the end of 2021, unless it gets delayed again, which would not be surprising at this point.

Now read:

July 16th 2021, 10:16 am

Intel in Talks to Buy GlobalFoundries for $30 Billion: Report

ExtremeTech

The Robert N. Boyce Building in Santa Clara, California, is the world headquarters for Intel Corporation. This photo is from Jan. 23, 2019. (Credit: Walden Kirsch/Intel Corporation)

Intel is said to be in talks to buy GlobalFoundries in a deal worth as much as $30 billion. This would be an enormous acquisition for Intel if the rumor is true. As always, such rumors should be taken with a grain of salt.

The Wall Street Journal is reporting this particular rumor, which comes as GlobalFoundries is planning an IPO. The WSJ notes that whatever Intel is considering, “talks don’t appear to include GlobalFoundries itself as a spokeswoman for the company said it isn’t in discussions with Intel.”

The WSJ goes on to explicitly note that GlobalFoundries might continue with its plans for an IPO, which raises the question of whether these talks are happening or not. One way to make a company look good before an initial public offering is to float rumors of a takeover. Intel has pledged to invest tens of billions of dollars in chip manufacturing since Pat Gelsinger returned to the company as CEO, but buying GlobalFoundries would represent a large additional commitment to the idea of Intel Foundry 2.0.

Let’s assume, for the sake of argument, that the rumors of these talks are real. There are several reasons why Intel might buy GlobalFoundries and several why it might not. On the positive side of things, buying GlobalFoundries instantly gives Intel a foundry customer base and a group of employees who are experienced in running a near leading-edge foundry. GlobalFoundries may not be on the leading edge any longer, but the company’s array of 12nm, 14nm, and 28nm product lines are in hot demand given the global semiconductor shortage.

GlobalFoundries’ Dresden facility from the air.

Buying GF also gives Intel immediate access to foundry facilities that are already tuned for manufacturing for others, as opposed to being designed for Intel silicon built according to Intel design rules. There’s nothing stopping Intel from overhauling one of its own existing facilities to serve as a client foundry, but overhauls and new construction both take time.

The acquisition would also give Intel some credibility as a foundry provider rather than a manufacturer for its own products first and foremost. When AMD spun GlobalFoundries off into its own company, the firm acquired Chartered shortly thereafter for much the same reason.

Intel’s Fab 42 in Arizona.

As for why Intel wouldn’t buy GlobalFoundries: Much depends on what kind of customers Intel plans to target and how it plans to target them. Intel could build a client foundry business that caters to high-performance customers on the leading edge who want custom process nodes. It could build a business focused on providing low-cost hardware in bulk on older nodes. This type of activity is more often associated with second-tier foundries like UMC, SMIC, and GlobalFoundries than with TSMC or Samsung, but legacy foundries continue to exist because manufacturing on older nodes can be quite profitable in volume.

It can be challenging to mesh corporate cultures during an acquisition, and any equipment transfers/installations also take time. It may not be clearly cheaper or faster to buy GF than to build a new facility, depending on the type of customers Intel wants to pursue and the benefits it wants to offer.

Buying GF would make Intel and AMD foundry partners for as long as AMD needs to purchase I/O dies and legacy 14nm GPUs. Even this would arguably play to Intel’s advantage. What better way to emphasize the difference between Old Intel and New Intel than to fab for AMD? If Intel wanted to leverage another foundry’s position to challenge TSMC, GlobalFoundries is the company it’d probably pick on the basis of size and relative market strength.

Now Read:

July 16th 2021, 10:16 am

Scientists Detect Isotopes on Exoplanet for the First Time

ExtremeTech

The nearest single star to the Sun hosts an exoplanet at least 3.2 times as massive as Earth — a so-called super-Earth. Data from a worldwide array of telescopes, including ESO’s planet-hunting HARPS instrument, have revealed this frozen, dimly lit world. The newly discovered planet is the second-closest known exoplanet to the Earth and orbits the fastest moving star in the night sky. This image shows an artist’s impression of the exoplanet viewed from space.

Thanks to amazing scientific advancements like NASA’s Kepler Space Telescope and Transiting Exoplanet Survey Satellite (TESS), we have confirmed the existence of thousands of exoplanets. Most of these alien worlds are obscured by the light from their stars, limiting what we can learn about them. For the first time, scientists have managed to detect and analyze the elemental isotopes in an exoplanet’s atmosphere. The team says this could lead to new and better ways of understanding planetary formation. 

The planet in question has a real banger of a name, like most exoplanets. It’s called TYC 8998-760-1 b. Catchy, right? Unlike most exoplanets, this planet is actually visible from Earth. It’s 300 light-years away, but it’s not the distance that makes exoplanets hard to spot — it’s the brightness of their host stars. TYC 8998-760-1 b is something of a perfect storm for visibility. It’s enormous, with about twice the diameter of Jupiter and 14 times its mass, and the planet orbits its star at a great distance. You can see a real photo of TYC 8998-760-1 b along with its “c” companion planet below.

The new study, led by Yapeng Zhang of Leiden University in the Netherlands, leveraged the visibility of this planet to learn about its composition. The team used the European Southern Observatory’s Very Large Telescope in Chile — specifically, an instrument on the VLT called the Spectrograph for Integral Field Observations in the Near Infrared (SINFONI). As a spectrograph (aka a spectrometer), its job is to measure the properties of light. Because TYC 8998-760-1 b is so large and isolated, it reflects a lot of visible starlight — that’s why we can see it. By scanning the exoplanet to see what bits of the optical spectrum are being absorbed, we can determine what’s out there. 

This image, captured by the SPHERE instrument on ESO’s Very Large Telescope, shows the star TYC 8998-760-1 accompanied by two giant exoplanets indicated by arrows, TYC 8998-760-1b (bottom) and TYC 8998-760-1c (top).

According to the researchers, TYC 8998-760-1 b has large deposits of carbon-13, a version of the ubiquitous element with seven neutrons instead of six. The team found about twice as much carbon-13 as you’d normally expect. Those extra neutrons actually tell us something important. TYC 8998-760-1 b is so frigid that it freezes carbon monoxide, and it’s probably always been like that. To have this much carbon-13 in the CO ice, the planet must have formed near its current orbit, rather than starting closer to its star the way all the planets in our solar system did.
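For a sense of scale, here is what “twice as much” means in rough numbers. The roughly 1.1 percent terrestrial baseline is a standard reference value we’re assuming here, not a figure from the study itself:

    # Scale of the carbon-13 enrichment described above. The terrestrial
    # baseline is an assumed reference value, not a number from the study.
    baseline = 0.011            # fraction of carbon that is carbon-13 on Earth
    observed = 2 * baseline     # "about twice as much," per the study
    print(f"Roughly {observed:.1%} of the planet's carbon would be carbon-13")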

This is important data because our solar system is the only one we’ve been able to study up close. Detecting isotopes on exoplanets can therefore offer a window into their formation, which may be entirely different from what happened here.

Now read:

July 16th 2021, 7:46 am

ET Deals: Dell Vostro 15 5502 Intel Core i5-1135G7 Laptop for only $649, $240 Off 25-Inch Alienware

ExtremeTech

Today you can take home one of Dell’s Vostro 15 5502 laptops with an Intel Core i5 processor and a remarkable discount that drops its price by nearly half.

Dell Vostro 15 5502 Intel Core i5-1135G7 1080p 15.6-Inch Laptop w/ Intel Iris Xe Graphics, 8GB DDR4 RAM and 256GB NVMe SSD ($649.00)

This high-quality laptop features a polished aluminum exterior that makes the notebook highly durable while keeping weight down. The system sports a fast Intel Core i5 processor and Intel’s low-power Iris Xe Graphics, which can handle a little light gaming. Typically retailing for $1,170.00, this system is a bit costly, but you can get one now from Dell for just $649.00.

Dell Alienware AW2521HFL 240Hz 1080p 24.5-Inch Gaming Monitor ($269.99)

Enjoy your games to the fullest with a blazing 240Hz monitor! In addition to its extreme refresh rate, this display supports both FreeSync and G-Sync and has a fast 1ms response time, giving you a highly responsive gaming experience. Right now you can get this display from Dell marked down from $509.99 to $269.99.

Dell Vostro 3681 Intel Core i5-10400 Desktop w/ Intel UHD Graphics 630, 8GB DDR4 RAM and 256GB NVMe SSD ($549.00)

This compact desktop PC comes equipped with an Intel Core i5-10400 processor that gives it solid performance. In general, it would work well as an office PC or home desktop. The system has a 256GB NVMe SSD as its main storage device, which allows it to boot quickly. It also has a DVD-ROM drive and four USB ports on the front to make connecting devices easy. Currently you can get this desktop from Dell marked down from $998.57 to just $549.00.

Insignia NS-43DF710NA21 43-inch 4K UHD HDR Fire TV Edition Smart TV ($239.99)

Insignia’s 43-inch 4K TV features HDR support and Amazon’s Fire TV software. This allows the TV to stream content from various sources, and it enables you to control your TV with voice commands using an included voice remote with Alexa support. Normally this model would cost $319.99, but right now you can get it for $239.99 from Amazon.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 15th 2021, 5:44 pm

This Refurbished and Unlocked Lenovo Tab 4 Plus is On Sale for just $160

ExtremeTech

If you need to replace or buy a new laptop or tablet, it’s easy to become overwhelmed by the options and prices on the market. New tech is undeniably expensive, and sometimes it’s prohibitively so. But there are opportunities to upgrade your devices for less if you know where to look. While you won’t be buying something brand new, a quality refurbished unit gives you access to much-improved technology at a lower price point.

Right now, you can get the Lenovo Tab 4 Plus 10.1” 16GB – Black (Refurbished: Wi-Fi + 4G Unlocked) for the reduced price of $159.99, a great discount of 19 percent off the full purchase price of $199. Upgrade your tech at a lower price point with a quality refurbished Lenovo tablet.

The Lenovo Tab 4 Plus is an advanced tablet with a range of useful specifications. It features a crisp and bright 10.1-inch HD display, allowing you to watch videos on Amazon, Hulu, or Netflix in high quality, as well as immersive Dolby Atmos sound for an even better viewing experience. It also has a light and extremely thin design (at 0.28 inches thick) and a built-in 1.4GHz quad-core CPU that gets things done quickly. The screen features a blue-light filter to protect your eyes, and the tablet casing has a robust build to help protect it from drops. With a refurbished rating of “B,” you’ll be getting a quality device that may have light scuffing, scratches, or dents, but that has been tested and is in good working order.

The Lenovo Tab 4 Plus 10.1” 16GB – Black (Refurbished: Wi-Fi + 4G Unlocked) is now on sale for $159.99, a significant saving of $39.01 off the full purchase price of $199.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 15th 2021, 5:44 pm

Valve Launches Steam Deck, a New PC Gaming Handheld Shipping in December

ExtremeTech

Valve has announced its Steam Deck, a gaming handheld device shipping in December. The device is intended as a Switch-style competitor and would draw attention in this context no matter what. Various companies have attempted to launch a console-style PC — Steam Machines were an arguable attempt to fit into this category, along with various vaporware systems of the past like the Phantom — but no top-tier gaming company or major OEM has built this kind of handheld before.

The Steam Deck will start at $399 for a 64GB option, with a 256GB system available for $529 and a 512GB system for $649. Valve notes that the 512GB option ships with “premium anti-glare etched glass,” the fastest storage options, and an “exclusive carrying case.” Valve notes that “there is no in-game difference in frame rates or graphics quality between the three models.”

Steam Deck. Image by Valve.

The handheld is built around an unspecified Zen 2 APU with four cores and eight threads with a 2.4GHz base clock and a 3.5GHz boost clock. This sounds like a custom chip, because Valve also specifies eight GPU compute units clocked between 1GHz and 1.6GHz and power consumption between 4-15W. The Ryzen 7 5700U (1.8GHz base, 4.3GHz boost, 8 CUs @ 1.9GHz) sounds like the closest match for the APU Valve is teasing. If we use this chip for comparison purposes, the Steam Deck would offer faster baseline performance, a lower boost clock, and somewhat lower GPU performance than one of AMD’s upper-end mobile APUs.

The system uses LPDDR5 at 5500MT/s. That’s lower than the current maximum of 6400MT/s, but it offers significantly more bandwidth than the DDR4-3200 we’re used to seeing paired with an APU like this. Memory bandwidth is quite important to APU performance. Assuming the LPDDR5 subsystem is 128 bits wide — Valve doesn’t say — this APU would actually enjoy more memory bandwidth than an equivalent desktop or laptop chip. If the bus is narrower, the chip would have less bandwidth than those solutions; the higher clocks would not fully compensate for the reduced bus width.
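
Here’s a back-of-the-envelope sketch of that bandwidth math in Python. The 128-bit and 64-bit bus widths are illustrative assumptions, since Valve hasn’t published the actual figure:

def bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    # Peak theoretical bandwidth: transfers per second * bytes per transfer.
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(bandwidth_gb_s(5500, 128))  # 88.0 GB/s -- Steam Deck, if the bus is 128 bits
print(bandwidth_gb_s(3200, 128))  # 51.2 GB/s -- typical dual-channel DDR4-3200
print(bandwidth_gb_s(5500, 64))   # 44.0 GB/s -- Steam Deck, if the bus is 64 bits

At 128 bits, the Deck would comfortably outrun a dual-channel DDR4-3200 system; at 64 bits, it would fall behind one despite the faster memory.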

The display is 7 inches with a resolution of 1280×800 and a 16:10 aspect ratio. That’s comparable with the Nintendo Switch, though the aspect ratio is slightly different. The top of the console has two types of shoulder buttons, with two more on the back side. A D-pad, twin thumbsticks, two trackpads, and a standard ABXY button layout are all included. The system also includes 16GB of RAM, a headphone jack, and a microSD slot. The trackpads, according to Valve, offer 55 percent better latency than the Steam Controller and are pressure-sensitive for configurable click strength.

Valve claims you can use the device for 7-8 hours when game streaming or playing light titles, 4 hours in a game like Portal 2, and 5-6 hours if you’re willing to play Portal 2 at a capped 30fps. The Steam Deck can reportedly go to sleep and resume quickly, allowing you to pick a game session back up where you left off.

The Steam Deck isn’t a Windows device by default. Valve doesn’t disclose details, but it’s running a tweaked version of SteamOS and uses Proton, “a compatibility layer that makes it possible to run your games without any porting work needed from developers.” Proton is a fork of Wine; Valve has released several versions of the software over the last three years. Enthusiasts who want to will be able to dive into the Linux distro underneath the hood, and the device will support peripherals. Hooking a keyboard, mouse, and monitor to the same Steam Deck may require a port multiplier, though, as we only see one USB-C port on the device. Valve notes that the external dock will offer more ports and that a powered USB-C hub can be used instead. If you want Windows on the device, you’ll be able to install it.

Is This Going to Be a Good Investment?

We’re not big fans of pre-orders at ET, especially when a company is launching a first-generation product and has no prior history in the space. Valve has built some high-quality peripherals in the past, but it hasn’t built anything like this before. You can plunk down cash for a reservation on a system if you want, but we recommend waiting for reviews.

Historically, attempts to build PC handhelds have suffered from limited battery life and high power consumption. Valve may have engineered solutions to this problem by working with AMD to bring a custom APU to market, but it’s not a bad idea to wait and see how the hardware compares first. Valve’s $400 system arguably has too little storage to work effectively as a Steam handheld in the modern era, so we’d like to see more details on whether games can be played from microSD. Even if they can be, anyone looking at the $400 handheld should keep the need for more storage in mind when pricing the system. With its specs and capabilities, the Steam Deck could be a genuinely interesting product when it launches this December.

Pre-orders open tomorrow, provided your Steam account made a purchase on Steam prior to June 2021. A $5 deposit is required, and only one console can be reserved per account.

Now Read:

July 15th 2021, 4:58 pm

Russia to Build 8-Core RISC-V CPUs for Laptops, Government Systems

ExtremeTech

RISC-V has attracted a great deal of interest across the computing industry for its open-source instruction set architecture and rapidly evolving ecosystem. There are reports that Russia plans to build new RISC-V-based CPUs by 2025. The new chips would be built under a three-way agreement between Rostec (a technology investment firm), server company Yadro, and the design firm Syntacore. The total cost of the project is estimated at ~$400 million USD, or approximately 30 billion rubles.

The goal is to build 60,000 eight-core CPUs clocked at 2GHz for government systems, Anandtech reports. The project budget will be provided by anchor customers and the Russian government, with the latter kicking in one-third of the total cost. Syntacore already builds RISC-V CPU clusters, so the 2025 chip would seem to be an evolution of products the company has already brought to market.

The highest-end CPU core Syntacore currently sells is the SCR-7. Details on the chip are limited, but the diagram gives us a few hints. We’re looking at a dual-issue chip with a dedicated L1 cache of 16KB-128KB, split evenly between instructions and data. There’s an L2 cache of between 128KB and 2MB. This appears to be shared across the entire CPU cluster, which on an eight-core chip would work out to between 16KB and 256KB of L2 per core. The pipeline is listed as 10-12 stages, and the core is obviously still intended for lightweight workloads, though Linux support is already enabled.
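
That per-core arithmetic is easy to sanity-check. A minimal sketch, assuming the full quoted L2 range is divided evenly across an eight-core cluster:

CORES = 8
for shared_l2_kb in (128, 2048):  # Syntacore's quoted 128KB-2MB range
    print(f"{shared_l2_kb}KB shared L2 -> {shared_l2_kb // CORES}KB per core")
# 128KB shared L2 -> 16KB per core
# 2048KB shared L2 -> 256KB per core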

RISC-V made headlines last month after rumors broke that Intel might buy SiFive. SiFive designs the highest-performing RISC-V CPU cores currently available, and while those cores are no match for what Intel, AMD, and high-end smartphone vendors ship right now, performance has improved rapidly over the past few years. At present, Intel and SiFive have announced plans to work together, but not a buyout.

Yesterday, computer scientists presenting at the Fifth Workshop on Computer Architecture Research demoed support for OpenCL running on RISC-V, with the goal of allowing new scientific applications to run on the architecture. RISC-V has also gotten some attention in China, where it has reportedly been seen as a way to safeguard against the IP controls the United States leverages through the Entity List. Organizations and individuals on the Entity List “are subject to specific license requirements for the export, re-export and/or transfer (in-country) of specified items.” China has also showcased its XiangShan RISC-V CPU, which is said to be built on 28nm and hits clock speeds of 1.3GHz. China wants the CPU to compete against the Cortex-A76, but that’s going to be an iterative process spanning several years. It will also require access to more advanced process nodes than 28nm.

Anandtech believes the new Russian core will be built at GlobalFoundries on that company’s 12nm process. An 8-core, 2GHz CPU by 2025 is not a tremendous competitor for what Intel, AMD, and ARM will be shipping by then, but it takes time to build a homegrown microprocessor. Russia may want to decrease the risk that its own access to semiconductors could be impeded by future clashes with the United States. Investing in RISC-V is an obvious way to mitigate that risk.

Feature image: A RISC-V core prototype. Image by DCoetzee, Wikipedia, CC0

Now Read:

July 15th 2021, 3:58 pm

A Piece of Paper Might Fix JoyCon Drift

ExtremeTech

JoyCon drift has been a problem for the Switch ever since the device launched. While it’s done absolutely nothing to dent the hardware’s meteoric popularity, it’s been an annoyance and an unwanted additional cost for an unknown number of players since the system debuted. The problem is magnified on the Switch Lite because that console can’t swap JoyCons for a new set. Nintendo has faced multiple class-action lawsuits worldwide, though the company has steadfastly refused to fix the problem. The upcoming OLED-based Switch, for example, is already known to use the same JoyCon hardware that the 2017-era system does.

JoyCon drift is the name for a specific behavior in which the controller relays input without the end-user intending it. In severe cases, a JoyCon might turn perpetually in one direction or another, forcing the player to keep pressure on the stick to prevent or slow it. This can range from an annoyance to a game-breaking problem depending on the title and the severity of the issue.

There have been multiple investigations into what causes JoyCon drift and how to repair it. Most fixes focus on cleaning out the joystick to remove grime and dirt that can build up on the contacts that sense movement. This can help fix the problem, but the issue can return after a period of time. Replacing the thumbsticks is also known to work, though this is more difficult than just cleaning out the mechanism.

According to YouTuber VK, the problem has an additional cause: Over time, the clamps holding the joystick components together loosen. This allows gaps to form between the metal contacts inside the controller and the graphite pads those contacts, erm, make contact with. According to his channel, squeezing the middle of a JoyCon controller can restore the connection and temporarily stop JoyCon drift. For a more permanent solution, he recommends inserting a 1mm-thick piece of paper — something like a business card. The thicker stock probably helps the repair hold for longer.

According to VK, this fix can work even if there’s dirt and dust built up between the contacts and the pads, meaning it isn’t necessary to keep cleaning out the JoyCon to keep drift at bay. He claims Nintendo could fix the problem by adding a single screw to the existing design. According to him, inserting a 1mm thick piece of paper stock has kept his own Switch drift-free for over two months.

We’re not going to claim that this definitively fixes the Switch on the basis of any single repair method, but it’s not impossible that it could. If it does, it raises serious questions about what kind of investigation Nintendo actually conducted into the problem and how seriously it has attempted to fix it. A paper shim and possibly a single additional screw to fix a problem long-term are not major additions or serious cost adders. Even if paper winds up being subpar and a poor long-term choice, it’s hard to imagine there’s no thin plastic insert that could accomplish the same task. Nintendo launched the Switch over four years ago and has since followed it up with an improved model with a better battery, the Switch Lite, and the OLED Switch that’ll launch later this year. There’s been more than enough time for someone to notice that 10 cents of additional hardware could permanently solve a problem.

Nintendo has offered free repairs to North American customers since the Switch’s JoyCon problem became public, but it’s never fixed the issue. If it turns out the flaw can be corrected this easily, it’s going to raise questions as to whether the company ever even bothered to try. That’s one reason we’re cautious about concluding that this repair will definitively fix the issue. If it does turn out to be this simple, Nintendo has some explaining to do.

Now Read:

July 15th 2021, 12:31 pm

The Worst CPUs Ever Made

ExtremeTech

After we covered the worst storage mediums ever, it’s now time to revisit some of the worst CPUs ever built. To make it onto this esteemed list, a CPU needed to be fundamentally broken, as opposed to simply being poorly positioned or slower than expected. The annals of history are already stuffed with mediocre products that didn’t quite meet expectations but weren’t truly bad.

Note: Plenty of people will bring up the Pentium FDIV bug here, but the reason we didn’t include it is simple: Despite being an enormous marketing failure for Intel and a huge expense, the actual bug was tiny. It impacted almost no one outside scientific computing, and the scale and scope of the problem in technical terms were never estimated to amount to much. The incident is recalled today more for the disastrous way Intel handled it than for any overarching problem in the Pentium microarchitecture.

We also include a few dishonorable mentions. These chips may not be the worst of the worst, but they ran into serious problems or failed to address key market segments.

Intel Itanium

Intel’s Itanium was a radical attempt to push hardware complexity into software optimizations. All of the work to determine which instructions to execute in parallel was handled by the compiler before the CPU ran a byte of code. Analysts predicted Itanium would conquer the world. It didn’t. Compilers were unable to extract necessary performance and the chip was radically incompatible with everything that had come before it. Once expected to replace x86 entirely and change the world, Itanium limped along for years with a niche market and precious little else.

Itanium’s failure was particularly egregious because it represented the death of Intel’s entire 64-bit strategy (at the time). Intel had originally planned to move the entire market to IA64 rather than extend x86. AMD’s x86-64 (AMD64) proved quite popular, partly because Intel had no luck bringing a competitive Itanium to market. Not many CPUs can claim to have failed so egregiously they killed their manufacturer’s plans for an entire instruction set.

Intel Pentium 4 (Prescott)

Prescott doubled down on the P4’s already-long pipeline, extending it to more than 30 stages, while Intel simultaneously shrank the chip to a 90nm process. This was a mistake. The new chip was crippled by pipeline stalls that even its new branch prediction unit couldn’t prevent, and parasitic leakage drove high power consumption, preventing the chip from hitting the clocks it needed to be successful. Prescott and its dual-core sibling, Smithfield, are the weakest desktop products Intel ever fielded relative to its competition at the time. Intel set revenue records with the chip, but its reputation took a beating.

AMD Bulldozer

AMD’s Bulldozer was supposed to steal a march on Intel by cleverly sharing certain chip capabilities to improve efficiency and reduce die size. AMD wanted a smaller core with higher clocks to offset any penalties related to the shared design. What it got was a disaster. Bulldozer couldn’t hit its target clocks, drew too much power, and delivered a fraction of the performance it needed to. It’s rare that a CPU is so bad it nearly kills the company that invented it. Bulldozer nearly did. AMD did penance for Bulldozer by continuing to use it: Despite the core’s flaws, it formed the backbone of AMD’s CPU family from late 2011 through early 2017.

Cyrix 6×86

Cyrix was one of the x86 manufacturers that didn’t survive the late 1990s (VIA now holds its x86 license). Chips like the 6×86 were a major part of the reason why. Cyrix has the dubious distinction of being the reason some games and applications carried compatibility warnings. The 6×86 was significantly faster than Intel’s Pentium in integer code, but its FPU was abysmal, and its chips weren’t particularly stable when paired with Socket 7 motherboards. If you were a gamer in the late 1990s, you wanted an Intel CPU but could settle for AMD. The 6×86 was one of the terrible “everybody else” chips you didn’t want in your Christmas stocking.

The 6×86 failed because it couldn’t differentiate itself from Intel or AMD in a way that made sense or gave Cyrix an effective niche of its own. The company tried to develop a unique product and wound up earning itself a second entry on this list instead.

Cyrix MediaGX

The MediaGX was the first attempt to build an integrated SoC processor for desktops, with graphics, CPU, PCI bus, and memory controller all on one die. Unfortunately, this happened in 1997, which means all those components were really terrible. Motherboard compatibility was incredibly limited, the underlying CPU architecture (Cyrix 5×86) was equivalent to Intel’s 80486, and the CPU couldn’t connect to an off-die L2 cache (the only kind of L2 cache there was, back then). Chips like the Cyrix 6×86 could at least claim to compete with Intel in business applications. The MediaGX couldn’t compete with a dead manatee.

The entry for the MediaGX on Wikipedia includes the sentence “Whether this processor belongs in the fourth or fifth generation of x86 processors can be considered a matter of debate.” The fifth generation of x86 CPUs is the Pentium’s generation, while the fourth generation refers to 80486 CPUs. The MediaGX shipped in 1997 with a CPU core stuck somewhere between 1989 and 1992, at a time when people really did replace their PCs every 2-3 years if they wanted to stay on the cutting edge. The entry also notes, “The graphics, sound, and PCI bus ran at the same speed as the processor clock also due to tight integration. This made the processor appear much slower than its actual rated speed.” When your 486-class CPU is being choked by its own PCI bus, you know you’ve got a problem.

Texas Instruments TMS9900

The TMS9900 is a noteworthy failure for one enormous reason: When IBM was looking for a chip to power the original IBM PC, it had two basic choices that could hit its ship date — the TMS9900 and the Intel 8086/8088 (the Motorola 68K was under development but wasn’t ready in time). The TMS9900 only had 16 bits of address space, while the 8086 had 20. That made the difference between addressing just 64KB of RAM and a full 1MB. TI also neglected to develop a 16-bit peripheral chip, which left the CPU stuck with performance-crippling 8-bit peripherals. The TMS9900 also had no on-chip general-purpose registers; its 16 16-bit registers were all stored in main memory. TI had trouble securing partners for second-sourcing, and when IBM had to pick, it picked Intel. The rest is history.
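
The address-space gap falls straight out of the exponent arithmetic. A quick illustration:

# Addressable memory doubles with every extra address bit.
for name, bits in (("TMS9900", 16), ("Intel 8086", 20)):
    print(f"{name}: {2**bits:,} bytes ({2**bits // 1024}KB addressable)")
# TMS9900: 65,536 bytes (64KB addressable)
# Intel 8086: 1,048,576 bytes (1024KB addressable)

Those four extra address bits gave the 8086 sixteen times the addressable memory.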

Dishonorable Mention: Qualcomm Snapdragon 810

The Snapdragon 810 was Qualcomm’s first attempt to build a big.LITTLE CPU and was based on TSMC’s short-lived 20nm process. The SoC was easily Qualcomm’s least-loved high-end chip in recent memory — Samsung skipped it altogether, and other companies ran into serious problems with the device. Qualcomm claimed that the issues with the chip were caused by poor OEM power management, but whether the problem was related to TSMC’s 20nm process, problems with Qualcomm’s implementation, or OEM optimization, the result was the same: a hot-running chip that won precious few top-tier designs and is missed by no one.

Dishonorable Mention: IBM PowerPC G5

Apple’s partnership with IBM on the PowerPC 970 (marketed by Apple as the G5) was supposed to be a turning point for the company. When it announced the first G5 products, Apple promised to launch a 3GHz chip within a year. But IBM failed to deliver components that could hit those clocks at reasonable power consumption, and the G5’s high power draw made it incapable of replacing the G4 in laptops. Apple was forced to move to Intel and x86 in order to field competitive laptops and improve its desktop performance. The G5 wasn’t a terrible CPU, but IBM wasn’t able to evolve the chip to compete with Intel.

Dishonorable Mention: Pentium III 1.13GHz

The Coppermine Pentium III was a fine architecture. But during the race to 1GHz against AMD, Intel was desperate to maintain a performance lead, even as shipments of its high-end systems slipped further and further away (at one point, AMD was estimated to have a 12:1 advantage over Intel when it came to actually shipping 1GHz systems). In a final bid to regain the performance crown, Intel tried to push the 180nm Coppermine P3 up to 1.13GHz. It failed. The chips were fundamentally unstable, and Intel recalled the entire batch.

Dishonorable Mention: Cell Broadband Engine

We’ll take some heat for this one, but we’d toss the Cell Broadband Engine on this pile as well. Cell is an excellent example of how a chip can be phenomenally good in theory, yet nearly impossible to leverage in practice. Sony may have used it as the general processor for the PS3, but Cell was far better at multimedia and vector processing than it ever was at general purpose workloads (its design dates to a time when Sony expected to handle both CPU and GPU workloads with the same processor architecture). It’s quite difficult to multi-thread the CPU to take advantage of its SPEs (Synergistic Processing Elements) and it bears little resemblance to any other architecture.

What’s the Worst CPU Ever?

It’s surprisingly hard to pick an absolute worst CPU. Is it more important that a CPU utterly failed to meet overinflated expectations (Itanium) or that the CPU core nearly killed the company that built it (Bulldozer)? Do we judge Prescott on its heat and performance (bad, in both cases) or on the revenue records Intel smashed with it?

Evaluated in the broadest possible meaning of “worst,” I think one chip ultimately stands feet and ankles below the rest: the Cyrix MediaGX. It is impossible not to admire the forward-thinking ideas behind this CPU. Cyrix was the first company to build what we would now call an SoC, with PCI, audio, video, and RAM controller all on the same chip. More than 10 years before Intel or AMD would ship their own CPU+GPU configurations, Cyrix was out there, blazing a trail.

It’s unfortunate that the trail led straight into what the locals affectionately call “Alligator Swamp.”

Designed for the extreme budget market, the Cyrix MediaGX appears to have disappointed just about everyone who ever came in contact with it. Performance was poor — a Cyrix MediaGX 333 had 95 percent of the integer performance and 76 percent of the FPU performance of a Pentium 233 MMX, a CPU running at just 70 percent of its clock speed. The integrated graphics had no video memory at all. There was no option to add an off-die L2 cache. If you found this under your tree, you cried. If you had to use this for work, you cried. If you needed to use a Cyrix MediaGX laptop to upload a program to sabotage the alien ship that was going to destroy all of humanity, you died.
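
Normalized per clock, those quoted numbers look even worse. A quick sketch of the arithmetic, using only the figures above:

MEDIAGX_MHZ, PENTIUM_MHZ = 333, 233
for workload, rel_perf in (("integer", 0.95), ("FPU", 0.76)):
    # rel_perf is MediaGX 333 performance as a fraction of the Pentium 233 MMX.
    per_clock = rel_perf * PENTIUM_MHZ / MEDIAGX_MHZ
    print(f"{workload}: ~{per_clock:.0%} of the Pentium's per-clock performance")
# integer: ~66% of the Pentium's per-clock performance
# FPU: ~53% of the Pentium's per-clock performance

In other words, clock for clock, the MediaGX delivered roughly two-thirds of the Pentium’s integer throughput and barely half its FPU throughput.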

All in all, not a great chip.

Now Read:

July 15th 2021, 12:31 pm

PC Gaming Hardware Sales May Be Poised to Explode

ExtremeTech

Ever since the pandemic and cryptocurrency markets teamed up to ambush Ampere in a dark alley, there have been concerns about what another long period of unaffordable hardware would do to the PC gaming industry. A new report from Jon Peddie Research echoes these concerns but also assuages them, predicting that the industry is on the cusp of a tremendous upgrade cycle as stymied gamers finally buy into hardware that’s been priced out of reach for nearly a year.

As JPR notes, the recent boom in sales has been great for the resellers who captured the majority of profits, but less so for the gamers who actually wanted to buy hardware. They argue that this is finally changing, thanks to the collapse of cryptocurrency mining in China and improved hardware availability.

“PC gaming hardware companies are reviewing their just-in-time strategies and beginning to adopt just-in-case inventory levels,” says Ted Pollak, JPR’s senior game tech industry analyst. “As a result, we expect inventory and sales of high-end products to grow dramatically in the coming years. Additionally, ultrawide and 4K+ displays are now available at big box stores and online for historically low prices. This helps drive CPU and GPU demand, full builds, and accessory sales to gamers.”

JPR’s prediction here runs counter to data we’re seeing in the wider PC market. According to recent reports from IDC, Gartner, and Bloomberg, there are reasons to think the PC bull market as a whole is finally cooling off. Bloomberg notes that electronics spending growth had jumped by 40 percent in March of 2021 compared with the previous year. For June, the equivalent figure is just 20 percent. IDC reports that demand for PCs in Q2 2021 was up 13.1 percent compared with the year-ago period — but that this has sharply tapered compared with the 55.9 percent growth rate seen in Q1 2021 and the 25.8 percent growth rate of Q4 2020.

PC sales in Q2 2020 were only up 2.8 percent over Q2 2019, so this is not a case where last year’s surge continued otherwise excellent growth. Desktop growth actually outpaced laptops last quarter due to component shortages. Desktop sales fell badly during the pandemic relative to notebooks, so this is an interesting reversal. There’s evidence that Chromebooks continue to account for a large percentage of the total: Gartner, which does not count Chromebook sales, reported a year-on-year sales increase of 4.6 percent, while IDC, which does include Chromebooks in the PC market, reported an increase of 13.2 percent.

Gartner and IDC are both calling for caution regarding the future of the PC market, after predicting earlier this year that robust demand would continue for the foreseeable future. JPR’s prediction of high gaming PC demand doesn’t necessarily contradict what the other research firms are forecasting, however. Gamers have always been a specific, niche market, and many never got a chance to upgrade to Ampere when it was new. It’s possible we’ll see PC sales decline back toward the historical average while gamers kick off a robust upgrade cycle over the next 12-24 months. This depends on GPU prices remaining sane for that length of time — something they haven’t actually been very good at these past five years.

Now Read:

July 15th 2021, 12:31 pm

Astronomers Successfully Measure the Motion of 66 Nearby Galaxies

ExtremeTech

The heavens appear to move throughout the year, but that’s just because Earth is moving. The real or “proper motion” of stars and galaxies was beyond our ability to measure until fairly recently, but bigger, more powerful telescopes have helped paint a picture of an evolving universe. The European Space Agency’s Gaia space telescope has been particularly valuable in tracking the motion of stars inside our galaxy. But what about other galaxies? A team of astronomers has used Gaia to get insight into our celestial neighbors, determining the proper motion of a whopping 66 nearby galaxies. 

Our galaxy, the Milky Way, is one of two large galaxies in the Local Group. The other is Andromeda, whose proper motion is well understood. Unlike most objects in the universe, it’s moving toward us. In a few billion years, Andromeda will collide with the Milky Way to form a giant elliptical galaxy. Until now, we didn’t have the ability to measure the proper motion of the smaller galaxies in the Local Group. A new study from a team of Spanish researchers used Gaia to scan 74 dwarf galaxies in and around the Local Group. For most of them, the team was able to determine reliable proper motions, reports Bad Astronomy.

This wasn’t as simple as just pointing Gaia at different galaxies and taking a picture. Gaia has revolutionized our understanding of stellar motion in the Milky Way because it has the ability to monitor the sky with high precision over long periods of time. The mission has a planned duration of five years, but the 34 months of data we have now has revealed proper motion for more than a billion stars. It took a long time to tease the necessary details out of the Gaia data for the more distant dwarf galaxies. For example, astronomers had to assess the color spectrum of stars to see if they matched the expected spectrum of the target galaxies. 
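
For a sense of why this measurement is so hard, the standard relation between proper motion and tangential velocity is v_t (km/s) = 4.74 × μ (arcsec/yr) × d (parsecs). The numbers below are hypothetical, chosen only to show the scale involved:

def tangential_velocity_km_s(mu_arcsec_per_yr, distance_pc):
    # v_t = 4.74 * proper motion * distance -- the standard conversion.
    return 4.74 * mu_arcsec_per_yr * distance_pc

# A hypothetical dwarf galaxy ~100,000 parsecs away, moving at ~95 km/s,
# shows a proper motion of only ~0.0002 arcseconds per year.
print(tangential_velocity_km_s(0.0002, 100_000))  # 94.8 km/s

Angular drifts that small are part of why it took 34 months of accumulated Gaia data to pin these motions down.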

The approximate locations of Local Group galaxies, with the Milky Way in the center.

The results are a watershed moment for our understanding of the Local Group. We’ve learned that six of these dwarf galaxies are actually gravitationally bound to the Large Magellanic Cloud, which itself has about ten percent of the mass of the Milky Way. The data has also helped to calculate the orbits of the dwarf galaxies. Since the mass of the Milky Way is still uncertain, the team used 900 billion and 1.6 trillion solar masses as educated guesses to estimate the future motion of the Local Group.

They found some of those 66 galaxies are just passing through—they’ll exit the Local Group eventually, so perhaps we shouldn’t consider them part of it. Some others are also likely to fall into the Milky Way. The smaller orbiting dwarf galaxies are more likely to land near the center of our galaxy, increasing the likelihood they’ll be ripped apart. This is a lot of new info about our galaxy’s nearest neighbors, and Gaia isn’t even done with its initial survey yet.

Now read:

July 15th 2021, 12:31 pm

DRAM Price Increases in Q3 May Be Smaller Than Expected

ExtremeTech

Good news for anyone planning to buy DRAM in the next few months: While DRAM prices spiked in Q2 2021 thanks to high demand and limited supply, the bulk of the price increase is thought to be behind us. Trendforce still expects prices to rise in Q3 2021, but the jump should be much smaller than Q2’s.

Trendforce now predicts that DRAM contract prices will grow by 3-8 percent in Q3, down from 18-23 percent in Q2 2021. The company also believes that DRAM supply will continue to grow in Q3 and Q4 of this year, which should limit further price increases.
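
To see what those ranges mean in dollar terms, compound the two quarters against a notional $100 Q1 contract price (the $100 baseline is illustrative, not a Trendforce figure):

base_price = 100.0  # notional Q1 contract price
for q2_rise, q3_rise in ((0.18, 0.03), (0.23, 0.08)):
    print(round(base_price * (1 + q2_rise) * (1 + q3_rise), 2))
# 121.54 -- low end of both ranges
# 132.84 -- high end of both ranges

Even at the low end, DRAM that cost $100 on contract at the start of the year would run roughly $122 by the end of Q3.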

While mobile demand has been aggressive, PC manufacturers are holding a consistent 8-10 weeks of inventory. Trendforce notes that this is fairly high and believes manufacturers will be relatively conservative in their DRAM purchases as a result. The reason we’re still seeing an overall supply crunch is that server DRAM demand is rising, but not enough to offset the relatively high levels of inventory also being carried by server manufacturers. As a result, server DRAM prices are expected to grow by 5-10 percent in Q3, in line with PC DRAM estimates.

The mobile DRAM market (which appears to mostly mean smartphones) is in different shape. The company notes that mobile DRAM pricing will “defy market realities and increase by 5-15 percent QoQ, with potential risks of high price and low demand.”

GDDR6 prices are still expected to increase 8-13 percent next quarter, in line with reports from early June. Trendforce notes that demand for GDDR6 still vastly outstrips supply and that fully 90 percent of graphics DRAM products have migrated to the new memory standard. Memory manufacturers are expected to prioritize server DRAM requirements first and foremost, so graphics DRAM prices will increase as well.

News of modest GPU VRAM price increases will be met with rejoicing — or at least something near enough to it — provided GPU prices come down overall. The price of new GPUs has been as high as 300 percent above MSRP in recent months. China’s crackdown on cryptocurrencies has sent demand for new GPUs plunging and we’re hoping to see evidence of cheaper GPU prices in weeks to come. An 8-13 percent increase in VRAM price is literally a small price to pay compared with the sky-high cost of new cards since last fall.

The implication of these trends, in aggregate, is that PC hardware prices should start stabilizing over the next six months. We’re still in a situation where demand is likely to outstrip supply, but no one is forecasting additional demand surges at the moment. Chip production is slowly increasing quarter-on-quarter as manufacturers bring more capacity online, and this will help buffer the seasonal increases in demand that normally occur in the back half of the year. Trendforce does not mention DDR5 at all in its statements, but we should see early DDR5 shipments by the end of the year. This may not have much impact on the total market, but it may take some pressure off the demand for DDR4, depending on just how popular these systems are.

Feature image by Pete, Flickr, CC BY-SA 2.0

Now Read:

July 14th 2021, 5:01 pm

Learn Unity and Unreal Game Development for $30 with this Expert Led Training

ExtremeTech

The game development industry is growing rapidly in both size and sophistication, and it’s gaining more respect as a storytelling medium. The diverse roles and massive teams that must cooperate to create even simple titles mean each released game is the product of a Herculean effort to which hundreds or even thousands of people have contributed – all for the purpose of stellar entertainment and storytelling. The industry is expanding across sectors such as console play and mobile game development, so it’s a canny move to switch careers or develop skills that translate into game development.

The Unreal & Unity Game Development for Beginners Bundle is a great place to start with learning game development, and the e-learning course bundle is currently on sale for the reduced price of $30 – a great saving when you consider that it offers a comprehensive grounding in game development that can kickstart your career with an educational boost.

The bundle includes six essential game development courses split into 284 easily digestible lessons spanning 50 hours of practical content aimed at taking you from beginner to knowledgeable practitioner. Start with the Unity Android 2021 course, which gives you a grounding in game development for Google’s mobile operating system as you build seven Android games with Unity and C#. Continue learning through practical application by completing many mini-tasks in Unity, and stretch your skills further by building an action RPG in Unreal Engine – the same game engine used in popular games such as Fortnite and Star Wars Jedi: Fallen Order.

The courses are taught by highly rated instructors from Zenva Academy (4.4 stars out of five), an e-learning platform used by over 500,000 learners, and by Raja Biswas (4.4 stars out of five), the founder of Charger Games and a passionate instructor helping others to publish their own games.

The Unreal & Unity Game Development for Beginners Bundle is on sale for the reduced price of $30, meaning each included course costs just $5 – fantastic value for a comprehensive grounding in Unreal and Unity game development.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 14th 2021, 5:01 pm

ET Deals: $200 Off Apple MacBook Pro M1 Laptop, Dell 27-Inch 4K Monitor for $299

ExtremeTech

Apple’s MacBook Pro laptops equipped with the company’s revolutionary M1 chip offer excellent performance for a wide range of tasks, and you can get one now with a $200 discount.

Apple MacBook Pro M1 Chip 13.3-Inch Laptop w/ 8GB RAM and 256GB SSD ($1,099.99)

Apple’s latest MacBook Pro comes equipped with Apple’s new M1 SoC, which contains an 8-core processor that’s reportedly 3.5 times faster than the hardware inside the preceding model. Apple says the system can also last for up to 20 hours on a single charge, giving you all the power you need to work from sunrise to sunset. Now for a limited time you can get one of these systems from Amazon marked down from $1,299.00 to just $1,099.99.

Dell S2721QS 27-Inch 4K IPS Monitor ($299.99)

Dell’s S2721QS is a 27-inch monitor that sports a 4K IPS panel with HDR and FreeSync support. The monitor can also be used for detailed video editing work as it covers 99 percent of the sRGB color gamut, and it also has a built-in pair of 3W speakers. Currently Dell is selling these monitors marked down from $489.99 to $299.99.

LG Gram 15Z90N Intel Core i7-1065G7 15.6-Inch 1080p IPS Laptop w/ 8GB DDR4 RAM and 256GB NVMe SSD ($999.00)

LG’s 15.6-inch Gram laptop is a lightweight notebook that weighs a meager 2.5lbs, which makes it easy to carry around to classes and meetings. The notebook also has exceptional battery life that can last for up to 17 hours on a single charge. Performance should also be strong as the notebook has an Intel Core i7-1065G7 processor with 8GB of RAM. For a limited time you can get one of these notebooks from Amazon marked down from $1,599.99 to just $999.00.

Western Digital My Passport 1TB External SSD ($147.49)

This external drive utilizes SSD technology to enable relatively fast data transfer speeds of up to 1,050MB/s over a USB 3.2 Gen 2 connection. It can also hold up to 1TB of data and it has built-in security features including the ability to set a password on the drive and 256-bit AES hardware encryption. The drive also has a durable metal enclosure as well as a small physical footprint and can be picked up from Amazon marked down from $199.99 to just $147.49.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 14th 2021, 5:01 pm

Microsoft Brings Windows to the Cloud With Windows 365

ExtremeTech

Microsoft is taking a big step today with Windows 365. No, it hasn’t skipped from Windows 11 all the way to 365. This is Microsoft’s new cloud PC platform, which will allow businesses to run Windows desktops online for easy access from myriad devices. Virtualization and remote access are nothing new for Microsoft, but Windows 365 aims to streamline the process with unified management, pre-configured systems, and more for a monthly fee. We don’t know how much it will cost, but Microsoft says Windows 365 scales from small to large businesses. 

Windows 365 will work with any modern web browser or via the Microsoft Remote Desktop app. Of course, if you’ve got Remote Desktop, you’re already on Windows. So, why would you want to access a cloud version of Windows on top of that? Microsoft presents this as a way for businesses to make important tools and data accessible from anywhere. Integrated analytics and diagnostics should make managing a large number of users on Windows 365 easier, too. Windows 365 clients even show up right next to physical computers in the Microsoft Endpoint Manager. 

On a supported device, Windows 365 will launch instantly and provide access to all the same content no matter which screen you’re using. In one click, employees are inside a managed, secure system without mucking around with cloud storage boxes or VPNs. According to Microsoft, Windows 365 will work on Mac, iPad, Linux, and Android — notice anything missing? Microsoft seems to have avoided calling out Chrome OS as a supported platform. It’s unclear whether that’s simply because Microsoft doesn’t want to mention the OS that’s eating its lunch at the entry level. Chrome OS is primarily a browser, so it would be odd for the company to block it.

Businesses will be able to use Windows 365 to spin up different PCs (Windows 10 for now, and Windows 11 when it launches) for different use cases. For simple office work, there will be a base config with a single virtual CPU, 2GB of RAM, and 64GB of storage. The 12 available configurations go all the way up to a monster cloud PC with eight virtual CPU cores, 32GB of RAM, and 512GB of storage.

Microsoft is only talking about Windows 365 for businesses, but this seems like the kind of service that could easily branch out to consumers. I’m sure plenty of people would be interested in running a cloud version of Windows 11 on an iPad or Android foldable. Add in some respectable gaming hardware, and Microsoft could really be onto something. We can only hope.

Now read:

July 14th 2021, 5:01 pm

You Can Now Pre-Order a $17,500 3D Holographic Monitor, Not That You Should

ExtremeTech

The screens you look at all day are probably good old 2D panels. TVs and phones had a brief flirtation with 3D some years back, but the benefits were outweighed by the cost and other drawbacks. 3D still exists in certain niches, though. A company called Looking Glass Factory has just unveiled its second-generation holographic displays. They aren’t cheap, but the technology is mature enough that designers or animators might actually want to drop the cash on one. 

The new holographic displays come in two versions: a 15.6-inch 4K panel and a 32-inch 8K panel. You don’t need glasses to view the 3D holographic images, and there’s no “sweet spot” like some past glasses-free 3D systems had. Looking Glass creates images that appear to have depth by projecting 45 to 100 simultaneous views of the same scene, each at a slightly different angle, so the 3D illusion holds up as you move around.

The second-gen holographic screens improve on the original with a “blockless” design. That means the screen itself is thinner, and the 3D effect is more pronounced: Images appear to float in front of the screen. The video below will help you visualize the perspective-shifting effects, but you can’t truly appreciate 3D unless you’re looking at the genuine article. The optics in the Gen 2 panels have also improved, reducing reflections and making it easier for groups to gather around the holograms.

Looking Glass’ new 4K screen is expensive even for a professional tool at $3,000, but that’s not unheard of for high-end monitors. The 8K Gen 2 is a tougher sell at $17,500. Both Gen 2 panels are aimed at the enterprise, but the 8K especially. No one’s spending that kind of cash on a screen just to tinker with holograms. For consumers, Looking Glass launched a tiny 7.9-inch portrait holo monitor in December. It retails for $299, but it doesn’t have all the features of the larger ones. 

You will need a moderately powerful video card to use the Gen 2 holographic displays, and that can be a tall order these days with the ongoing GPU shortage. The 4K monitor requires at least an Nvidia GTX 1060, and the 8K version needs an RTX 3090. If you don’t have one, get ready to tack on another $2,000-3,000. Interested parties can pre-order the screens now for shipping later this year. Maybe you’ll be able to buy a GPU for less than a small fortune by then.

Now read:

July 14th 2021, 9:49 am

The Worst Storage Mediums of All Time

ExtremeTech

Credit: KMJ, Published under the GNU Free Documentation License

The history of writing and symbolic communication is a fascinating and rich topic — but not a simple one. Different groups of people have developed a wide range of communication methods depending on where and how they lived. Furthermore, it’s not exactly fair to judge prehistoric art against a modern SSD in terms of information density. For this reason, we’ve looked for mediums that were objectively bad at doing what they did — not to insult the people who invented them, but to illustrate that ancient peoples left these approaches behind for a reason.

Cave Paintings and Petroglyphs

I feel a little bad picking on cave paintings and petroglyphs. They’re the oldest known forms of symbolic communication and everybody has to start somewhere. The problem with cave paintings and petroglyphs as a storage medium is twofold:

A petroglyph of bighorn sheep from the American Southwest. Photo by Jim Bouldin, CC BY-SA 3.0

1). They contain little information modern scholars can use to place them in a historical or cultural context.
2). They’re not exactly portable.

It’s obvious that petroglyphs and cave art had meaning to the people who created them. Not many people are interested in exploring pitch-black caves with primitive torches to find the best place to paint ibex in just the right shade of ochre. The problem is that, in many cases, we have no idea what those meanings really were. In some cases, people painted the plant they were getting high on at the time, but we don’t know the meaning they ascribed to the ritual.

Sundials, I admit, aren’t exactly a storage medium for time, but stretch the metaphor. Full-size sundials are terribly inconvenient watches.

As writing evolved, we gained the ability to record contextualizing information directly into the narrative, with images serving as illustrations of described events rather than containing the entirety of the event within themselves. We also figured out how to use less durable media. Advantage? Written communication becomes possible without hauling a 20-ton slab of granite around with you. Disadvantage? Keep reading.

Clay Tablets

Clay tablets (often written in cuneiform) have some serious advantages over large slabs of granite. They’re infinitely more portable and they can be quickly etched with a stylus as opposed to requiring a chisel. There’s probably a reason why the earliest known customer complaint is written on a clay tablet. Nobody, not even the cursed customers of Ea-nasir, had the patience to chisel a complaint out of a mountain. Cuneiform made that sort of thing easy. Cuneiform also beats petroglyphs in that you can actually pay to have a tweet scratched into a tablet and mailed to your house. Now you, too, can commemorate your own witty sayings with an artifact that looks like it has part of the Epic of Gilgamesh scratched into it.

The good news is, you’ll be remembered. The bad news is, it’ll be as a cautionary tale. Image by Zunkir, Wikipedia, CC BY-SA 4.0

Cuneiform tablets beat out giant rocks, but they fail against most other storage concepts. There’s no such thing as a durable cuneiform tablet anywhere near the thickness of paper or hide, so the medium was bulky and thick. Cuneiform data density was low. This entertaining estimate suggests data density for clay tablets was roughly 2.6 words per cm2, or ~3,000 words per kilogram of clay. That compares quite poorly with books in terms of density and weight, to say nothing of how it would compare with more modern methods of data storage.
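
To put that figure in context, here’s a rough comparison against a modern paperback. The paperback numbers (a 100,000-word novel weighing about 0.3kg) are illustrative assumptions, not from the estimate above:

CLAY_WORDS_PER_KG = 3000  # the estimate quoted above
paperback_words, paperback_kg = 100_000, 0.3  # assumed figures
paperback_words_per_kg = paperback_words / paperback_kg
print(paperback_words_per_kg / CLAY_WORDS_PER_KG)  # ~111x denser by weight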

The one thing clay tablets do have going for them is durability under certain circumstances. Unlike scrolls, which were quite fragile in all circumstances, clay tablets weathered the millennia reasonably well when protected. In some cases, clay tablets held in ancient royal palaces survived the destruction and sacking of their cities because torching a library full of clay tablets often bakes them, helping to preserve them for posterity. Pour one out for the librarians of Ashurbanipal, who were obviously doing the best they could at the time.

Scrolls (Particularly in the Mediterranean)

Think about the characteristics of a good storage medium. It should be physically tough, able to suffer an indignity or two without immediately collapsing. It ought to be efficient and allow for maximum use of the supplied material. It should withstand the ambient effects of the climate on its own long-term storage. It should facilitate easy copying and searching of its own contents.

Scrolls utterly fail at most of these things, particularly outside Egypt and the Middle East. In hot, dry desert conditions, scrolls can last for thousands of years, which is why we have artifacts like the Dead Sea Scrolls and Egyptian papyrus dating back to the reign of Khufu some 4,600 years ago. It’s also why we don’t have anything comparable from the time of the Romans or Greeks, who lived thousands of years closer to our own era.

Scrolls are fragile. They can only be written on a single side and were typically read vertically, not horizontally. A single scroll could be up to 10 meters (33 feet) long, and because they were rolled up, a scribe couldn’t hold the text and take notes or make copies simultaneously. They made no use of indents or page breaks, making them impossible to bookmark. Finally, they rot over time due to ambient exposure to the humidity of Mediterranean climates. If you’re going to invent a storage medium that’s difficult to copy, it’s best to invent one that doesn’t require regular copying to maintain the information within.

The only library in all of antiquity to survive to the modern era was found at Herculaneum, after being buried by a volcanic eruption in 79 AD. The scrolls — which now resemble carbonized dog turds — had been burned, and excavators threw them away or destroyed them until someone recognized a scrap of writing. We have recovered 1,826 papyri in total, of which ~340 are “almost complete,” 970 are “decayed and partly decipherable,” and more than 500 are merely charred fragments. There may yet be further buried papyri on the site.

The Herculaneum papyri look like this when they’re still intact at all.

Think about that for a moment. We have less than 1 percent of the writings of antiquity, and one of the major reasons we lack so much is that, prior to finding the Villa of the Papyri, we had no surviving scrolls from this time period at all.

Humans moved on to codices (covered in our companion story) almost immediately after they were invented, because books — or even primitive, book-like objects — are so much better than scrolls.

Just to be clear: Paper, papyrus, and the like are some of the best storage mediums of all time, but not when stored as scrolls. We had to find a different method of collating information and binding it together before we could make full use of this remarkable invention. Thus, paper is amazing, but scrolls — at least in the climate of the Mediterranean — aren’t.

Avco Cartrivision

It’s common, in these sorts of articles, to recall the Betamax-versus-VHS format war of the 1980s and declare that VHS “won” by sucking more and costing less, thereby making it both the more successful format and objectively the worse one. But for the purposes of this list, VHS doesn’t rate. Let’s talk about the Avco Cartrivision, which sucked in ways VHS could only dream about.

Image by Retrosunshine2006, Creative Commons CC0 1.0

Did it have a separate machine you could purchase and hook to a TV? Nope. The only units commercially available included an integrated television, and the cheapest cost $1,350 in 1972 ($8,537 today). Cartrivision used a “skip field” system in which only one field (half a frame) out of every three was recorded to save tape. Each recorded field was then repeated 3x on playback. The end result was blurry, jumpy video… and you could only watch each movie once.
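
A toy model of the skip-field scheme makes the quality loss obvious. This is a sketch of the idea only, not Avco’s actual signal processing:

def skip_field_playback(fields):
    recorded = fields[::3]  # record only one field in three
    played = []
    for field in recorded:
        played.extend([field] * 3)  # repeat each recorded field 3x
    return played

print(skip_field_playback(["f0", "f1", "f2", "f3", "f4", "f5"]))
# ['f0', 'f0', 'f0', 'f3', 'f3', 'f3'] -- one-third the temporal resolution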

Remember, this is 1972. There is no home video market. There are no tapes for sale. Avco partnered with retailers to deliver films from a 200-movie catalog. Once you got the movie, you could only watch it once. Avco consumer players, you see, lacked a rewind function. The movie could only be rewound by a special machine owned by the retailers.

You have to give Avco some credit here. The company basically invented DIVX and Netflix while it was busy attempting to invent the VCR. Unfortunately, the entire experience cost nearly $10K in today’s money, and customers weren’t wild about the idea of only being able to watch a film once. The endeavor lasted barely a year before going out of business. It was later found that Cartrivision films stored in ambient humidity could decay and rot to uselessness in a matter of months.

Zip Disks

The 3.5-inch floppy was wildly successful through the 80s and 90s, and many attempts were made to replace the format. The Floptical, HiFD, SuperDisk, and the UHD144 were all high-capacity diskettes in roughly the same form factor as the 3.5-inch floppy, but Iomega’s Zip disk was by far the most well-known competitor.

The initial run of Zip disks could store roughly 100MB of data, nearly 70x the capacity of the standard 1.44MB floppy disk. That was nothing to sneeze at in the mid-90s, but at $20 per disk, the cost-per-megabyte just wasn’t low enough to gain proper traction across the entire tech industry.

Earlier “Zip-100” drives were incompatible with higher-capacity disks, and the performance of older disks was spotty on newer hardware. Worse, Zip drives weren’t compatible with standard 3.5-inch floppies at all, so compatibility issues ended up bogging down the format for its entire existence. Regardless of competition from other mediums, the market fragmentation within the format itself made Zip disks a hard sell to consumers.

Zip drives did have the benefit of a relatively fast (roughly 1MB/s) transfer rate, but the falling price of hard drives and writeable CDs in the late 1990s made the “superfloppy” format war obsolete. On top of that, Iomega was slapped with a class-action lawsuit in 1998 in response to widespread failures — the dreaded “click of death,” signaling that the drive’s read/write actuator could no longer read disks and would repeatedly return to its initial position to try again. Iomega eventually tried to revive the brand by releasing Zip-branded CD drives, but the company never fully recovered.

Most Sony Proprietary Standards

Sony has historically built a number of proprietary standards besides Betamax, most of which have existed for the sole purpose of generating more revenue for Sony rather than contributing to any useful technological progress. Sony Memory Stick, for example, debuted in 1998 and may have been industry-leading at announcement — I am not certain on this point — but just a few years later, CompactFlash and later SD Cards were doing everything that Memory Sticks could do.

For years, Sony wouldn’t budge. Devices like the PlayStation Portable and PlayStation Vita both used expensive proprietary media formats that raised the cost of owning these products for no good reason. The Memory Stick’s primary purpose was to capture more profits for Sony, which is why the company continued to use it in lieu of cheaper, consumer-friendly standards like SD Card that offered the same capabilities. Sony’s insistence on proprietary formats isn’t the reason the Vita failed in-market, but it didn’t do the company any favors, either.

Dishonorable Mention: LaserDisc

LaserDiscs have earned themselves a qualified mention here. What distinguishes LD from other standards on this list is that, despite its drawbacks, it was often the best way to view a film in the years before DVD. It also introduced a number of features that VHS players lacked. With a horizontal resolution of 425 lines compared with 240 lines for VHS, LaserDisc players delivered near-DVD quality almost 20 years before DVDs became widely available. LaserDiscs could store multiple audio tracks and supported both digital and analog audio, and unlike VHS tapes, LDs wouldn’t degrade with use and offered instant rewind and fast forward, similar to CDs. The Wikipedia entry for LaserDisc notes that it took DVDs several years to actually exceed LD quality, despite being better on paper from the start.

Image by Wendell Oskay, Flickr, CC-BY 2.0

The reason LaserDisc is on this list at all is the set of sacrifices that had to be made to support cutting-edge image quality circa 1978. The discs themselves were the size of 33 RPM records (12 inches / ~300mm). They spun at 1800 RPM, which made playback slightly noisier than your typical record. Discs weighed over a half-pound each and held, at most, 64 minutes of recording per side. Some players could flip discs automatically, but for years budget-priced models didn’t, and some movies shipped on multiple discs anyway, necessitating a mid-movie swap regardless.

What saves LaserDisc from being a straightforwardly “worst” technology is that if you were willing to pony up money for the player and the media, you really did get a better experience than you could have anywhere else outside a movie theater. My family didn’t own a LaserDisc player, but a longtime family friend did, and I was lucky enough to catch a few movies that way in the late 1990s. The quality, even on an average TV set of the era, was vastly better than VHS.

LaserDisc was less a horrible format than a deliberate choice some videophiles made, accepting inconvenience in exchange for a better at-home film experience than any other method of the time could offer.

Feature image by KMJ, published under the GNU Free Documentation License.

Now Read:

July 14th 2021, 9:49 am

ET Deals: Over $1,100 Off Dell Vostro 7510 w/ Core i7 and RTX 3050 Ti, Apple 12.9-Inch 2021 iPad Pro

ExtremeTech

Dell’s newly released Vostro 7510 laptop was built for business but has a versatile feature set that just about anyone will enjoy. It has a fast Intel Core i7 processor that can multitask with ease and an Nvidia GeForce RTX 3050 Ti to accelerate image editing work or play games, and best of all, the system is on sale for just $1,149.00.

Dell Vostro 7510 Intel Core i7-11800H 15.6-Inch 1080p Laptop w/ Nvidia GeForce RTX 3050 Ti GPU, 16GB DDR4 RAM and 1TB NVMe SSD ($1,149.00)

This high-end Vostro notebook was designed as a quality work machine. It has powerful processing hardware, including an octa-core processor and an Nvidia GeForce RTX 3050 Ti GPU. This hardware enables the system to quickly run multiple applications and render content with ease, and it can run games quite well too. The system is encased in a durable aluminum chassis and weighs just 4.1 pounds, which is excellent for the amount of hardware contained inside. For a limited time you can get one of these systems from Dell marked down from $2,284.29 to just $1,149.00.

Apple iPad Pro 12.9-Inch Tablet 2021 Version ($999.00)

Apple’s 2021 12.9-inch iPad Pro comes equipped with the company’s revolutionary M1 processing chip that includes eight CPU cores and eight GPU cores. The tablet also has a high-end display that’s perfect for watching videos, and it has built-in features such as a LiDAR scanner for use in immersive AR games. This tablet is now on sale marked down from $1,099.00 to just $999.00 from Amazon.

Dell Ultrasharp U4320Q 43-Inch 4K USB-C Monitor ($869.99)

Dell engineered this model to be a large display with tons of desktop real estate. This makes it easier to multitask on multiple windows at the same time. The display also has a high 4K resolution and a USB-C port that doubles as both a video connection and a charging port for compatible notebooks. Right now you can get this display from Dell marked down from $1,159.99 to just $869.99.

Eufy Anker RoboVac 11S Robot Vacuum ($149.99)

Eufy designed this slim Robovac with a vacuum capable of 1,300Pa of suction. This gives it the power it needs to help keep your home clean, and it can also last for up to 100 minutes on a single charge. Right now you can get one from Amazon marked down from $229.99 to just $149.99.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 13th 2021, 7:14 pm

Save Over 20 Percent on this Innovative Laser-Projection Keyboard

ExtremeTech

With keyboards, added adaptability is a real bonus. Especially in these times of working on the go and the added importance of portability in our personal and computing devices, having a keyboard that’s easy to carry around and deploy is a plus for anyone who works in front of a computer. While it may seem like the technology for keyboards has been slow to develop in comparison to other tech such as tablet computers or smartphones, there certainly is a range of new products out there that challenge the status quo when it comes to keyboards and typing.

Right now, you can get the innovative Serafim Keybo, the “World’s Most Advanced Projection Keyboard” for the reduced price of $84.99, a great discount of 22 percent off the full purchase price of $109. Change up your typing with a highly praised projection keyboard that’s as adaptable as you need it to be.

The Serafim Keybo uses innovative laser technology to project your keyboard onto any flat surface – and that can be a standard QWERTY keyboard or a musical piano keyboard. It uses advanced sensor technology to detect your finger movements and can connect to your compatible iOS or Android phone for easy typing or music playing. The device features a powerful 2000mAh battery with a battery life of up to 10 hours and has built-in support for various languages, including English, Spanish, French, German, Arabic, Chinese, and Japanese. The device also comes with a music app, which enables you to play piano, guitar, bass, or drums instantly using the laser projection keyboard, as well as software development kit (SDK) support for iOS and Android apps. Among others, TechCrunch, Mashable, CNET Japan, and Product Hunt have sung the praises of the Serafim Keybo.

The adaptable Serafim Keybo is now on sale for the low price of $84.99, a significant saving of $24.01 off the full purchase price of $109.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 13th 2021, 7:14 pm

A Copy of Super Mario 64 Has Somehow Sold for $1.5 Million at Auction

ExtremeTech

With decades of video game history behind us, some of the most iconic games and hardware have become valuable collector’s items. Copies of classic NES games like Super Mario Bros. and The Legend of Zelda routinely pull in more than a hundred thousand dollars at auction, but the newest record-holder for the most valuable video game is a bit of a head-scratcher. Last weekend, someone paid $1.5 million for a copy of Super Mario 64.

There are plenty of grown adults reading this who probably never had an N64, but Super Mario 64 was one of the system’s most popular titles. It helped that the early consoles came bundled with the 1996 game, which was the first 3D Mario adventure. Its place in gaming history makes Super Mario 64 potentially valuable — a sealed copy sold last year for $38,400. That’s undeniably a lot of money for a single video game no one will ever take out of the box, but it was in mint condition, rated at a 9.4 A+ by the game experts at Wata. 

This newly sold copy isn’t some special printing or a rare early demo cartridge. It’s just in perfect condition. Wata graded the game at 9.8A++, which is apparently the best possible condition for a sealed N64 game. Anyone who was buying games in the 1990s will remember how prone to damage the N64 boxes were, so most sealed copies are in much worse shape. Wata says that it occasionally gets “case-packs” of unopened N64 games that were intended for retail sales. Even then, it says there are usually just one or two 9.8 ratings in the lot. A 9.8A++ is on a different level of rarity, which apparently justifies the exorbitant price. 

Many observers of the rapid increase in video game value have expressed healthy skepticism of the record-setting sale. Yes, this is probably the most pristine copy of Super Mario 64, but $1.5 million? Nintendo sold millions of copies of this game; same box design, same art, and (presumably) the same cartridge inside that no one will ever see. 

Heritage Auctions, which handled the sale of this game, has pushed back against the controversy. It says that it is common practice to vet interested parties before allowing them to place such an enormous bid. In this case, staff verified the winning party was qualified to pay $1,560,000. If a game that sold millions of copies can get this much, just imagine what’s waiting in the wings. This might be the first million-dollar video game, but it won’t be the last.

Now read:

July 13th 2021, 5:15 pm

If You Want to Get Excited About CPUs Again, Start Creating With One

ExtremeTech

Credit: John Burek

When I decided to teach myself something about video editing and AI-powered upscaling at the beginning of 2020, I hoped I’d learn something useful about editing and improving images and video. I didn’t expect the project to change my own perspective on high-performance computing in the process.

At some point over the last two decades, CPU performance got a bit, well, boring. This is partly because the market spent some six years in the same static configuration, with quad-core/eight-thread parts in the top mainstream consumer bracket and quad-core, 2C/4T, and dual-core CPUs waterfalling down from there. Market segmentation stagnation, however, is only part of the story. Microsoft stopped trying to aggressively push the envelope with successive iterations of Windows. PC sales slowed down as purchases of mobile and tablet devices surged.

AMD’s competitive re-entry into the CPU market with Ryzen back in 2017 kicked off the most exciting period of performance uplift we’ve seen since the launch of the Athlon 64 nearly 15 years earlier. Intel responded to increased competition from AMD with its own lineup of improved CPUs, to the general benefit of anyone buying a machine. All credit to AMD for kicking off the increased period of competition, but getting excited about faster, better CPUs occasionally requires more than just the CPUs themselves. It requires a workload. The truth is, gaming isn’t really that workload any longer. Not, at least, as reliably as it once was.

Intel and AMD will fight for gamers until the stars grow cold because gamers are an attractive market. Not only do they tend to have higher disposable incomes, but they’re also often willing to pay a premium for every last scrap of performance. There will always be a few PC titles that drive gaming fidelity and overall performance, but modern PC gaming does not require much in the way of high-performance hardware. The top games currently on Steam are:

Counter-Strike: Global Offensive
DOTA 2
Rust
Team Fortress 2
Apex Legends
Dead by Daylight
Destiny 2
PlayerUnknown’s Battlegrounds
GTA V
FFXIV Online
Ark: Survival Evolved

Ten years ago, on July 11, 2011, the most-played games on Steam were:

Team Fortress 2
Counter-Strike
Counter-Strike: Source
Football Manager 2011
Call of Duty: Modern Warfare 2
Terraria
Call of Duty Black Ops Multiplayer
Civilization V
Left 4 Dead 2
Battlefield: Bad Company 2
Garry’s Mod
Portal 2

If you look at when these titles were collectively launched, something jumps out. In 2021, the average “most popular” game was launched in 2014. This holds true even if we use the later 2013 launch date for Final Fantasy XIV: A Realm Reborn. Compare this with 2011, when the average PC game on the list was only four years old. The average age of the games people play has nearly doubled over the past 10 years, as the quick calculation below illustrates. There’s nothing necessarily wrong with that. Some games are huge partly because they run on a wide range of hardware. But it speaks to the idea that gaming on the PC isn’t necessarily the place to push CPU performance if you want to feel like having a new machine really matters.
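The arithmetic behind that claim is simple enough to show directly, using the article’s own figures (an average launch year of 2014 for the 2021 list, and an average age of four years for the 2011 list):

```python
# The math behind "nearly doubled," using the averages cited above.
avg_launch_2021 = 2014                   # average launch year of the 2021 list
avg_age_2021 = 2021 - avg_launch_2021    # 7 years old on average
avg_age_2011 = 4                         # average age of the 2011 list

print(avg_age_2021 / avg_age_2011)       # 1.75 -- close to 2x
```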

Multimedia transcoding, editing, and upscaling, on the other hand, will quickly make you hunt under the couch cushions for some spare change. This doesn’t have to be entirely separate from gaming — streamers often record and edit video clips together, or combine clips with specific voice-overs. Live streaming is a bit different from creating and uploading non-live content, but both types of systems benefit from additional horsepower.

The advent of new consoles and technologies like ray tracing may give game developers more reason to push the envelope, but today, content creation pushes PC hardware more reliably on the whole than gaming necessarily does. You may not notice much difference between a high-clocked Intel 8-10 core and an HEDT CPU in many cases, or between a Ryzen 5600X and a 5950X in high-end gaming. Ask your CPU to process multiple video streams simultaneously, and the difference between a quad-core and an eight-core (or between two different 64-core CPUs) becomes much more visible, much more quickly.

It’s satisfying to discover that one’s existing hobbies will be improved by future technology, as opposed to being forced to seek out new ones (new life and new civilizations, as it were). I’m a huge fan of OLED displays and HDR (native is best, but well-simulated isn’t bad) precisely because they deliver improved visual experiences in existing content. But we also live in what is arguably a golden era for content creation — and I don’t just mean the amount of material being created. Smartphones have turned us all into photographers, even if most of us aren’t good at it. Drones offer even novice pilots the opportunity to see the world from a new and different angle.

I’ve spent 16 months working on upscaling Star Trek: Deep Space Nine. If that’s not to your taste, there are restorations of Grateful Dead concerts, Chrono Cross, the Wrath of the Lich King cinematic, or GTA V. Curious about the Star Wars 1977 LaserDisc version? Here you go:

Here’s a person working on Doctor Who, and here’s a more general video on restoring old footage. Play through the recent Mass Effect Legendary Edition, and you’ll quickly spot where texture upscaling was used, because the technique has been used everywhere. Need some VHS tapes worked on? There are projects that focus on this, too. Even fans of the old cartoon Reboot can get in on the action.

If you want to fall back in love with performance for reasons that have nothing to do with gaming — for reasons that encompass gaming, but ultimately stretch beyond it — consider the wealth of content creation and improvement tools that exist today. Sometimes, the difference between feeling the itch to upgrade and not — assuming one is not in the middle of a pandemic-driven silicon shortage, at any rate — is what you’re doing with the machine you own already.

My ability to speak to the entirety of the video and audio editing market is constrained by the number of applications I’m personally familiar with, but in the course of this project to remaster and upscale Deep Space Nine, I’ve taught myself how to use AviSynth and Topaz Video Enhance AI. I’m learning DaVinci Resolve Studio and Photoshop. Our EiC, Jamie Lendino, has worked professionally in audio engineering and editing. My colleague David Cardinal is a professional photographer and videographer.

Many of these workloads task the GPU, but they task the CPU as well. I often find myself pre-processing in AviSynth while simultaneously upscaling in Topaz. Doing both of them simultaneously requires a fair bit of horsepower, especially if I pre-process more than one piece of content simultaneously. For the first time in years, I find myself meaningfully caring about application performance improvements at both the hardware and software levels. It puts a different spin on a launch like Ampere or RDNA2 when you’re looking forward to it for reasons related to both gaming and application software. If I’m being honest, it makes computing more fun. I get more use out of the silicon I own than I ever have before.

All of this is my own personal experience, of course, so your mileage may vary. But if you used to love computing and find yourself a bit ‘meh’ on things these days for a reason unrelated to the current cost of hardware, consider trying to do something new. You may be surprised at how much it changes your enjoyment of your own hobby — or your reasons for upgrading.

Top image credit: John Burek

Now Read:

July 13th 2021, 5:15 pm

TSMC Mulls On-Chip Water-Cooling for Future High-Performance Silicon

ExtremeTech

Every few years, a major microprocessor manufacturer or research institution dives into the world of radical CPU water-cooling. TSMC recently gave its own presentation on the topic, in which it explored three different methods of potentially cooling a chip with on-die water cooling.

Companies and organizations keep returning to this idea because an integrated, affordable on-die water cooling system could solve a lot of other problems in advanced chip manufacturing. AMD is working on a version of its Zen 3 CPU core with 128MB of integrated L3 cache, but the company had to carefully position the cache chips to avoid causing hot spot problems within the Ryzen die. Vertical die stacking is supposed to drive semiconductor density throughout the 2020s, continuing the overall trend of density improvements we shorthand as “Moore’s Law.” Incidentally, this also allows semiconductor foundries to talk up the idea of extending Moore’s Law, despite the fact that the titular law says nothing about 3D chip stacking, as such.

The industry’s ability to stack chips on top of each other, however, is directly proportional to its ability to keep the silicon stack from roasting in its own heat. Nvidia’s A100 accelerator is specced for 500W TDPs, while Intel’s Ponte Vecchio has a 600W TDP. Manufacturers and designers continue to push the envelope, and some types of on-package (or on-die) water cooling would need to be at least partially integrated by the manufacturer. The specifics here depend on the type of solution considered.

TSMC tested three different methods of creating fluid channels and three different methods of meshing cooling solution and Thermal Test Vehicle (TTV). The three types of fluid channels tested were square pillars, trenches, and a flat plane. The three types of cooler designs TSMC tested were direct-water cooling, a cooler with a silicon-oxide thermal interface material (TIM), and a cooler with a liquid metal TIM.

In Direct Water Cooling, water channels were etched directly into the silicon layer on top of the CPU. In the second test, silicon channels were etched into a silicon layer with a silicon-oxide thermal interface material between the microfluidic system and the actual silicon of the TTV. In the third option, the silicon-oxide TIM was replaced with a liquid metal TIM. TSMC’s data showed that the square pillar design outperformed the other two approaches, so we’ll focus on the figures reported for that solution:

According to TSMC’s test data, its best cooler design was able to dissipate a maximum of 2.6kW of heat (with a 5.8L/minute flow rate) at a temperature delta of 63C. Direct water cooling performed the best, followed by the silicon oxide TIM. Even the liquid metal TIM, however, was capable of dissipating 1.8kW of heat. That’s far more efficient than anything available today, though obviously, this is a thermal test vehicle/proof of concept, not a final product.
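As a sanity check, you can run TSMC’s reported figures through the standard coolant heat-balance equation, Q = ṁ·c_p·ΔT. Note that the 63C delta TSMC reports is the difference between chip and coolant, not the water’s own rise; the rough sketch below, assuming plain water, solves for what the water itself heats up by.

```python
# Rough sanity check using Q = m_dot * c_p * dT. Assumes plain water at
# ~1 kg per liter with c_p = 4186 J/(kg*K); TSMC's 63C figure is the
# chip-to-coolant delta, so we solve for the water's own temperature rise.
q_watts  = 2600              # reported maximum heat dissipated (W)
flow_lpm = 5.8               # reported coolant flow rate (L/min)

m_dot = flow_lpm / 60        # mass flow in kg/s, assuming 1 kg/L
c_p   = 4186                 # specific heat of water, J/(kg*K)

water_rise_c = q_watts / (m_dot * c_p)
print(f"Implied coolant temperature rise: {water_rise_c:.1f} C")   # ~6.4 C
```

In other words, the water only warms by a few degrees on its way through the channels; nearly all of that 63C delta lives in the interface between the silicon and the coolant.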

It’s interesting to think of the kind of products that might require that sort of cooling in the future. There is little chance that enthusiast computing would ever reach such lofty heights. A 15-amp circuit at 120 volts can provide a nominal 1800 watts. Enthusiast GPUs may continue to hit higher TDPs — I can’t predict what AMD and Nvidia will do on that front — but we’re a long way from 1kW GPUs, to say nothing of 2.6kW.

The kind of cooling performance improvement that TSMC could potentially deliver would allow for higher clock speeds than anything we’ve seen to date, but not, I suspect, nearly as much as we might want. Silicon simply does not scale well past 5GHz, and manufacturers might not spend much effort trying to push raw performance that way. What’s more interesting is the kind of density improvements this kind of cooling might enable. While cooling hardware requires its own infrastructure, a system capable of dissipating up to 2.6kW of heat could cool much more hardware in a much smaller space than current server deployments.

A cooling system that can dissipate 2.6kW of heat is overkill in any reasonable consumer product, but not necessarily in server or data center systems of the future, especially if more and more processing continues to move to the cloud. Thus far, these sorts of cooling solutions haven’t been commercialized because manufacturers haven’t been forced to adopt such expensive methods of boosting performance to continue offering hardware performance improvements. Hardware TDPs are scarcely going down, however, and the shift towards 3D chip stacking may require a radical rethink of existing cooling strategies in the long term.

Credit: TSMC

Now Read:

July 13th 2021, 5:15 pm

Patching DRM Out of Resident Evil Village Boosts Performance

ExtremeTech

The use of DRM (Digital Rights Management) technology to prevent the copying of PC games has been controversial since its invention several decades back, but the conversation takes on an added dimension when included DRM starts harming game performance. This has happened with several Denuvo releases in the past, and it appears to have been a problem in Resident Evil Village — though this was not, apparently, solely Denuvo’s fault.

Empress, the cracking group behind the most recent hack, has released an updated version of Resident Evil Village that strips out both Capcom’s implementation of Denuvo and the company’s own V3 DRM system. Apparently, the combination of these two DRM schemes was causing significant slowdowns and lag in-game, including every time the player killed a zombie. For those who have not previously played a Resident Evil game, killing a zombie is on par with Mario stomping on a Goomba, as far as its importance to the core gameplay loop.

The NFO file states: “All in-game shutters [sic] like the one from when you kill a zombie are fixed because Capcom DRM’s enty [sic] points are patched out so much of their functions are never executed anymore. This results in a much smoother game experience. THIS IS PURE CANCER AND ANYONE WHO ACCEPTS THIS IS NOTHING BUT A PATHETIC GARBAGE HUMAN SLAVE.” (Emphasis original):

Empress certainly has some opinions on the topic.

The question of whether Denuvo slows down video games is complicated. Denuvo, unsurprisingly, insists that it does not. Other investigations have shown that in some titles, Denuvo DRM implementations can impact performance. Much appears to come down to how Denuvo or any other form of DRM is implemented. Multiple reviews of the PC version of Resident Evil Village at launch note that it offers poor performance compared with the console versions. Tests by DSOGaming confirm that killing zombies is now lag-free, and while we haven’t played the original version of the game to compare, widespread online feedback from multiple players and repeated requests for improvements demonstrate that this has been a genuine problem.

This video showcases multiple odd stutters both in and out of combat on what ought to be a top-tier PC. Similar stutters are not visible in the DSOGaming video embedded below.

Viewers should be aware that stuttering in video games can happen for a complex set of reasons related to your underlying storage, CPU, or GPU. The complaints about Resident Evil Village, however, allege that these stutters and performance drops remain even when game detail settings are lowered to minimal values. This is less common and is more likely to indicate an underlying issue with some aspect of the game.

While ET takes no position on whether being willing to play Resident Evil Village in its present state qualifies you as a “pathetic garbage human slave,” we’re a bit more open to the idea that shipping the game in this kind of state says nothing good about Capcom itself.

At best, DRM is an agreement between reasonable people, in which the buyer agrees that handing out copies of a game to whoever wants one will, at some point, have an impact on sales of the title. In return for this acknowledgment, the creator/distributor of said title puts some restrictions on its use. Many lightweight restrictions aren’t a practical impediment to enjoying a game, especially if there’s a robust “Play offline” option included.

I am sympathetic to the needs of content creators who depend on legitimate game sales to make a living. I’m willing to acknowledge that some DRM is not particularly burdensome and puts only very mild restraints on how a game or application can be used. It may be that no DRM is good, but some DRM solutions are worse than others. There’s no sign that the industry is going to move away from DRM, but DRM that impacts game performance effectively lies to the player about what’s going on inside their PC.

One characteristic of a genuine performance bottleneck is that it improves when you throw faster hardware at it. Games that are merely badly optimized typically become playable over time as the raw performance of underlying hardware improves. Sometimes one can find performance improvements in odd places. Anyone frustrated by slow game loads in the original Dead Space on PC, for example, can improve this aspect of the title by unlocking the frame rate.

DRM-related problems do not always respond in the same way because they aren’t being caused by a weak spot in your own system’s hardware configuration. Lowering your PC settings won’t fix the problem. Even returning to a title after several years may not fix the problem, unless the offending DRM has been patched from the game.

Developers and publishers cannot claim to take the PC market seriously if they’re also willing to ship DRM schemes that fundamentally damage the PC player’s experience. Some players are willing to tolerate the hitches and stutters, but they shouldn’t have to. People shouldn’t need to consult an endless set of optimization videos or articles in a fruitless attempt to fix something that Capcom incorrectly insists isn’t broken.

Feature image by Capcom

Now Read:

July 13th 2021, 5:15 pm

Intel May Build Chip Facilities Across Europe as Part of $20B Foundry Plan

ExtremeTech

Intel is still talking up the potential benefits of an EU partnership as it lobbies European nations to support a new leading-edge foundry facility. The company has pledged to spend $20B on a facility or facilities, with a lifetime investment of up to $100 billion in the project over several decades. This is not necessarily unusual — foundries are expensive to build and most companies upgrade them multiple times — but it speaks to Intel’s long-term willingness to invest in a plant.

According to the Financial Times, Intel executives are willing to build multiple facilities in several member states to secure funding and support. “We could put manufacturing on one site and packaging on another,” Greg Slater, Intel vice-president of global regulatory affairs, part of the team exploring possibilities for expansion in Europe, told FT. “We are well placed to make this an ecosystem-wide project, not just a couple of isolated paths in one member state,” he said. “We do believe that this is a project that will benefit Europe at large.”

Intel has explored the idea of building a fab in Germany, the Netherlands, Belgium, and Italy thus far, on a site of roughly 1,000 acres. The company would initially build a pair of fabs on the same site at roughly $20 billion, or $10 billion per fab. The estimated facility lifetime spending is around $100 billion.

Most of the companies we talk about at ET build on the leading edge. Intel, Samsung, and TSMC dominate chip revenue. Owning foundries can be quite lucrative if you have the scale to do it. Image by Bloomberg.

There are two significant unanswered questions regarding such a facility: Would it be a leading-edge fab and to what degree would it be subsidized? According to French officials, there’s a 30-40 percent cost gap between building a fab in the US versus in Asia, and much of that difference is due to government support. According to both Intel and the French government, the question of which customers would be likely to use the foundry and where those customers are located geographically would also play a part in the fab’s final location. Semiconductors circle the globe multiple times on their manufacturing journey, but Intel and the EU may be looking to reduce global dependencies and the impact these can have on semiconductor manufacturing.

Most of the semiconductor customers in Europe are companies that are off the leading edge. Intel sounds at least theoretically open to the idea of building a plant focused on legacy technologies or nodes rather than insisting on a leading-edge facility as the only kind of factory worth building. Building foundries focused on the needs of non-leading-edge customers wouldn’t be as sexy as a leading-edge deployment, but it would probably be easier to bring up the facility and start cranking out wafers. Intel’s current plan for a $20B investment in Europe is separate from its $7B expansion of the Leixlip, Ireland facility, which will eventually transition to 7nm and build leading-edge chips whether Intel pursues larger fabs in mainland Europe or not.

A new facility in Europe with strong European support would help anchor Intel’s Foundry 2.0 plan and position the company as an alternative to TSMC. The Taiwanese foundry currently handles most of the world’s contract manufacturing, but Intel wants to change that in the future. Of the top three manufacturers shown in the diagram above, Intel and Samsung both produce large amounts of their own equipment. TSMC is the only pure-play foundry among the three, and it dominates chip production for every company that doesn’t own its own fabs. The two firms have already tangled with each other over the need for a European foundry in the first place, with TSMC downplaying any such effort and Intel offering the idea of new semiconductor facilities — with some hefty subsidies attached. What sort of subsidies Intel receives will undoubtedly depend at least in part on whether the final plan calls for a leading-edge EUV semiconductor plant or a trailing-edge facility designed for older nodes and/or planar silicon.

Now Read:

July 13th 2021, 9:29 am

An Earth-Like Axial Tilt Might Be Necessary for Complex Life to Arise

ExtremeTech

Earth has numerous properties that make it an ideal home for life as we know it, including a robust magnetic field that deflects radiation, a temperate climate with liquid water, a large moon that stabilizes the planet’s rotation, and a modest axial tilt. That last item may be more important than we previously thought, according to a new study funded by NASA. The study suggests that a tilted axis leads to more oxygen production, and that means more complex life. 

Scientists have been increasingly interested in what makes Earth a haven for life as more and more exoplanets are detected in the sky. If we’re going to go looking for alien life, which are the best exoplanets to investigate? Sure, those that have liquid water and aren’t being fried by radiation, but the role of oxygen cannot be overstated. All the complex life on Earth needs oxygen to survive. 

After you breathe in O2, it helps your cells produce most of the energy they need to operate. In the earliest days of life on Earth, there was very little oxygen, and life was very simple. However, the explosion of photosynthetic organisms boosted oxygen levels and made multicellular life viable. The study, led by Stephanie Olson of Purdue University, used a sophisticated model of Earth to tease out what properties are most vital to fostering the formation of life. It turns out, axial tilt could have a huge impact. 

They found, as predicted, that increasing day length, higher surface pressure, and ocean nutrient circulation all helped to boost oxygen in the atmosphere. The effects of axial tilt were unexpected, though. Greater tilting increased photosynthetic oxygen generation. Earth’s 23.5-degree tilt is what gives us our seasons, and that shifting temperature appears to drive more photosynthesis in oceans. Adding an Earth-like tilt could have the same effect as doubling the volume of nutrients in the oceans. 

According to the authors, a small tilt like Mercury’s (2 degrees) or an extreme one like Uranus’ (98 degrees) could lead to lower oxygen production, and therefore, fewer opportunities for complex life to develop. Of course, this assumes the biochemistry on other planets is similar to Earth’s. All we can really say for certain is that oxygen is a very good electron acceptor, and that makes it valuable metabolically. That should be true everywhere, so more oxygen means more complex life. Probably. That’s not to say complex life could not arise in an anaerobic environment that relies on other types of chemistry. This just gives us a place to start.

Now read:

July 13th 2021, 9:29 am

This SQL for Data Science Masterclass is Now just $19.99

ExtremeTech

Data science is an increasingly important career path in many industries, and data scientists are in demand. Making sense of the masses of data available – whether it’s product performance, market reports, database administration, or commercial data – helps companies to improve what they do, raise profits, and react more quickly, efficiently, and effectively to market changes. Managing and processing the data available to a business is an invaluable role, and data scientists frequently make use of tools such as structured query language (SQL) to manage the huge amounts of data available. Any data analyst or software engineer would do well to master and review their knowledge of such tools in order to remain on the front foot in the industry.

Right now, The Complete 2021 SQL Master Class Bundle is on sale for $19.99, a great discount off the full package purchase price of $1,393. Learn or solidify valuable data science skills with this quality e-learning package.

The bundle includes eight essential e-learning courses across 308 easily digestible lessons spanning 23.5 hours of data science content. Start with the SQL 101 course and learn to write queries in the SQL language, then proceed to more complex courses, such as SQL for servers, defining data with SQL, learning about NodeJS, and mastering SQL database queries – with each course building on the knowledge you’ve already taken on in previous courses.

The courses are taught by skilled instructors at SkillSuccess, which currently provides more than 2,000 curated video courses on a range of topics, and offers support via email and live chat.

The Complete 2021 SQL Master Class Bundle is on sale for $19.99, a wonderful discount when you consider that this brings the price of each included course to less than $2.50, while each contains e-learning content to the value of $199.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 12th 2021, 6:29 pm

ET Deals: Dell Alienware M15 R4 Intel Core i7 Nvidia RTX 3060 144Hz Gaming Laptop for $1,499, Lenovo

ExtremeTech

Gaming’s popularity is at an all-time high right now, and gaming hardware has become increasingly hard to find. Today, for just $1,499 you can buy one of Dell’s new Alienware M15 R4 laptops, which has all the hardware you need to run games at blistering speed with high-quality graphics settings.

Dell Alienware M15 R4 Intel Core i7-10870H 15.6-Inch 1080p 144Hz Gaming Laptop w/ Nvidia GeForce RTX 3060, 16GB DDR4 RAM and 512GB M.2 PCI-E SSD ($1,499.99)

If you want a fast notebook with plenty of performance for running the latest games, you may want to consider Dell’s Alienware M15 R4. This system was built for gaming, and it features a fast eight-core processor, an Nvidia GeForce RTX 3060 GPU, and a high-quality 1080p display that can operate at a fast 144Hz. The system also has a 512GB NVMe SSD to help it boot quickly. You can get this system from Dell marked down from $1,849.99 to just $1,499.99.

Lenovo Flex 5 AMD Ryzen 5 4500U 14-Inch 1080p 2-in-1 Touchscreen Laptop w/ AMD Radeon Graphics, 16GB DDR4 RAM and 256GB SSD ($600.48)

Lenovo’s Flex 5 laptop was designed as a 2-in-1, which means its screen rotates around so it can be used like a tablet. It has a 1080p touchscreen display, and it’s driven by an AMD Ryzen 5 4500U processor with six CPU cores and a relatively strong iGPU. This gives the system strong performance for everyday tasks and enough processing power to run some games at lower graphics settings. Currently you can get this laptop marked down from $649.99 to $600.48 from Amazon.

Fitbit Charge 4 ($98.95)

The new Fitbit Charge 4 features built-in GPS and SpO2 sensors along with Spotify controls and a battery that can last for up to seven days. Currently you can get this helpful fitness device from Amazon marked down from $149.95 to just $98.95.

Arlo AVD1001 Video Doorbell 1536×1536 Wired ($114.77)

Arlo designed this doorbell to be a highly capable home security device. The high quality 1,536×1,536 camera is equipped with night vision for seeing in the dark, and it has a built-in siren as well as a motion detector. Currently you can get one of these devices from Amazon marked down from $149.99 to just $114.77.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 12th 2021, 6:29 pm

Virgin Galactic Founder Makes It to Space Aboard His VSS Unity Spacecraft

ExtremeTech

The billionaire space race is over, and the winner is Virgin Galactic founder Richard Branson. The flamboyant 70-year-old billionaire boarded one of his SpaceShipTwo-class vessels on Sunday, July 11th at Spaceport America in New Mexico. With the help of a carrier aircraft, the VSS Unity rocketed to the edge of space, giving Branson and his fellow passengers a few precious minutes of weightlessness before their safe return to Earth. This marked the first fully crewed flight for Virgin. 

In some ways, this is just another rich person doing something almost no one else can afford to do. Tickets on Virgin Galactic flights are going to cost hundreds of thousands of dollars, but that’s still a big discount compared with what it used to cost to get into space as a tourist. “If we can do this, imagine what you can do,” Branson said in a recorded live stream during the sub-orbital space voyage. 

Unlike Jeff Bezos’ New Shepard and Elon Musk’s Crew Dragon, SpaceShipTwo does not launch like a rocket. It saves its rocket engine for several minutes after liftoff, which is accomplished with the help of a WhiteKnightTwo-class carrier known as VMS Eve. This jet hauls the space plane up to a normal cruising altitude before releasing it and returning to Earth. At that point, VSS Unity used its single rocket engine to fly up to an altitude of 282,773 feet (53.5 miles).

Following the flight and successful landing, Branson stood before a cheering crowd to declare a new era in spaceflight. A keen observer will note the VSS Unity’s apogee was below the internationally recognized Kármán line (62 miles) that marks the start of space. However, the US uses a slightly lower 50-mile altitude to delineate Earth from not Earth. So by that metric, Branson and his fellow passengers made it to space. 

There were no paying passengers on this trip. The entire passenger manifest of Unity 22 consisted of Virgin employees including Beth Moses, Colin Bennett, and Sirisha Bandla. Pilots Dave Mackay and Michael Masucci were at the controls, as they have been in several past flights. 

Virgin Galactic started taking reservations for seats a decade ago, and more than 600 very wealthy people have plunked down as much as $250,000 for a ticket to space. They’ll finally be able to use those tickets soon. As for everyone else, you can register your interest now, and Virgin Galactic will reach out when sales are open again. Start saving your pennies now, though. That $250,000 ticket price might go up. It will be a while before an average Joe can book a trip to space, but SpaceX and Blue Origin will probably cost even more than Virgin. At least you get a free flight suit with your trip.

Now read:

July 12th 2021, 4:16 pm

ExtremeTech Is Hiring!

ExtremeTech

ExtremeTech is hiring freelance writers, each for five stories per week. You would find, write, and produce these posts yourself, so experience writing tech news and working in a CMS is a must. You enjoy writing, love to research, have a passion for technology, and are committed to a deeper understanding of what’s happening in the world around you — as well as what’s on the horizon.

You have strong news judgment and technical chops, and can not only identify the most important stories of the day but provide some level of context or analysis. We don’t “re-news” on ExtremeTech; every one of our stories adds something to the conversation. Bring your specific tech passions.

Competitive, negotiable rate per story (400 to 450 words each).

REQUIREMENTS:

* Experience writing consumer tech news for a digital publication

* Ability to write clearly and quickly

* Comfortable producing stories in a CMS

* Effective communicator

Interested? Email a cover letter, resume, and writing sample to jamie_lendino@extremetech.com.

July 12th 2021, 4:16 pm

How to Upscale Video to 4K, 8K, and Beyond

ExtremeTech

Over the last five years, video and image upscaling — the process of turning a lower-resolution video or photo into a higher-resolution image — has gone from the realm of research papers and tech demos to commercial products. In addition to invalidating several decades’ worth of nerd complaints about how looking at the screen and saying “Enhance” does absolutely nothing, it’s now possible to upscale video dramatically and improve overall quality in the process.

Upscaling technology is now built into a variety of consumer devices you can purchase, and there are software packages available for video editing as well. The word “upscale” generically means “to improve the value or quality of something.” In the video and PC space, it’s almost always a reference to increasing the effective resolution of a piece of content. Some AI upscaling approaches work in real-time and some do not.

There is a difference between upscaling and what we call native resolution. Upscaling a 1080p image into 4K means taking an image originally encoded in 1920×1080 and algorithmically transforming it into what a higher-resolution version of the same image ought to look like. This upscaled image will not be identical to a native 4K signal, but it should offer a better picture than what was previously available on your 720p or 1080p television.

Keyword: “should.” Video scaler quality in TVs can vary widely between different product families. In some cases, you might be better off using a GPU to drive a picture than relying on the TV’s native rescaling capability, while other TVs have excellent upscalers. Manufacturers rarely disclose their upscaling hardware choices, but higher-end TVs should have improved upscaling capabilities. If you have a UHD Blu-ray player paired with an older or lower-quality 1080p or 4K TV, you might even get better results by running all video signals through the Blu-ray player rather than the television. Generally speaking, a good TV upscaler is considered to be as good as or better than a GPU.

How the Scaler Sausage Gets Made

The most basic function of a video scaler is to take whatever image it receives — 480i, 720p, 1080p — and stretch it across the entire screen. Without this functionality, a 1080p signal would take up just a fraction of a 4K television’s display. This simple resizing is typically done by taking each individual 1080p pixel and creating four pixels out of it (remember, 4K is four times the pixels of 1080p).

But many TVs and Blu-ray players do more than just perform a simple 1:4 mapping. They also use video processing techniques to extrapolate what details ought to be present in the scene. How well this works depends on the type of content being upscaled.
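If you want to see the difference between the two approaches yourself, here’s a minimal sketch using OpenCV in Python. The file name is a placeholder, and bicubic interpolation stands in for the far more sophisticated (often proprietary) filters real TV scalers use:

```python
import cv2

# Nearest-neighbor reproduces the simple 1:4 mapping described above: each
# 1080p pixel becomes a 2x2 block of identical pixels. Bicubic interpolation
# stands in for the smarter "extrapolate what ought to be there" processing.
frame = cv2.imread("frame_1080p.png")   # placeholder: any 1920x1080 image

naive = cv2.resize(frame, (3840, 2160), interpolation=cv2.INTER_NEAREST)
smart = cv2.resize(frame, (3840, 2160), interpolation=cv2.INTER_CUBIC)

cv2.imwrite("upscaled_nearest.png", naive)
cv2.imwrite("upscaled_bicubic.png", smart)
```

Compare the two outputs along any hard diagonal edge and the difference is immediately obvious.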

Image by DisplayNinja

In the image above, you can see how the upscaled 4K is much more nuanced than the simple 1:4 mapping in the second grid from the left. If you’re having trouble seeing the difference between the 1080p upscale and the native 4K, look at the left-side blocks in the very first row and the right-side blocks in the very last row. The native 4K image resolves into distinctly different colors than the 1080p image in R1C2 (That’s 1st row, 2nd column), R1C3, and R8C8. As the amount of available horsepower in televisions has improved, the quality and sophistication of their integrated upscalers have grown as well. Some modern TVs have sophisticated sharpening algorithms to reverse the blur created by upscaling and interpolation algorithms good enough to almost match the precision of a native 4K signal.

How does all this look with real content? A 2017 Rtings article can help answer that question. The first image below is from a 4K set displaying 1080p in upscaled 4K, while the second is native 1080p.

Image by Rtings

If you have trouble seeing a difference between the two images, open both of them in a new tab and focus your eyes near the center of the image. See the house with a brown roof near the center of the image, with three windows facing approximately south-southwest and a fourth pointed east? (All directional cues based on north being “up”, not the direction of the sunlight). Look at that specific spot in both images, and the roofs in the buildings immediately adjacent. The difference should jump out at you. In this case, even using TVs that date back to 2015, the 1080p upscale to 4K is better than the 1080p image.

Image by Rtings

If you have an older TV or a budget 4K model, there’s one obvious method of improving your TV’s upscaling: Buy a better television. Unfortunately, it’s impossible to predict how well this will work without knowing exactly what you own now and what you plan to purchase to replace it. The older your current TV, the better the chances that a new set will deliver upgrades in all respects, but many of those improvements may have nothing to do with the way your upscaler handles sub-4K content.

If you aren’t happy with your current TV, can’t replace it at the moment, and happen to own a high-end Blu-ray or UHD player, you can also try running content through its upscaler rather than relying on the television to handle it. In some cases, a top-end UHD Blu-ray player may deliver a better experience than an entry-level 4K TV from a few years back. If you’re still using a DVD player to feed a picture to a mediocre 1080p or 4K panel when/if you play DVDs, and you can swap over to a high-end Blu-ray/UHD Blu-ray player instead, I’d try it. It may or may not help, but it definitely won’t hurt. What you’re trying to do here is route the signal through the upscaler that’ll give it the best quality kick.

Still need a higher-quality picture? You’re in luck.

Real-Time AI Processing

I haven’t tested the most recent Nvidia Shield myself, but there’s a demo you can actually play with on Nvidia.com to see the effect the device offers. Here’s a screenshot of the effect. I’ve positioned the slider over the lizard’s eye because it’s the easiest place to see the upscaler’s impact:

Left: Regular upscale
Right: Nvidia Shield

Still not clear? Here’s an enlarged version of the same screenshot.

The image on the left shows a traditional upscaler; the image on the right shows Nvidia’s Shield when asked to scale up 720p or 1080p content to 4K (content below 720p is not supported for upscaling, at least not yet). The AI component of the upscaler obviously improves the overall image quality. In my experience with applications like Topaz Video Enhance AI, this is a fair representation of the improvements that can be achieved.

Third-party reviews of the Shield agree. Slashgear writes that when it works, the effect is “fairly astonishing.” So far as I’m aware, the Shield is currently the only set-top box offering this kind of functionality.

Preprocessing Video

This step is technically optional because there’s no reason you can’t take a DVD source file and feed it directly through upscaling software (more on this below). Often, however, this is not an ideal way to extract maximum quality from your source. It is possible to achieve meaningful improvements in image quality with traditional video editing tools. The first image below is from Deep Space Nine episode 6×06, “Sacrifice of Angels:”


Here’s the same image after being processed but before upscaling. I’ve applied denoising filters followed by a sharpening filter. There’s a careful balance to be struck with some of these techniques, between blending away features and drawing out detail.

Here’s a slider for a direct comparison between the two. Both frames have been resized to 2560×1920 to make it easier to see the differences between them.
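The project itself does this pre-processing in AviSynth, but the same idea can be sketched in Python with OpenCV: denoise first, then apply an unsharp mask. The filter strengths and file names below are placeholder values; push the denoiser too far and you blend away exactly the detail you’re trying to recover.

```python
import cv2

# Illustrative analogue of the pre-processing step (the real pipeline uses
# AviSynth filters): denoise, then sharpen with an unsharp mask.
frame = cv2.imread("ds9_frame.png")     # placeholder source frame

# Non-local-means denoising; h and hColor control luma/chroma strength.
denoised = cv2.fastNlMeansDenoisingColored(frame, None, 5, 5, 7, 21)

# Unsharp mask: subtract a blurred copy to exaggerate edges.
blurred = cv2.GaussianBlur(denoised, (0, 0), sigmaX=2.0)
sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)

cv2.imwrite("ds9_frame_preprocessed.png", sharpened)
```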

Video Upscaling Via Third-Party Software

For best results, you can use a third-party upscaler, such as Topaz Video Enhance AI. I’ve made extensive use of Video Enhance AI as part of the Deep Space Nine Upscale Project and can confirm that the application is capable of yielding stunning results. The software currently supports upscaling on AMD and Nvidia GPUs, as well as Intel integrated solutions. AMD APUs are also supported.

The reason we can now “enhance” images to improve their clarity is that AI-based applications are capable of analyzing a video field and estimating what detail would exist if the image were of higher quality already.

One of the typical ways to train a video-enhancing application such as Video Enhance AI is to provide the neural net with the same image or video in both high and low quality. The neural net is then tasked with finding a way to make the low-quality source look as much like the high-quality source as possible. Instead of trying to teach a computer what lines and curves look like by providing painstaking examples, we’ve developed the ability to make the computer do the work of teaching itself. Saying “Enhance” has gone from a complete joke to a plausible reality in a matter of a few years.
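To make that concrete, here is a deliberately tiny PyTorch sketch of paired low/high-quality training (the general idea only, not Topaz’s actual model or pipeline). The three-layer network is SRCNN-style, and the tensors are random stand-ins for real degraded/pristine frame pairs:

```python
import torch
import torch.nn as nn

class TinySR(nn.Module):
    """SRCNN-style toy network: the input is a low-quality frame already
    resized to the target resolution; the net learns to restore detail,
    not to change the image's size."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 9, padding=4), nn.ReLU(),
            nn.Conv2d(64, 32, 5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 3, 5, padding=2),
        )

    def forward(self, x):
        return self.net(x)

model, loss_fn = TinySR(), nn.L1Loss()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Random stand-ins; real training pairs the same footage in low and high quality.
low  = torch.rand(8, 3, 128, 128)   # degraded input, pre-resized to target size
high = torch.rand(8, 3, 128, 128)   # matching high-quality ground truth

opt.zero_grad()
loss = loss_fn(model(low), high)    # make the output look like the HQ version
loss.backward()
opt.step()
```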

Applications like Topaz Video Enhance AI can cost several hundred dollars and they don’t run in real-time. The result of using these applications, however, is a vastly better picture than you’ll see from any other source.

And, again, a slider for comparison. This time, QTGMC pre-processed output is on the left, and the upscale is on the right. If you’d like to see more examples of what Topaz VEAI is capable of, I’ve been experimenting on Star Trek: Voyager as well, as shown in the slideshow below:

Topaz supports multiple models to handle various kinds of video, from models designed for deinterlacing (though not reverse telecine) to models designed for video that’s been damaged in various ways and models that can be tuned to your needs in a specific clip. With a little color grading, a fairly dramatic overall improvement can be achieved. This slider shows a base frame on one side and the final output, with color grading, on the other.

I suspect we’ll see AI video processing and upscaling become more important over time as Intel, Nvidia, and AMD introduce their next-generation graphics processors. There’s an awful lot of old content orphaned on poor-quality source, and building specialized AI processors to fix it will likely consume some silicon for all of the concerned parties over the next few years. The latest version of the Deep Space Nine credits that I’ve created is shown below:

Upscalers are amazing and only getting better. That’s true no matter how you consume content or what technology you use to do it. Depending on the shows you like and how much time you want to sink into the project, there are tremendous improvements to be had… or you can just wait a few years, and buy a better TV. You can read more about DS9UP at the links below.

Feature image on top shows the impact of upscaling as well as color correction via Photoshop.

Now Read:

July 12th 2021, 12:30 pm

We Might Be Able to Deflect an Asteroid by Sacrificing Satellites

ExtremeTech

Credit: NASA

Some days, it feels like humanity is doing everything it can to hasten the end of the world, from creating weapons that can turn Earth into a nuclear hellscape to pumping formerly sequestered carbon into the atmosphere at ever-increasing rates. Still, these are things we as a species could stop doing if we wanted. We can’t just decide not to get wiped out by an asteroid. Stopping a big space rock from colliding with Earth would be a tall order with our current level of technology, but scientists now believe telecom satellites could help avert doomsday if an asteroid has Earth in its sights. 

Ideally, astronomers would see an asteroid collision coming many years in advance, giving us time to develop a strategy for deflecting it. What happens if there isn’t much time, though? Assuming Bruce Willis is unavailable, a few big satellites might do. A team of scientists from Airbus has developed a mission concept for the European Space Agency (ESA) called Fast Kinetic Deflection (FastKD). The gist is that we run a few big telecom satellites into the approaching space rock, and hopefully, that knocks it off course. 

The FastKD study used a 1,000-foot asteroid as an example. An object of this size would cause widespread destruction if it collided with Earth, but it’s still small enough that we might be able to nudge it out of the way. Large telecom satellites usually sit in high geostationary orbits. With a boost in the right direction, such a satellite could be placed on a collision course with an approaching asteroid where its four-to-six-ton bulk could help push it clear of Earth. The study estimates that it would take ten such collisions to deflect the theoretical 1,000-foot impactor. 
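A back-of-the-envelope momentum budget shows why ten satellites is a plausible number. Every figure below other than the satellite mass and asteroid size is an assumption for illustration, including the rock’s density, the impact speed, and a perfectly inelastic collision:

```python
import math

# Back-of-the-envelope deflection math. Density, impact speed, and the
# perfectly inelastic collision are all assumptions for illustration.
diameter_m   = 305        # ~1,000 feet
density      = 2000       # kg/m^3, assumed rocky/rubble composition
sat_mass     = 5000       # kg, mid-range of "four-to-six-ton"
impact_speed = 10_000     # m/s relative velocity, assumed

asteroid_mass = density * (4 / 3) * math.pi * (diameter_m / 2) ** 3
dv_per_hit = sat_mass * impact_speed / asteroid_mass

print(f"Asteroid mass: {asteroid_mass:.2e} kg")              # ~3e10 kg
print(f"Delta-v per satellite: {dv_per_hit*1000:.1f} mm/s")  # ~1.7 mm/s
```

A couple of millimeters per second sounds laughably small, but applied a year or two before impact, it shifts the rock’s arrival point by tens of thousands of kilometers, which is why a handful of sacrificial satellites could plausibly be enough.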

But why use telecom satellites at all? FastKD imagines a troubling scenario in which we only have a few years of warning. A deflection mission might take six to 18 months to reach its target, leaving humanity very little time to do the necessary engineering to get a dedicated asteroid deflector into orbit. Telecom satellites, on the other hand, are big, heavy, and always available. In 2019, there were 15 suitable satellites in orbit. 

This is all still speculation — having only examined a few asteroids up close, we don’t know how they would behave when hit with an impactor. There’s evidence that many asteroids could be more like floating rubble piles, which would make deflection much more difficult. We should have a better handle on our options next year when the upcoming Double Asteroid Redirection Test (DART) spacecraft (above) approaches the double asteroid Didymos. DART will launch a 1,100-pound impactor at the smaller of the two rocks (sometimes called Didymoon) to see if the 500-foot object changes trajectory. This could serve as validation or refutation of the FastKD plan.

Now read:

July 12th 2021, 9:31 am

Even Backup Storage Provider Backblaze Can’t Make Money Farming Chia

ExtremeTech

The Backblaze 4U storage pods can save up to 480 TB of user data.

When the storage-based cryptocurrency Chia came out, hard drive prices shot up as a surge of demand stripped scarce product from store shelves. SSD prices turned out to be minimally impacted, even though Chia plotting eats consumer drives like candy, but large-capacity HDD prices were definitely affected.

The bulk storage provider and backup company Backblaze has published its own evaluation of whether the company could make money mining Chia. We've seen companies take advantage of unique positioning to earn a lot more money via cryptocurrency than you'd typically expect. The Greenidge power plant near Seneca Lake, NY, is currently using 87 percent of its power-generating capacity to mine cryptocurrency and making a great deal of money in the process. Backblaze doesn't have the advantage of self-generated power, but the company is a storage provider experimenting with a storage-based cryptocurrency. Surely this is an easy win?

According to Backblaze, no. In fact, the company’s estimates suggest making money with Chia is quite difficult. Backblaze initially decided to get mining up and running in the cloud as a way to provide a service to its customer base. Once it had some expertise under its belt, it turned its attention to the larger question of whether a storage provider like itself could make money in such an endeavor.

Backblaze ran into two problems. First, its business is largely HDD-based. That's good for farming but bad for plotting (plotting Chia is incredibly hard on SSDs, but you need them for the process to run at any kind of speed). The cost of buying enough SSDs to keep things rolling represented a serious expense, according to the company. This, however, was not the most serious problem.

Backblaze’s estimates indicate that even using 150PB of its own storage buffer is not enough capacity to make money on Chia in the long term. The company assumed that it would allocate 150PB of its storage, that the value of Chia would remain constant, and that it would make $250,000 its first week. That’s actually quite good. The problem is what happens afterward:

Based on the rate at which the Chia network is growing (33 percent each week, according to Backblaze), even starting off with 150PB worth of HDDs is not enough to keep pace with the rapid expansion of the market. Earnings each week appear to be roughly 75 percent of the week before, decreasing asymptotically over time. This produces a curve with a farming income that drops to nearly zero well before the 20-week mark. Things get better when Backblaze assumes a constant growth rate, but not by much.
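Both numbers hang together mathematically. If the network grows 33 percent per week while your 150PB stays fixed, your share of the total each week is 1/1.33, or roughly 75 percent of the week before. And if that ratio held exactly, the geometric series converges: starting from $250,000 in week one, lifetime earnings would top out around $250,000 / (1 - 0.75) = $1 million, with the overwhelming majority arriving in the first few months. That squares with Backblaze's front-loaded revenue curve.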

In this model, the long tail of earnings leaves Backblaze making essentially nothing after week 16, but the company still earns the overwhelming majority of its revenue within the first three months. This seems to be the case for individuals as well; several people I know who launched Chia mining farms when the cryptocurrency was new have already retired from the market after their earnings plunged.

Once costs were factored into the equation, the math didn’t favor Chia mining. Under the exponential growth model, Backblaze’s theoretical 150PB farming pool would cease generating net revenue if it launched more than seven weeks from the time the company performed its initial analysis. If the steady growth model is accurate, costs exceed income “around week 28.”

Backblaze notes that it was never planning to hold Chia speculatively, so it made no sense for the company to mine under the assumption that coins might dramatically increase in value at some point in the future. The company still intends to offer the option to farm Chia in the cloud to its customers, because “we love watching people experiment with storage,” but it doesn’t mince words about the value of mining Chia. Backblaze writes: “as our analysis suggests, unless you can continue to grow your plots, there will come a time when it’s no longer profitable.” The company’s post also acknowledges concerns regarding the electricity consumption and hardware requirements of various cryptocurrencies and their impact on the electronics market.

Chia has recently added mining pools, but given how quickly one’s investment could lose value, that may not be much help in the long term. This doesn’t look like a great way to invest one’s money.

Now Read:

July 12th 2021, 9:31 am

ET Weekend Deals: $80 Off Apple Watch Series 6, Dell G5 Core i5 and AMD Radeon RX 5300 Gaming Desktop

ExtremeTech

Today you can pick up an Apple Watch Series 6 smartwatch with an $80 discount. There's also a deal on a Dell gaming desktop that comes equipped with an Intel Core i5 processor and an AMD Radeon RX 5300 graphics card for just $549.99.

Apple Watch Series 6 40mm GPS Smartwatch ($319.00)

Apple’s Series 6 smartwatch has built-in hardware for tracking your blood oxygen level and heart rate. These features as well as a built-in fitness tracker make the Watch Series 6 an excellent accessory for any exercise routine. This model is also up to 20 percent faster than its predecessor, the Watch Series 5. You can now get one of these watches from Amazon marked down from $399.00 to $319.00.

Dell G5 Intel Core i5-10400F Gaming Desktop w/ AMD Radeon RX 5300 GPU, 8GB DDR4 RAM, 256GB NVMe SSD and 1TB HDD ($549.99)

Dell built this gaming desktop with an Intel Core i5-10400F and an AMD Radeon RX 5300 graphics processor. Together, this hardware can run games at high settings at 1080p resolution. The system also has a unique front panel that looks cool and edgy, and you can get it now marked down from $929.99 to just $549.99 from Dell with promo code DTG5AFF28.

Dell S2721HGF 27-Inch 1080p 144Hz VA Curved Gaming Monitor ($229.99)

Dell's S2721HGF gaming monitor was built with an ultra-fast 144Hz 1080p display panel that provides a highly responsive gaming experience. The monitor is also curved, which helps make games feel more immersive. Currently, Dell is selling these monitors marked down from $279.99 to $229.99.

Featured Deals:

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 9th 2021, 6:30 pm

This Refurbished 8-Inch Samsung Galaxy Tab A is On Sale for 60 Percent Off

ExtremeTech

Getting the latest in personal computing technology can be an expensive endeavor, especially with most companies that produce computers and tablets perpetually improving, upgrading, and replacing their offerings. But it's possible to snag a deal if you know where to look, and some of the best computing technology can be yours for a much more affordable price. Considering a refurbished device can help you find what you need.

One such device is the Samsung Galaxy Tab A 8” 16GB – Black (Refurbished: Wi-Fi Only), which is now on sale for the reduced price of $119.99, a fantastic saving of 60 percent off the full ordinary purchase price. Nab a great deal on a quality refurbished device.

The Samsung Galaxy Tab A 8” 16GB is a high-quality tablet, perfect for getting through your daily computing needs. The eight-inch screen features a vivid, vibrant 800-by-1280-pixel display in high color. The device features two quality cameras, a five-megapixel camera on the front and an eight-megapixel camera on the back, for image capture as well as video calling. Built-in Wi-Fi support lets you easily connect to any compatible network, and 2GB of RAM and 16GB of internal storage help keep the experience fast and smooth. The tablet ships with Android 7.0 Nougat pre-installed and ready to go, running on a 1.4GHz quad-core processor, all of which offers a simple and clean interface and user experience. The device has a “B” quality rating, indicating that it is in good working order with only light scuffing, scratches, or dents on the bezel, case, or body.

The Samsung Galaxy Tab A 8” 16GB – Black (Refurbished: Wi-Fi Only) is now on sale for just $119.99, a significant saving of $179.01 off the full purchase price of $299.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 9th 2021, 6:30 pm

Android Antivirus Apps Are Useless — Here’s What to Do Instead

ExtremeTech

Android has grown to become the largest computing platform on the planet, and that makes it a target. You can’t spend much time on the internet without hearing about some new piece of Android malware that’s going to definitely, 100 percent wreck your phone. These reports are always based in fact, but they can overstate the real risks of picking up a piece of malware, and the definition of malware can be quite vague. Security firms are usually pushing a virus scanning app of some sort. However, Android is by its very nature more secure than a desktop computer, so maybe you don’t need these security apps. You’ve probably already got what you need.

The Scare Tactics

In a 2019 report from AV-Comparatives, we learned that most of the antivirus apps on Android don’t even do anything to check apps for malicious behavior. They just use white/blacklists to flag apps, which is ineffective and makes them little more than advertising platforms with some fake buttons. Shocking and upsetting, right? They can get away with it because true Android viruses that take over your device are not as common as you’d expect. “Malware” can encompass milder threats like apps that harvest personal information or trigger pop-up ads. You still want to avoid those, of course, but malware scanners aren’t going to help.

Android and other mobile platforms have their roots in the modern era, when programmers understood the dangers of the internet. PC malware has conditioned us all to expect the worst; it can sneak onto your system simply because you visited the wrong website with a vulnerable browser. These “drive-by downloads” aren't feasible on Android without a pre-existing infection. On Android, you have to physically tap on a notification to install an APK downloaded from a source outside the Play Store. Even then, there are security settings that need to be manually bypassed. That's not to say it's impossible for Android to have a severe zero-day bug that allows someone to sneak apps onto your phone, but that would have to be an extremely delicate, costly operation. Unless you're a diplomat or have high-level security clearance, it's unlikely anyone would bother with such a scheme.

So, what about malware on the Play Store? Again, that depends on what you mean by malware. The most severe security risks will never make it into the store. Google’s platform has the ability to scan for known malware when it’s uploaded. There’s also a human review process in place for anything that looks even a little bit questionable. You might occasionally hear about some “malware” apps in the Play Store, usually related to information harvesting or advertising shenanigans. Google deals with these quickly, but anti-malware apps won’t catch this sort of thing.

The solution pushed by AV companies is to install a security suite that manually scans every app, monitors your Web traffic, and so on. These apps tend to be a drain on resources and are generally annoying with plentiful notifications and pop-ups. You probably don’t need to install Lookout, AVG, Norton, or any of the other AV apps on Android. Instead, there are some completely reasonable steps you can take that won’t drag down your phone. For example, your phone already has antivirus protection built-in.

What You Should Do to Stay Safe

Your first line of defense is simply to not mess around with Android’s default security settings. To get Google certification, each and every phone and tablet comes with “Unknown sources” disabled in the security settings. If you want to sideload an APK downloaded from outside Google Play, your phone will prompt you to enable that feature for the originating app. Leaving this disabled keeps you safe from virtually all Android malware because there’s almost none of it in the Play Store.

There are legitimate reasons to allow unknown sources, though. For example, Amazon’s Appstore client sideloads the apps and games you buy, and many reputable sites re-host official app updates that are rolling out in stages so you don’t have to wait your turn. Along with the Play Store, you also have Google Play Protect, which scans your apps for malicious activity. Updates to Play Protect roll out via Play Services, so you don’t need system updates to remain protected. In most cases, installing a third-party AV app just duplicates the work of Play Protect.

Users have been rooting their Android phones ever since the first handsets hit the market, but it's less common these days. The platform now offers many of the features people once rooted to acquire. Running rooted Android is basically like running a computer in administrator mode. While it's possible to run a rooted phone safely, it's definitely a security risk. Some exploits and malware need root access to function and are otherwise harmless even if you do somehow install them. If you don't have a good reason to root your phone or tablet, don't open yourself up to that possibility.

Some Android apps may not be “malware” per se, but you might not want them on your phone because they snoop through your data. Most people don't read the permissions for the apps they install, but the Play Store does make all that information available. Since Android 6.0, apps have needed to request access to sensitive permissions like your contacts, local storage, microphone, camera, and location. If an app has reason to access these features (like a social networking app), you're probably fine. If, however, a flashlight app is asking for your contact list, you might want to think again. The system settings include tools to manually revoke permissions for any app.

It really just takes a tiny bit of common sense to avoid Android malware. If you do nothing else, keeping your downloads limited to the Play Store will keep you safe from almost all threats out there. The antivirus apps are at best redundant and at worst a detriment to your system performance.

Now read:

July 9th 2021, 4:29 pm

Windows 11 Will Support Rolling Back to Windows 10, but Not for Long

ExtremeTech

Microsoft took the wraps off Windows 11 recently, and we expect the new OS to arrive later this year. Upgrading to a new version of Windows is often a painful process, and in the past, you were stuck even if the new software ruined your workflow. It’s different this time: Microsoft says you’ll be able to go back to Windows 10 if you don’t like Windows 11. You’ll only have 10 days to decide, though. 

How will you know if Windows 11 is worth using? There’s a preview program for Windows 11, but the preview builds are still missing some elements of the final release. You don’t have to mess with the Insiders builds at all — you can install the final version when it’s available, and take it for a spin. 

This news comes by way of a PDF that Microsoft has provided to PC manufacturers. It’s an FAQ format, and among the various redundant queries is this gem: “Can I go back to Windows 10 after I upgrade if I don’t like Windows 11?” The answer is a resounding yes… for 10 days. You’ll have that long to decide to roll back to Windows 10. Wait any longer, and you’re locked into Windows 11 unless you reformat your system. 

If you opt to return to Windows 10, you'll be able to do so with just a few clicks. The option is under Settings > Update & Security > Recovery. Simply select the previous Windows 10 build, and you'll be right back where you were with all your apps and data intact. In the past, updating to a new version of Windows, even one of the twice-yearly feature updates, would wipe out your old system restore points.

An early version of the new start menu.

If you don’t even want to consider Windows 11, that’s also a supported option. Microsoft will support Windows 10 with updates through October 2025. Depending on how successful Windows 11 is, it might even have to extend Windows 10 support a bit longer. 

Windows 11 makes a number of major UI shifts including a new start menu without live tiles. You’ll have quicker access to pinned apps and recent files, but fans of Cortana (if there is such a thing) will be disappointed to see Microsoft’s voice assistant is being stuffed into an out-of-the-way app. Rounded corners also abound in the new software. The Windows Store will get a major update, as well, with support for more application types and payment methods.

Now read:

July 9th 2021, 1:57 pm

How to Upscale Star Trek: Deep Space Nine

ExtremeTech

I've written a number of articles about Deep Space Nine and my efforts to improve and upscale it over the past 16 months, but this is the long-promised tutorial: the explanation of how to do what I've done, in more detail than I've shared before. If you've been waiting for the step-by-step explanation, this is what you've been waiting for.

 

Why I Don’t Use Terms Like HD, UHD, or 4K

The reason I don't call this version of DS9 “DS9 4K” or even “DS9 HD” is that this isn't HD-quality footage, and nothing I've learned in the past 16 months can change that. Upscaling can only improve detail that already exists unless you use an approach that seeks to generate new detail, and I haven't. Native DVD frames of Deep Space Nine are 720×480, or 345,600 pixels. That's 37.5 percent as many pixels as a frame of 720p (1280×720, or 921,600 pixels) and 4.1 percent as many pixels as a frame of 4K content (3840×2160, or roughly 8.3 million pixels). Anyone who promises you a “4K” version of Deep Space Nine has either smuggled film out of the studio and acquired some serious VFX chops or is talking up a resolution quality they can't actually deliver.

Additionally, Deep Space Nine’s seasons are of highly variable quality. Seasons 1 & 2 are much worse than the rest of the show. While these can be improved, they still begin from a much lower baseline and benefit less.

Some of Deep Space Nine’s early special effects, especially external station shots and the credits, are in very poor condition on the DVD. The shot of Jake Sisko fishing on the holodeck in Emissary was done with a green screen whose detail transferred very poorly. I can only assume that whoever created the Season 1 & 2 DVDs really hated DS9’s pre-Dominion storyline because that holodeck background made it to disc with all the clarity of a mid-70s Playboy shoot.

I am willing to tolerate these acknowledged flaws in the DS9 Upscale Project because nobody I know of has had better luck repairing them. Improved color grading would undoubtedly help, but Seasons 1 & 2 look more like VHS tapes than DVD. Let me show you what I mean. I’m going to use an online tool to create easy comparison shots — unfortunately, I can’t embed its output in-page. “Resized” images are from the original DVD, blown up to 2560×1920 to compare against upscaled shots. Resized shots have been put through AviSynth to extract progressive frames but have not been processed with QTGMC.

Upscaled station vs. resized station: This shot shows how marginal some of the improvements are, due to problems with the underlying footage. There’s not enough detail to upscale in the first place.

Upscaled runabout vs. resized runabout: The upscale does nothing to harm the image, but it doesn’t exactly help much, either. Again, this is the result of the base footage, not the upscaler. The improvement is even smaller here.

Season 1-3 credits versus Season 6 credits (resized): This comparison isn't between an upscaled version and a resized image. It shows two resized shots from the credits, one from Season 2 and one from Season 6, and it illustrates just how much the Deep Space Nine DVD quality improved over time. The resized DVD Season 6 credits look better than the upscaled Season 2 credits in the shot below.

Season 1-3 credits versus Season 6 credits (upscaled): This is the same comparison between upscaled versions of the same shots. The quality jump is not small.

This is a screenshot of the slider comparison above. To actually *use* the slider, you’ll have to click the link — but this is the difference between upscaling the S2 credits and upscaling the S6 credits with the exact same procedure. Click to enlarge.

But while the DS9 Upscale Project does not represent what Deep Space Nine would look like in native 720p, 1080p, or 4K, it represents the highest overall image quality I have achieved in 16 months of work. It is far ahead of any real-time method of improvement that I'm currently aware of. Achieving this level of quality requires the original DVDs; you are unlikely to achieve the same results from other sources. As I've said previously, my work is not an excuse or justification for pirating Deep Space Nine.

Because this is a tutorial, we'll spend less time discussing the overall project and more time on the nuts and bolts of how you can perform this work yourself. From the beginning, I've promised that once I figured out a method reliable enough to recommend, I would do so. This method is not absolutely perfect (if you zoom in, you'll see Topaz creates some subtle blue lines around the logo that don't exist by default), but the low-visibility error, especially at an 8-foot viewing distance on a TV, is far outweighed by the degree of improvement.

This guide explains how to create the highest-quality version of Deep Space Nine I currently know how to create without requiring any frame-by-frame or scene-by-scene editing. It's a one-stop “best fit” line, but it's a good one if you've got the horsepower for it. Because not everyone does, I'll publish additional tutorials in the future with advice on how to create a still-improved version of the show without the same GPU and CPU requirements.

The one piece of paid software this job relies on is Topaz Video Enhance AI. When I’ve compared TVEAI in the past to solutions like DVDFab, TVEAI continues to come out well ahead.

Be advised that TVEAI is still a work in progress. I would not recommend it as a paid product if I did not believe it capable, but capable and perfect are not the same thing. The product is young and the field is rapidly advancing.

What You’ll Need (Hardware)

The hardware requirements are vague because they represent more of a soft floor than a hard cutoff. A Ryzen 7 4800U mobile processor can run the AviSynth script I'll recommend in 3 hours, 45 minutes, and upscales one frame of video every 1.08 seconds or so on its integrated GPU. An RTX 2080 in Topaz VEAI 2.3.0 runs more like 0.4 seconds per frame. That's roughly 19.5 hours of upscaling per episode for the 4800U and 7-8 hours for an RTX 2080.
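Those per-episode figures follow from the frame count. A typical episode runs to roughly 65,143 frames (a number we'll meet again in the FFmpeg step), so 65,143 frames × 1.08 seconds is about 70,000 seconds, or 19.5 hours, while 65,143 × 0.4 seconds is about 26,000 seconds, or a bit over seven hours.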

There are several reasons why I recommend outputting your video as a series of PNG files rather than video. First, it allows you to stop and restart the encode. If you can afford to leave your system encoding overnight but need it during the day, you can use it to do this work in the off-hours, without needing to worry about losing a video file that was only half complete.

The second reason we use PNG files is that Topaz's JPG output sometimes has visual errors that meaningfully affect final video quality. Storing an entire episode of DS9 in PNG files can require up to 280GB of storage, and double that if you want to do a two-parter in one go. An SSD is not required, but I strongly recommend one.
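For scale, 280GB spread across the ~65,143 frames of a typical episode works out to roughly 4.3MB per 2560×1920 PNG, which is part of why a fast drive keeps this stage from bottlenecking on I/O.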

Finally, TVEAI sometimes fails to preserve audio properly, requiring you to patch it in manually regardless.

A 350GB free-space budget is actually a bit more storage than you should need for just the PNGs for an episode; it should be enough for the VOB files, PNG files, AviSynth .h264 or .h265 intermediate files, and the final output without requiring you to delete anything partway through the process to free up space.

This tutorial will not work properly if you attempt it using previously-created MKV files or pirated files. The process is tuned to work exactly as written.

What You’ll Need (Software)

The only paid application on this list is Topaz Video Enhance AI, which is available for a free trial.

DVD Decrypter

Install DVD Decrypter and choose the region that corresponds to the disc set you’ve purchased.

Navigate to the “Mode” menu and click on “IFO.” This should give you a screen like the below:

Pick whichever episode you want to work on. DVD Decrypter will create a new directory for each DVD, but you’ll need to manually name the subdirectories if you want them to reflect the title of the episode in each. PGC 1 is the first episode on the disc, PGC 2 is the second, etc. You can choose to handle the entire DVD in one contiguous file, but the next step is somewhat simpler if you separate files here.

It typically takes 7-10 minutes for DVD Decrypter to do its work. Once you’ve finished with DVD Decrypter, you can close it. Closing the app sometimes leads it to throw an error (at least, it does on my own machine). You can kill it from Task Manager without harm if this occurs.

Output: VOB files, copied to your hard drive in a directory and a subdirectory organization of your choice.

DGIndex

DVD Decrypter created a directory on your local hard drive at “C:\DS9S1D1” or “C:\DS9S4D3,” depending on which disc you started with. Navigate to the episode you want to work with and you'll find a pair of files with names like “VTS_02_1.vob.” Launch DGIndex and navigate to “Open,” under “File.” Choose both VOB files from the appropriate directory and load them in order:

When finished, hit “Ok,” re-open the “File” menu, and choose “Save Project.” DGIndex will scan through your video file and complete its work. The end result of this will be a .D2V file.

Note: StaxRip sometimes likes to edit D2V files. In instances where a Deep Space Nine episode has a very high percentage of film (upwards of 95 percent), StaxRip will decide that the proper frame rate for your input video is supposed to be 23.976fps, not 29.97fps. As a result, running the script I recommend against an episode like “Emissary” will produce ~19fps video.

To avoid this problem, right-click on the D2V file you just created and choose “Properties.” Select “Read-only” and hit “OK.” StaxRip will yell at you about this problem when you load the file, but it won’t fail. Once you’ve set your project file to Read-only, we’re done with DGIndex.
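If you'd rather make the change from a command line, the attrib command sets the same flag (the file name here is hypothetical; substitute whatever you named your project file):

attrib +R "VTS_02.d2v"

Either way, the goal is simply to stop StaxRip from rewriting the file.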

Output: One .D2V file, to be kept in the same location as the .VOB files from the previous step.

StaxRip

Once you’ve created and locked your D2V file, launch StaxRip and load your D2V file by dragging and dropping it into the application. I’m working with the DS9 episode “Indiscretion” below:

This is what StaxRip will show you when you first load the D2V file. These are default outputs and settings.

There’s a lot going on here, but you don’t have to care about most of it. First, double-click on the “Mpeg2Source” window. This will open a sub-window where you can enter script parameters.

Unfortunately, I still don't have an easy way to present the script I recommend. You can copy it below. Don't worry about the fact that it isn't completely visible; it will copy correctly if you grab a bit of text from the paragraph below, then delete it.

# Inverse telecine: field-match the interlaced fields, then decimate back to film frames
TFM(pp=5, mode=2, micmatching=3).TDecimate(hybrid=1)
# Progressive repair, source-matching, denoising, and grain management
QTGMC(ShutterBlur=3,ShutterAngleSrc=180,ShutterAngleOut=180,SBlurLimit=8,Preset="Very Slow", TR2=3, InputType=1,SourceMatch=3, MatchEnhance=1.0, MatchPreset="Very Slow", TR0=2, TR1=2, MatchPreset2="Very Slow", sharpness=0.6, SMode=2, Rep0=22, Rep1=22, Rep2=22, RepChroma=True, Sbb=3, SubPel=4, NoiseProcess=1, ChromaNoise=True, GrainRestore=0.5, NoiseRestore=0.25,DenoiseMC=True, NoiseTR=2)
# Edge sharpening
pSharpen(strength=50, threshold=90, ss_x=4.0, ss_y=4.0)
# Anti-aliasing
MAA2(mask=1, chroma=true,ss=4, aa=128, aac=128, threads=8, show=0)
# Final sharpening pass; dest_x/dest_y also resize the output to 704x480
LSFmod(defaults="slow", strength=50, Smode=5, Smethod=3, kernel=11, preblur="ON", secure=true, Szrp=16, Spwr=4, SdmpLo=4, SdmpHi=48, Lmode=4, overshoot=1, undershoot=1, Overshoot2=1, Undershoot2=1, soft=-2, soothe=true, keep=20, edgemode=0, edgemaskHQ=true, ss_x=4.0, ss_y=4.0, dest_x=704, dest_y=480, show=false, screenW=1280, screenH=1024)

One way to improve your performance is to add the line “Prefetch(x)” to the bottom of the script, where “x” stands for the maximum number of threads you want to run. Sane values are 2-8. Lower values will leave you more CPU performance for other things. If you intend to use your PC while encoding, set this value to ~50 percent of your cores or less. Be aware that values larger than 8 are generally more likely to cause instability than to improve performance. If I wanted to encode two files simultaneously while also using a 16-core PC, I’d launch both StaxRip runs with a value of Prefetch(4) and leave myself eight threads for other tasks.
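For example, on a hypothetical 16-core machine you plan to keep using while it encodes, the bottom of the script might read:

Prefetch(4) # cap this encode at four threads, leaving the rest free for other work

(The # annotation is an AviSynth comment, so it can be copied along with the line.)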

To control your output formula, look to the right-hand side of the image. Click on the “x265” text, and a menu will open, offering you various other codecs and output options.

If you want to adjust those options to fine-tune them, click on the “Options” button. This will open an additional sub-menu:

x264 options. I recommend “Very Slow” and a quality level of “6”, though the latter is a bit arbitrary.

If you want to change your final container options, click on the blue text labeled “MKV.” This will give you an array of choices. Personally, MKV files work fine for me. StaxRip detects both audio tracks and preserves both through the encode process, so you don’t need to change these settings unless you specifically want to encode into different formats. Click “Next” and then hit “OK” to start the process.

This is what your screen should show after loading the AviSynth script. Pay attention to the output frame rate (23.97602) and the x264 / x265 presets. The resize resolution should be 704×480, not 720×480. Don't worry about the 10 percent error metric; the final output will be appropriately sized.

Pro Tip: If you are experimenting with script settings, don’t render an entire episode to do it. You can stop StaxRip at any time by hitting the “Abort” button. Once the application returns to the main page, hit “Next” again. The application will tell you that a Video and Audio encoding already exists and ask if you want to re-use the video you already rendered or re-encode it. Select “Re-use.”

This will create an MKV file of whatever partial episode you rendered out for test purposes. You can plug this into Topaz VEAI and check the output. Just remember to set the final rendering frame manually. Otherwise, you’ll render a ton of blank, black frames. They might upscale beautifully, but you’ll never be able to tell.

Output: One MKV file, assuming you are fine with StaxRip’s default options. If not, one MP4, MOV, or another output file of your choice and configuration. DS9 files encoded with these presets should be between 3GB and 5GB depending on how much noise you inject. Injecting noise will make your final output larger. This is your pre-processed, pre-upscale version of Deep Space Nine.

Topaz Video Enhance AI 2.3.0

Once you've started TVEAI, navigate to the directory where your MKV output from the previous step is saved. Open the file. You can use the { and } options to select the section of the video you wish to render; if you wish to upscale the entire file, you do not need to bother with these settings. Switch the “AI Model Selector” section at the top of the right-hand side of the application from “Suggested” to “All.” Choose “Artemis High-Quality v11” and “400%” below that.

Scroll down and choose your grain settings. I recommend between 2.4 and 2.7; above 3.0 gets pretty heavy. I recommend that you output to PNG files. TVEAI can output video files in ProRes at 4:2:2, but the final output files are quite large for video files, and you'll wind up compressing them again or taking them apart in FFmpeg and rebuilding them with less obnoxious size requirements anyway. Set TVEAI for PNG output and you can stop the upscaler at will, use your GPU for other things, and then fire it up again when convenient.

Output: A series of PNG files, typically created in a subdirectory of wherever your MKV file is stashed.

FFmpeg

FFmpeg is a command-line utility you can use to take your PNG files and wrap them back up into a video file. Once you've downloaded FFmpeg, you have two options. You can add it to your Windows PATH (a good idea, and the better security practice) or you can do what I did and dump the executables directly into your C:\Windows\System32 folder. There are all sorts of good reasons not to do the latter, related to security, best practices, and not encouraging end users to muck around in folders they ought not to touch, so I'll leave it up to you which method you prefer.
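If you go the PATH route, one blunt but common way to do it from a command prompt is shown below. This assumes you unpacked FFmpeg to C:\ffmpeg\bin (a hypothetical location); new command prompt windows will pick up the change:

setx PATH "%PATH%;C:\ffmpeg\bin"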

It’s not as bad as it might look, I promise.

Once TVEAI is finished with its work, visit the original subdirectory where you dropped the VOB files and from which you presumably ran StaxRip. You should find some audio files with the extension *.ac3. Feel free to rename them to something more recognizable, like “Indiscretion-2ch.ac3” and “Indiscretion-5ch.ac3.” The file with the larger Kbps number is your 5.1 audio; the smaller is your 2-channel audio. You can pick whichever you prefer as your primary track.

Copy both of these files into the same directory where your PNG files are stored. An upscaled episode of DS9 contains ~65,143 PNG files.

Open a command-line prompt by hitting the Windows key and typing CMD.

Open the subdirectory where your PNG files and audio files are stored. Click in the address bar and hit Ctrl-C to copy the address.

Return to your open command prompt. Type “cd”, hit the space bar once, and then paste the address you just copied. If you are working off a different drive (D instead of C), type “d:” and hit Enter first, or use “cd /d” to change drive and directory in one step. If you use default names for your files and subdirectories, your final output will be in a directory named something like: “C:\DS9S6D2\Sacrifice_4.00x_2560x1920_ahq-11_png”
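For example, if the default directory from the earlier steps landed on drive D (a hypothetical path), the whole navigation collapses to one command:

cd /d D:\DS9S6D2\Sacrifice_4.00x_2560x1920_ahq-11_png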

Once you’ve navigated to the appropriate directory, modify the following command to fit your own requirements. This command can be copied directly off the web page because there are no smart quotes involved (smart quote formatting is what wrecks AviSynth scripts in our backend). In the commands below, replace “Indiscretion-2ch.ac3” or “Indiscretion.mkv” with the file names of your own episodes and their associated audio tracks.

ffmpeg -r 23.97602 -f image2 -s 2560x1920 -i %06d.png -i Indiscretion-2ch.ac3 -vcodec libx264 -crf 16 -pix_fmt yuv420p -acodec copy Indiscretion.mkv

The name of the .ac3 file should be replaced with the name of the audio file you want to use. If you want to keep both audio tracks, we split the process up into two steps. There's undoubtedly a way to do this in a single command, and a speculative sketch appears below after the two-step version, but I haven't validated it against this exact workflow. Start with:

ffmpeg -r 23.97602 -f image2 -s 2560x1920 -i %06d.png -vcodec libx264 -crf 16 -pix_fmt yuv420p Indiscretion.mkv

This will compile your frames into an MKV file, but you won’t have any audio. When the video file has finished compiling itself, type:

ffmpeg -i Indiscretion.mkv -i Indiscretion-2ch.ac3 -i Indiscretion-5ch.ac3 -map 0 -map 1 -map 2 -c copy IndiscretionCombined.mkv

This will produce your final file with both audio tracks (the -c copy flag muxes the existing streams without re-encoding them). If you are upscaling at 200 percent instead of 400 percent, make the appropriate resolution changes to the “-s 2560x1920” variable. If you want to use H.265 instead of H.264, replace “libx264” with “libx265.” Lower or higher compression can be set by adjusting the crf number; lower values mean higher quality and larger files.
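As for the single-command approach mentioned above, FFmpeg's stream mapping should handle it in one pass. Treat this as a sketch rather than a tested recipe: -map 0:v takes the video from the PNG sequence (input 0), -map 1:a and -map 2:a take the audio from each .ac3 file, and -acodec copy passes both AC3 tracks through untouched.

ffmpeg -r 23.97602 -f image2 -s 2560x1920 -i %06d.png -i Indiscretion-2ch.ac3 -i Indiscretion-5ch.ac3 -map 0:v -map 1:a -map 2:a -vcodec libx264 -crf 16 -pix_fmt yuv420p -acodec copy IndiscretionCombined.mkv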

Bonus Round: Subtitles

If you want subtitles, you can have them. There's an additional step here involving a program called CCExtractor and an application called MKVToolNix.

To use CCExtractor, run the application and drag both of your VOB files into it, as shown below:

Navigate to the far right-hand side of the tab list. You can't actually see the “Execute” tab you're looking for at first, and there's no way to widen the application enough to show it by default. Ah well.

Now you have a .SRT file and can install MKVToolNix. MKVToolNix will finish this entire procedure up for us.

Drag your completed MKV file sans subtitles and your subtitles file into the application. MKVToolNix will ask you what kind of file you want to create. Choose to add a new source file to the current multiplex settings, as shown below.

Choose these options, select your final file name, and hit “Start Multiplexing.” Rejoice! You have finished an episode while retaining both audio tracks and subtitles.

How to Adjust StaxRip Image Quality

The only way to get exactly the output I've shown throughout this process is to follow my exact steps. But that doesn't mean there isn't room to experiment. If you feel the final output is inappropriately sharpened (or not sharp enough), adjust the sharpness setting in the QTGMC call. You can also experiment with removing the pSharpen and MAA2 calls altogether. I would strip LSFmod out last; there's resize code embedded in it that outputs at 704×480, resulting in a proper final file size of 2560×1920 as opposed to the over-stretched 2618×1920.

If you feel your final output is too plastic-looking, there are several ways to adjust for this. Drop sharpness in QTGMC to 0.2 and lower the “strength” variable of LSFmod. You can also try injecting more grain and noise into the final output. If you choose to raise NoiseRestore above 0.5, remove the “ChromaNoise=True” parameter from the QTGMC call; if you don't, the output will turn slightly greenish. The effect doesn't get very noticeable until above 0.75, but there's no reason to leave it there in the first place. A sketch of that adjustment appears below.
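Here, abbreviated to just the noise-related arguments, is what that tweak might look like; the values are illustrative, not exhaustively tested, and every other QTGMC parameter stays as it appears in the main script:

QTGMC(Preset="Very Slow", InputType=1, NoiseProcess=1, GrainRestore=0.5, NoiseRestore=0.6, DenoiseMC=True, NoiseTR=2) # ChromaNoise=True removed to avoid the green tint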

Increasing noise output in QTGMC may slightly improve overall image quality in Artemis-HQ, but the effect is not as large as it was with Gaia-CG in previous versions. It can still improve background starfields, however.

No Longer Left Behind

This article will be updated as my understanding of best practices continues to expand. Readers have requested that I take a look at Voyager, to see how best to adapt Deep Space Nine's processing methods to that show. In the future, I'll be swapping the Defiant for the Delta Quadrant. I've tossed a few screenshots into a slideshow below for those curious to see how Voyager holds up under the same processing methods, though I've only experimented a little with how best to tweak them. VOY was mastered with higher-quality footage than most of DS9, and the difference is obvious when you put them side by side. Consider this a preview; you can right-click any image and choose “Open in New Tab” to see the frame full-size:

 

Am I “done” with Deep Space Nine?

Yes — and no.

I'm still considering color grading, but DaVinci Resolve Studio can't edit or create MKV files, and the approach laid out above is based on them. Regrading color is also a new field for me, and I can't make any promises about how quickly I'll be able to tackle it. Color grading wasn't contemplated in the original project as I created it, and I do not wish to delay this tutorial even longer while I learn. I'm experimenting with frame-by-frame editing and post-processing using additional tools. These concepts are all outside the original project as I defined it.

In the beginning, I set out to create a method of upscaling Deep Space Nine that would do the show justice compared to Netflix or even what you could buy on DVD. I have no doubt that the quality of the show will improve as tools such as Topaz Video Enhance AI improve. Nevertheless, I have reached a point where I feel comfortable recommending this process to people.

One day, footage of the quality I've shown throughout this project will be available in real-time. By the time it is, I'll have built something better through the use of non-real-time rendering. I spent 20 years waiting for ViacomCBS to tackle this project, and I don't intend to take my eye off it until we either have a native 4K version of the show built directly by ViacomCBS or third-party AI tools have advanced to the point of rendering such a version redundant.

But the goal of the Deep Space Nine Upscale Project was never to wait until 2030 when the perfect tools become available. I created this project once it became clear that AI tools and free utilities like AviSynth could be used to create a version of Deep Space Nine that would be sufficiently better than baseline to be worth the effort of creating in the first place. This tutorial was always the story I wanted to write. Now, I’ve written it.

Different people will reach different conclusions on whether the show deserves this kind of effort. That’s okay. Every time I publish one of these articles, someone different pops out of the woodwork to say how the Netflix version or the baseline DVDs are fine. I’m glad people feel that way. As someone who loves Deep Space Nine, I want people to enjoy the show. If you’re happy with the version on TV without spending a few hundred (or thousand) hours improving it, I’m glad for you.

I wasn’t.

Now, nobody else has to be, either.

If you think you can beat my quality, do it. Let me know where the quality improvements come from and I’ll include them here and credit you.

I still hope ViacomCBS remasters Deep Space Nine. I don't think it will surprise anyone at this point to say I'd love to be part of such an effort. I've been asked how I'd feel if a remaster was announced next week, and my honest answer would be “thrilled.” I believe if such a thing happens, it'll be because my own project and others like it collectively illustrated to Paramount that there was an appetite for a better version of the show. I'm also willing to bet it'd sell better than TNG did at its Blu-ray debut if Paramount 1) committed to doing both VOY and DS9, 2) released both shows on disc and on ParamountPlus, and 3) didn't try to sell it for $80 a season.

If you love Deep Space Nine enough to even contemplate this project, you’re the person I had in mind when I pitched this story to my EiC, Jamie Lendino. It is not linear. The things we love rarely are. My most heartfelt thank-you to the multiple individuals who worked on Deep Space Nine’s VFX team and have contacted me throughout this process to offer insight and support.

Other stories and assignments may take me to the depths of the Delta Quadrant or bury me in CPUs but I will stand with you again, here, in this place where I belong.

Hruska out.

Now Read:

July 9th 2021, 1:57 pm

ET Deals: $100 Off Samsung Galaxy S21 5G Smartphone, Dell Vostro 15 5510 Intel Core i5 Laptop for $699

ExtremeTech

The Samsung Galaxy S21 5G is Samsung’s current flagship smartphone with performance that rivals the best phones money can buy. Today you can get the 128GB model of this phone with a $100 discount that drops the price to just $699.99.

Samsung Galaxy S21 5G 128GB Unlocked Smartphone ($699.99)

Samsung's Galaxy S21 5G smartphone features top-tier performance that rivals the best phones currently on the market. It comes equipped with a trio of cameras, including one that can record video in 8K, and it also has a 6.2-inch Dynamic AMOLED display as well as a long-lasting 4,000mAh battery. This phone sells regularly for $799.99, but you can buy one today from Amazon for just $699.99.

Dell Vostro 15 5510 Intel Core i5-11300H 15.6-Inch 1080p Laptop w/ Intel Iris Xe Graphics, 8GB DDR4 RAM and 512GB NVMe SSD ($699.00)

This business class system sports an Intel Core i5 processor that excels at running multiple applications at a time. The laptop is also relatively light at just 3.67 pounds. For a limited time you can get this system from Dell marked down from $1,327.00 to just $699.00.

Netgear Nighthawk R6700 802.11ac AC1750 Smart WiFi Router ($79.99)

The Nighthawk R6700 is one of the most popular Wi-Fi routers on the market. It offers reliable performance with speeds of up to 1,750Mbps across two bands. It also has built-in USB ports for adding network resources. Right now it's marked down at Newegg from $99.99 to $79.99.

Apple AirPods Pro ($197.00)

Apple's AirPods Pro utilize a new design that's different from the company's older AirPods. The key new feature is active noise cancellation. Each earphone also uses a custom driver and a high dynamic range amplifier to improve sound quality. You can snag them with a $52 discount from Amazon that drops the price from $249.00 to $197.00.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 8th 2021, 5:56 pm

Microsoft’s Cloud PC Promises to Put Your PC in Your Pocket

ExtremeTech

Microsoft is reportedly preparing to unveil its new Cloud PC service. It’s expected to debut next week, possibly during Microsoft’s Inspire conference, and it could change the way we access PC computing — at least in some cases.

This news comes from ZDNet, courtesy of longtime Microsoft watcher and prognosticator Mary Jo Foley. Cloud PC is a distinct service from Microsoft's Azure Virtual Desktop. Azure Virtual Desktop is intended for enterprise users who are heavily plugged into Microsoft's other cloud services and are charged partly based on usage. Cloud PC, in contrast, would carry a fixed monthly fee. Foley reports that Microsoft wants to sell Cloud PC as a “managed Microsoft 365 experience.” It was rumored last year that Microsoft would charge different fees for different system configurations, described as “Medium,” “Heavy,” and “Advanced,” but this has not yet been substantiated.

The idea of a subscription cloud service that gives you access to a PC any time you need one isn’t something most PC enthusiasts are going to go for on a regular basis, but PC enthusiasts probably aren’t the market for this product in the first place. If you consider yourself a PC enthusiast, the PC is probably where you spend a significant amount of your total electronics time. Cloud PC sounds as if it’s designed for end-users who need access to a PC only occasionally or in an emergency.

Cloud PC (assuming it launches as described) is another piece of Microsoft's transition away from being a company centered on selling you boxed copies of Windows and Office. Microsoft is pushing hard to be “Microsoft-as-a-Service,” where all of its products are available as part of an ecosystem. Today, for example, you're either a PC gamer or an Xbox gamer. Your access to the Xbox and PC ecosystems is mediated by whether you possess the appropriate device. Xbox Cloud Gaming and Game Pass are both intended to attack that problem. So is Cloud PC; it just takes the idea one step further and gives you access to the entire machine.

This is a service intended, I think, for people who only need to access a PC or high-end PC performance occasionally. Exactly how good a deal this is depends entirely on how much Microsoft charges for it. I am rather stingy when it comes to what I’d pay for a subscription service like this, but an accessible cloud PC for $5-$10 per month could be reasonable, depending on the specs of the PC in question. In theory, it might even be a better deal than buying a machine. What if you normally compute from a relatively svelte laptop, but you want to tap a high-end system for some media creation or 3D rendering? Renting access to a high-end PC for a month or three would be a hell of a lot cheaper than buying one outright.

If your PC is the center of your work or entertainment life, you probably wouldn’t spring for a Cloud PC. But we suspect the service is intended to address the millions of end-users who are smartphone or tablet-first and who encounter Windows PCs only on the periphery of their own lives. For such customers, a Windows PC they can log into occasionally at a small price per month may be far more practical than a machine they purchase and use only once in a while.

Now Read:

July 8th 2021, 5:56 pm

Save 72 Percent Off This VPN Unlimited and PlayStation Plus Bundle

ExtremeTech

Anyone with a proclivity or passion for technology is always looking for a way to make the time they spend online more convenient, more worthwhile, and safer. Given the way the internet continues to grow in sophistication, and internet users are far from always benevolent in their actions, it pays to think about the way you spend your time online, and the ways in which you secure your data. On the more fun side, subscribing to certain services can grant you a whole range of incredible features that are inaccessible without those subscriptions.

Right now, you can get a nifty bundle that can both make you safer online and improve your gaming experience hugely. The VPN Unlimited Lifetime + PlayStation Plus 2-Yr Subscription Stackable Code Bundle is now on sale for the low price of $89.99 when you use coupon code GETPLAYSTATION at checkout, a wonderful saving of 72 percent off the full purchase price of $318.

Enabling a virtual private network (VPN) on your devices gives you some of the best available protection for your information. VPN Unlimited is one of the top-rated VPN services, and this bundle provides lifetime protection across five devices, whether smartphones, desktops, tablets, or laptops. VPN Unlimited allows you to browse without limits and make use of VPN servers in more than 80 global locations, from the USA and the UK to Canada, Australia, and Hong Kong.

The bundle also includes two stackable one-year memberships to PlayStation Plus, the essential upgrade for PlayStation players that lets you play online with a huge community of gamers in popular games like Star Wars: Battlefront, Uncharted, and more. The service has a stellar rating of 4.8 out of five stars on Amazon, and it offers two free games per month, exclusive discounts and deals, and cloud storage for your saves and character profiles so you can play from anywhere.

The VPN Unlimited Lifetime + PlayStation Plus 2-Yr Subscription Stackable Code Bundle is on sale for just $89.99 when you use coupon code GETPLAYSTATION, saving you $228.01 off the regular purchase price.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 8th 2021, 5:56 pm

China’s Tencent Adds Face Scanning to Monitor Children Gaming

ExtremeTech

China takes fears of video game addiction among kids very, very seriously. It’s one of several countries that enforces a “gaming curfew” for those under 18, and now gaming heavyweight Tencent has implemented government-approved ID checks in dozens of popular mobile titles. These games will now use facial recognition to make sure no one underage is playing games too late at night. This is only the start—Tencent says its “Midnight Patrol” technology is coming to more games soon. 

Back in 2019, the Chinese government rolled out the curfew, which requires players to sign in with an ID number and real names. It became clear pretty quickly that kids who wanted to keep playing popular mobile games at all hours were finding ways. For example, they could use an older family member’s login or borrow a phone from someone else. It’s going to be harder to skirt the rules now that Midnight Patrol is rolling out. 

The first wave of games with Midnight Patrol includes some that are hugely popular with Chinese players, like Honor of Kings, but few of them have any presence outside the country. When a gamer fires up one of the affected titles during the nationally mandated curfew, they’ll have to submit to a facial recognition scan. The game will also interrupt the player after a period of time to do another scan, ensuring the authorized adult didn’t just hand the device to someone underage. 

There’s a lot we don’t know about how this system works. Is it just estimating age based on facial characteristics? There are plenty of people who look much younger than they are. Perhaps the system is matching facial scans with an existing database to verify identities? China is famously obsessed with facial recognition, which it uses to monitor the populace. These efforts are particularly blatant in areas like Xinjiang where the government has been accused of oppressing Muslim minorities. 

Regardless of how the system works, even Chinese gamers who have become accustomed to constant tracking are expressing unhappiness. Some don’t trust the company’s software and consider it a privacy violation. Others believe Tencent should not be stepping in for the parents who are the ones responsible for limiting game time. Judging by the tone of Tencent’s announcement, it won’t change course. The penalties for companies that fail to keep kids offline during curfew are steep, and Tencent is nothing if not a good corporate citizen in China.

Now read:

July 8th 2021, 4:11 pm

Microsoft Is Developing an AI Upscaling Engine to Improve Xbox Visuals

ExtremeTech

Microsoft is apparently planning its own AI upscaling method, possibly in the same vein as Nvidia's DLSS and AMD's FSR. The company hasn't made any direct announcements, but it has posted two job listings, one for a senior software engineer and the other for a principal software engineer for graphics.

The senior software engineer position notes that “The Xbox graphics team is seeking an engineer who will implement machine learning algorithms in graphics software to delight millions of gamers. Work closely with partners to develop software for future machine learning hardware.” The principal software engineer job description is more general, but the senior software engineer’s responsibilities list such tasks as “research and prototype machine learning algorithms” and “implement software that incorporates machine learning algorithms for graphics in shipping hardware.”

AI Upscaling Is the Future of Graphics

Microsoft could go a couple of ways with this. A robust AI upscaling method, folded into a future DirectX standard and made generally available to developers everywhere, would reach a larger audience than AMD's or Nvidia's solutions do. Alternately, Microsoft could be looking to develop a proprietary solution it can use in the Xbox Series X|S or future hardware.

But one thing I’m absolutely certain of is that AI upscaling is the future of graphics.

It’s not an accident that Nvidia introduced ray tracing and started working on DLSS at the same time. Ray tracing is difficult and compute-intensive whether you use dedicated hardware (Nvidia) or bulk up with more compute cores (AMD). GPU manufacturers are also up against a wall on power consumption.

This is where upscaling methods can shine. If AMD, Nvidia, or, apparently, Microsoft can master the art of making 1080p look like 4K at half the power consumption or less, the additional headroom can be devoted to ray tracing or compute.

Eight years ago, I covered the emerging field of imprecise computing. What the researchers found was that building deliberately imprecise processors could substantially reduce power consumption. AI upscaling is essentially an application of the same theory, namely: Rather than attempting to be perfect, focus on being good enough. Good enough, it turns out, can sometimes boost frame rates by 30 percent or more. That’s typically equivalent to upgrading to a new GPU in a higher price bracket than your current one.

Data and image by Igor’s Lab

AI upscaling that could convert 1080p into a nearly indistinguishable 4K, in real-time, ought to be far more power-efficient than actually doing the work of drawing it. Igor’s Lab has actually measured the impact of using DLSS as opposed to native 4K. With DLSS enabled, the RTX 2080 drew 80W less power. Typically you only see this kind of improvement when comparing the same GPU built on two different process nodes. Given these kinds of improvements, it would be surprising if Sony and Microsoft weren’t working on these ideas.

None of this is to say that rasterization performance won’t continue to improve. It absolutely will. But AI upscaling will also continue to evolve, both in software and hardware. The quality improvements in Topaz Video Enhance AI over the last 16 months have been significant and the application now runs much faster than it did.

The coolest thing about technologies such as AMD’s FSR and Nvidia’s DLSS is that they’re evolving as quickly as they are. AI is still in its infancy. I don’t think it’ll be more than a few years before the quality uplift I’ve given Deep Space Nine is available in real time to anyone watching the show. Currently, it takes hours to pre-process and then upscale a single episode. Five years from now, the same uplift — or better — may be available as a toggle switch on your TV or PC.

Now Read:

July 8th 2021, 10:26 am

Scientists: Methane in Enceladus Geysers Could Come From Alien Life

ExtremeTech

Saturn’s moon Enceladus doesn’t get as much attention as Europa, but it too has an internal liquid water ocean. It may also be more active than Europa with frequent geysers erupting from the surface. Years back, when NASA’s Cassini-Huygens probe explored the Saturnian system, scientists were fascinated to see how much methane was present in Enceladus’ geysers. At the time, it seemed feasible the methane was naturally occurring, but that’s less likely now that new research has ruled out all the known geochemical processes. That leaves biological sources—life—as a leading possibility. 

Saturn is more distant than Jupiter, and thus its moons are even more frigid. However, like Jupiter, Saturn has a lot of gravity. As its moons orbit, the tug of Saturn’s gravity is so intense that it stretches the moon’s crust. This flexing heats the interior, a process known as tidal heating. Add together geological activity and a liquid ocean, and you have a perfect recipe for hydrothermal vents. Cassini-Huygens detected several compounds in the geyser plumes (including methane) associated with hydrothermal vents on Earth. 
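For reference, the standard leading-order expression for tidal heating of a synchronously rotating moon on an eccentric orbit is below, with $M_p$ the planet’s mass, $n$ the moon’s mean motion, $R$ its radius, $e$ its eccentricity, $a$ its semi-major axis, and $k_2/Q$ describing how easily the moon deforms and dissipates:

```latex
\dot{E}_{\mathrm{tidal}} \;=\; \frac{21}{2}\,\frac{k_2}{Q}\,\frac{G M_p^2\, n\, R^5\, e^2}{a^6}
```

The steep $a^{-6}$ dependence is why only close-in moons like Enceladus get meaningfully heated this way.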

While it’s possible to produce methane through non-biological processes, it’s also a common byproduct of biological metabolism. That’s why NASA and other space agencies are so keen to track down the source of methane wherever it pops up. For example, Curiosity and the Trace Gas Orbiter have been gathering data on Mars’ methane spikes for several years. While there are no missions currently exploring Enceladus, researchers from the University of Arizona were able to develop a mathematical model to try and determine what’s likely to be happening inside the moon. 

Geysers erupting from Enceladus

First, the team had to confirm that the volume of dihydrogen in the plumes was sufficient to sustain a population of Earthlike hydrogenotrophic methanogens. That is: organisms that metabolize hydrogen and excrete methane. Next, they had to assess the potential impact of temperature and how that would affect the escape rate of methane and dihydrogen. In the end, the model says that no known geological process can explain the volume of methane coming out of Enceladus. 
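The metabolism in question is ordinary hydrogenotrophic methanogenesis:

```latex
\mathrm{CO_2} \;+\; 4\,\mathrm{H_2} \;\longrightarrow\; \mathrm{CH_4} \;+\; 2\,\mathrm{H_2O}
```

That four-to-one stoichiometry is why the team’s first step was a hydrogen budget: the plume’s H2 supply caps how much methane any microbial population could plausibly produce.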

That doesn’t mean there’s definitely alien life clustered around the hydrothermal vents of Enceladus. It’s also possible there is an abiotic process that we don’t know about that can produce enough methane to match what we see in reality. For example, a large pocket of methane-bearing materials could have been trapped in the moon’s crust as it formed. All we can say for sure is the chances of life on Enceladus are somewhat higher than we might have estimated a few years ago.

Now read:

July 8th 2021, 10:26 am

Intel Discontinues Lakefield, Its First x86 Hybrid CPU

ExtremeTech

When Intel launched Lakefield back in June 2020, it felt like the beginning of something new. ARM may have deployed big.LITTLE a decade ago, but Intel and AMD had previously stuck to multi-core CPUs built on a single CPU design. Lakefield combined one Ice Lake “big” CPU core with four Tremont “little” CPU cores. Lakefield targeted the low-power market, with a TDP of just 7W, and was intended to compete with ARM products featuring similar power consumption envelopes. Now, Intel has announced an early retirement for the CPU. Lakefield is entering end-of-life barely a year after launch.

Intel’s explanation for the cancellation is straightforward: “Market demand for the products listed in the ‘Products Affected/Intel Ordering Codes’ table below have shifted to other Intel products.” But Lakefield was unique, as far as Intel was concerned. This isn’t the first time Intel has struggled to compete in the low-power market.

Intel’s power/performance curves for Lakefield versus Ice Lake.

In the run-up to the Windows 8 launch, Intel expected its Clover Trail platform to quickly seize market share in what was supposed to be a rapidly expanding landscape for Windows 8-based tablets and convertible devices. The explicit thinking at the time was to leave the low-end market to ARM and establish Intel as the upper-tier vendor of choice for x86 tablets.

Part of the reason this plan failed is undoubtedly the poor reception Windows 8 received, but Clover Trail-powered devices were often priced similarly to mainstream, more powerful x86 laptops. Intel had an x86 tablet business for a little while, but it sustained its market share by shipping products contra revenue. It was a deliberate choice to lose money on shipments in exchange for building market share. When Intel moved away from this strategy, the company’s tablet share fell swiftly.

Lakefield may have suffered from some of the same problems. Intel and its OEM partners sold the chip into expensive systems priced between $800 and $1,200. That set the chip up to compete against high-end Intel and AMD systems with much larger batteries and far better performance. Compared with these systems, Lakefield can’t win. Preliminary data suggests Windows 11 performs better on hybrid CPUs, but that wasn’t enough to make Lakefield viable, apparently.

Alder Lake will be Intel’s second stab at the hybrid model, this time with up to eight “big” cores and small cores based on Gracemont. Gracemont is expected to field a 64KB L1 cache (up from 32KB on Tremont) with DDR5, PCIe 4.0, and AVX/AVX2 support.

Intel may have better luck positioning Alder Lake than Lakefield. Switching to little cores for idle power should save energy compared with conventional Intel CPUs, and the systems OEMs build are less likely to face off with hardware that dramatically outclasses them at the same price point. The fact that Lakefield is being canned after such a short time suggests very little uptake or interest in the chip. That doesn’t mean Alder Lake will run into similar problems, but it speaks to a consistent issue Intel has had winning market share with low-power chips. It’s not that Intel can’t build them. It’s that they don’t seem to wind up in the hardware people actually want to buy at competitive prices.

Now Read:

July 8th 2021, 10:26 am

This Password Manager and VPN Bundle is On Sale for Over 90 Percent Off

ExtremeTech

You know that most of the passwords you use are weak by any measure, and you know that you should invest in some way to keep them safe – but it’s something that you just never seem to get to. It’s always saved for later, when you have more time. But the longer you procrastinate instead of investing in a secure way to save, store, and input passwords, the more likely you are to be hit by a hack that can compromise your information, software, computing equipment, social media, or even bank accounts. In reality, it’s not that difficult to get something set up that will protect your information on your behalf.

A password manager is a secure way to start. The Lifetime Password Manager & Privacy Subscription Bundle is currently on sale for just $29.99, a handy discount of 92 percent off the full purchase price of $398. Keep your information and your family safe with this small and cost-effective collection of tech that will protect you while you’re connected to the internet.

The Lifetime Password Manager & Privacy Subscription Bundle contains two key security products. First is the lifetime subscription to Sticky Password Premium, an award-winning password manager that means you no longer need to remember the passwords to most sites and software – just remember how to log in to Sticky Password and everything will be there for you. Sticky can also help you create strong passwords that resist hacking, as well as enable you to securely share passwords with those who need access.

The second part of the bundle is a lifetime subscription to VPN Unlimited for five devices, allowing you to connect safely and securely to the internet with a hugely decreased risk of data loss or theft.

The Lifetime Password Manager & Privacy Subscription Bundle is now on sale for $29.99, offering a wonderful discount of $368.01 off the full purchase price of $398.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 7th 2021, 6:26 pm

ET Deals: $150 Off Apple MacBook Air M1 Chip, Lenovo Chromebook Flex 5 Intel Core i3 1080p Touchscreen

ExtremeTech

Apple’s MacBook Air laptops equipped with the company’s revolutionary M1 chip offer excellent performance for a wide range of tasks, and you can get one now with a $150 discount.

Apple MacBook Air M1 Chip 13.3-Inch Laptop w/ 8GB RAM and 512GB SSD ($1,099.99)

Apple’s latest MacBook Air comes equipped with Apple’s new M1 SoC, which contains an 8-core processor that’s reportedly 3.5 times faster than the hardware inside of the preceding model. Apple said the system can also last for up to 18 hours on a single charge, giving you all the power you need to work from sunrise to sunset. Now for a limited time you can get one of these systems from Amazon marked down from $1,249.00 to just $1,099.99.

Lenovo Chromebook Flex 5 Intel Core i3-10110U 13-Inch 1080p Touchscreen Chromebook w/ 4GB DDR4 RAM and 64GB eMMC Storage ($311.24)

Lenovo built this system as a high-end Chromebook with an Intel Core i3 processor and a 1080p touchscreen display. The notebook is exceedingly compact, measuring just 0.67 inches thick and weighing 2.97 pounds, and the screen can rotate 360 degrees, enabling you to use the system as a tablet. For a limited time you can get one of these systems from Amazon marked down from $429.99 to just $311.24.

Roku Express 4K+ ($29.49)

The Roku Express 4K+ is one of Roku’s newest streaming devices that’s capable of streaming 4K content. It also has support for HDR and an easy-to-use interface that makes finding what you want to watch quick and painless. Currently you can buy one from Amazon marked down from $39.99 to just $29.49.

Dell Vostro 3681 Intel Core i5-10400 Desktop w/ Intel UHD Graphics 630, 8GB DDR4 RAM and 256GB NVMe SSD ($549.00)

This compact desktop PC comes equipped with an Intel Core i5-10400 processor that gives it solid performance. In general it would work well as an office PC or home desktop. The system has a 256GB NVMe SSD as its main storage device, which allows it to boot quickly. It also has a DVD-ROM drive and four USB ports on the front to make connecting devices easy. Currently you can get this desktop from Dell marked down from $998.57 to just $549.00.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 7th 2021, 6:26 pm

Newegg Won’t Sell You Certain Components Unless You Pay for Its PC-Building Service

ExtremeTech

Newegg has launched a new high-end PC-building business called ENIAC. For an additional $99, the company will build and ship your system. The company claims to have identified a subset of customers who want to pay for this feature, and while I don’t ever see myself doing it, I can see why.

But Newegg isn’t just giving customers the option to buy PC components. It has specified that certain products labeled “Hot Items” will only be sold to buyers who take advantage of ENIAC.

It’s hard to read this policy any other way besides “Newegg wants you to pay it a lot more money.” Hot items are unlikely to be full PCs, so we’re talking about individual components, such as SSDs, RAM, CPUs, and GPUs.

Shakedown or Service?

Offering to build a customer’s PC before shipping it to them is a great option, provided systems are assembled well. Requiring customers to buy a bunch of components they do not need in order to buy a new graphics card is one of the worst examples of bundling we’ve ever seen. It’s annoying when retailers force gamers to buy extra controllers and games during a console launch, but being forced to buy a case, motherboard, RAM, CPU, and power supply just to buy a single component?

Not to put too fine a point on it, Newegg, but are you nuts?

There’s no way to interpret the above in a positive light. Newegg’s “hot item” policy specifies that it may restrict purchases of certain components when demand is high and force customers to buy full-blown PCs. There are no circumstances under which this is a reasonable requirement. People turn to the retail channel specifically because they don’t want to pay for an entirely new PC.

Hopefully, Newegg will offer some clarity on exactly how a product qualifies as a “hot item” and the impact this has on traditional store availability. This quote: “Hot items will not be the same throughout the day, this depends on the market,” seems to say you could be forced to buy a Newegg-assembled PC if the GPU you want is a “hot item” at 9 AM, but free to purchase just the card if you wait until 4 PM. If “hot item” availability varies by the hour, how is forcing customers to pay for a $99 concierge service ever going to work?

It is possible that the practical impact of this policy will be small, especially if Newegg is conservative about what it considers a “hot item.” But whether it kicks in on one model of GPU every other year or 50 products per day, this is unreasonable.

Newegg has demonstrated zero expertise in building and shipping systems in this fashion and its quality control is unimpressive. I once bought a refurbished TV direct from Newegg. It arrived with a damaged screen. Upon questioning, the representative I spoke to acknowledged that “refurbished” hardware sold directly by Newegg wasn’t actually inspected or tested, even when it had been previously returned as bad. The same TV I bought had been previously returned by a different person for exactly the same problem. For all I know, the company went right on selling that television.

Newegg wants to play OEM, but it doesn’t have the support infrastructure to do it. Despite charging $99 to assemble a PC, the company notes that it provides no additional warranty: “Each part used with your assembled PC is covered under its own manufacture warranty.” One reason some people like to buy from OEMs is that they want the safety of a specific warranty for the entire system rather than individual components. In this case, Newegg wants to earn OEM money without providing OEM services (or “services”, in some cases).

Requiring customers to buy a bunch of components they don’t want in order to buy the hardware they do want is literally the OEM model. It’s what DIY PC builders are trying to escape from. The practical impact of Newegg’s hot item policy will vary depending on how strictly the company applies the criteria, but the fact that it exists at all is reason enough to do your shopping elsewhere.

Now Read:

July 7th 2021, 4:10 pm

OnePlus Caught Manipulating Smartphone Performance Benchmarks

ExtremeTech

OnePlus has always prided itself on producing fast phones with the latest silicon. That’s how it made a name for itself: by packing more power into its phones for less money. The company’s current flagship phone, the OnePlus 9 Pro, is fast, but how fast is it, really? It’s a surprisingly difficult question to answer, based on an investigation by Anandtech. Despite having ample power, the OnePlus 9 Pro aggressively throttles performance for most apps while allowing benchmarks full power. As a result, its phones have been pulled from the Geekbench charts.

The OnePlus 9 Pro packs a Snapdragon 888 system-on-a-chip (SoC), which is a powerful processor with eight CPU cores. One of those cores, known as the prime core (a Cortex-X1 design), can run at almost 3GHz. That’s pretty amazing for a smartphone processor, and the 888 is very speedy as a result. However, Anandtech noticed some unevenness in its various tests of the phone. It now reports that OnePlus has a “blacklist” of applications that will cause the CPU to throttle the clock speed. Unsurprisingly, benchmarking apps are not on the list. 

This flips the script on most previous benchmark cheating scandals. Usually, the offending OEM has specifically boosted clock speeds on whitelisted apps, which included benchmarks. If the goal is to have metrics that describe what the device is capable of in daily use, this behavior is clearly deceptive. OnePlus went the other way, slowing down almost all popular apps like Twitter, Chrome, and Uber. The benchmarks are still stellar, though. 

Performance difference based on app package name, via Anandtech

As you can see in the graph above, an app that is not on the blacklist gets the full speed of the SoC. If the app masquerades as Twitter, the clock speed drops considerably. When it pretends to be Chrome, the clock drops even more. The workloads are often pushed off to the slower CPU cores, bypassing the X1 altogether. System performance should change based on conditions, not on which app you’re running.
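As an illustration of the mechanism Anandtech describes, a per-app governor keyed on package name might look something like the sketch below. The package identifiers are real Android package names, but the clock caps and lookup logic are hypothetical, not OnePlus’s actual implementation:

```python
# Hypothetical sketch of package-name-keyed throttling. The caps below are
# illustrative placeholders; OnePlus's real list and limits are not public.
BLACKLIST_MAX_KHZ = {
    "com.twitter.android": 1_804_800,  # shunted off the Cortex-X1 prime core
    "com.android.chrome": 1_420_800,   # throttled even harder
}
PRIME_CORE_MAX_KHZ = 2_841_600         # Snapdragon 888 prime core, full speed

def max_cpu_khz(package: str) -> int:
    """Return the clock ceiling applied to a given app package."""
    return BLACKLIST_MAX_KHZ.get(package, PRIME_CORE_MAX_KHZ)

print(max_cpu_khz("com.primatelabs.geekbench5"))  # benchmarks: full clocks
print(max_cpu_khz("com.twitter.android"))         # popular apps: capped
```

A lookup like this is also why simply renaming a test app to match a blacklisted package is enough to expose the behavior.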

Anandtech speculates that OnePlus implemented this ham-handed throttling regime to boost battery life, an area where the OnePlus 9 Pro really struggles. It’s not even clear this will help, though. The prime core is intended to respond to transient, high-performance workloads to make the system more efficient. However, it can’t do that here.

After catching wind of this, Geekbench has opted to remove the OnePlus 9 Pro from its popular benchmark charts. The developers have pledged to investigate and see if other OnePlus phones are doing similarly shady things. If this proves to be more than an isolated incident, OnePlus could be in trouble.

Now read:

July 7th 2021, 2:56 pm

Asrock Unveils New List of Windows 11-Compatible Motherboards

ExtremeTech

Asrock has published a list of motherboards it believes might be compatible with Windows 11 when that OS launches later this year. Unfortunately, we’ve got to frame the list that way because the manufacturer itself isn’t sure what’s going to happen with every product. The list, however, may shed some light on how Microsoft’s thinking is evolving.

First, here’s the expected compatibility as Asrock sees it. Intel PTT and AMD fTPM support are analogous for the purpose of this discussion:

We’d normally break Threadripper out into its own support column, but Asrock elected to include it here with the 300-series chipsets. This news is pretty decent, if true. It implies all Ryzen-era chipsets are compatible with Windows 11. The “actual level of support [will be] based on [an] official release of Windows 11 by Microsoft,” however, points to how uncertain this information still is. What Asrock is saying here, between the lines, is “We can support it if Microsoft allows it.”

Microsoft has not updated its CPU compatibility lists since the 24th of June, so we don’t have any news to report on that front. Asrock’s new notice does include some basic information on how to find the TPM / PTT settings on your motherboard.

Intel and AMD label things slightly differently. Even if you don’t have an Asrock board, these images may give you some idea of what to look for.
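If you’d rather check TPM status from inside Windows before rebooting into the firmware, PowerShell’s Get-Tpm cmdlet will report it. Here’s a minimal wrapper, assuming an elevated prompt on Windows 10 or later; an fTPM/PTT that’s disabled in firmware typically shows up as not present:

```python
# Query TPM presence and readiness via PowerShell's Get-Tpm cmdlet.
# Requires an elevated (administrator) prompt on Windows.
import subprocess

result = subprocess.run(
    ["powershell", "-Command", "Get-Tpm | Select-Object TpmPresent, TpmReady"],
    capture_output=True, text=True, check=False,
)
print(result.stdout or result.stderr)
```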

Right now, the big question is which CPUs and motherboards will or will not be allowed to upgrade to Windows 11. We’re still waiting to find out the verdict on certain systems based on 7th Gen Intel CPUs or first-gen Ryzen. This is where support could still get a little wonky. If Intel decides to allow 7th Gen CPUs on Z270, does that mean 6th Gen will be locked out? Or will the company simply stick with 8th Gen and the 300-series of motherboards?

Right now, we recommend waiting and seeing as far as Windows 11 is concerned. Hardware prices are still high enough that it’s best to hold off on building a new system anyway, and Microsoft will have more, and hopefully final, guidance in the weeks to come. As for which motherboards today are compatible with Windows 11, that’s easy: Any system you build around an 8th to 11th Gen Core CPU should be fine. AMD’s Zen 2 and Zen 3 architectures also appear to be fine. One caveat: Anyone intending to drop the OS on a small SSD should be wary of the increased storage requirements. Windows 10 called for a minimum of 20GB of storage for the 64-bit version, but Windows 11 moves up to 64GB.

Asrock is not the only company that has published guides like this. Asus, Biostar, Gigabyte, and MSI have all published guides to various products of their own.

Now Read:

July 7th 2021, 10:41 am

Bitcoin and Ethereum Both Show Signs of Cooling Off

ExtremeTech

The evidence that demand for cryptocurrency is finally cooling off isn’t just coming from reports on GPU MSRPs. There’s direct evidence from recent activities on the Bitcoin and Ethereum mining networks as well. China’s recent crackdowns have hit both networks hard and the total amount of mining happening globally is dropping as a result.

Recent data from Etherscan, via PCGamer, shows how steep the drop-off has been since China announced a ban on payment companies providing services related to cryptocurrency transactions. The initial ramp over the past year isn’t surprising given how hard it has been to buy a graphics card. The sudden fall-off isn’t exactly unprecedented — we’ve seen crypto bubbles collapse before — but it definitely recalls the way such surges tend to end.

The news from the BTC side of things is also interesting. According to PCG, Bitcoin difficulty has been adjusted based on the decrease in mining horsepower. It’s now 28 percent easier to mine a block than it was before. Direct Bitcoin mining is almost entirely handled by ASICs now instead of GPUs, but that’s actually part of what makes this telling. The entire market is cooling off, and used GPUs are said to be appearing at prices near MSRP. Overall, the GPU market appears to be moving back towards something like normalcy.
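For the curious, Bitcoin’s retargeting rule is roughly proportional: every 2016 blocks, difficulty scales by how quickly that window actually arrived versus the 10-minute-per-block goal. The sketch below is simplified; the real client clamps the measured timespan and operates on a 256-bit target rather than a floating-point difficulty value:

```python
# Simplified Bitcoin difficulty retarget.
EXPECTED = 2016 * 10 * 60  # expected seconds per 2016-block window (two weeks)

def retarget(difficulty: float, actual_seconds: int) -> float:
    # Real nodes also clamp the timespan to [EXPECTED/4, EXPECTED*4].
    clamped = min(max(actual_seconds, EXPECTED // 4), EXPECTED * 4)
    return difficulty * EXPECTED / clamped

# Hashpower leaving the network stretches block times; a window arriving
# ~39 percent slow produces roughly the 28 percent drop described above.
print(retarget(100.0, int(EXPECTED * 1.39)))  # ~71.9
```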

China appears to be quite serious about cracking down on firms that engage in any crypto-related activity. Reuters reports that the Beijing office of China’s central bank, the People’s Bank of China, ordered Beijing Qudao Cultural Development Co Ltd to suspend operations as a result of suspicions that the company had engaged in cryptocurrency trading. BTCChina, which held the distinction of being China’s first Bitcoin exchange, has since ceased all cryptocurrency-related operations. 75 percent of all BTC mining reportedly takes place in China, so the nation’s decision to crack down is definitely having an impact on the wider market.

The big benefit for consumers, of course, will be the increased availability of GPUs. It may still take 6-12 months for backlogs to clear, however. Most of the PC gamers who would have bought Ampere cards have been unable to do so thanks to bad pricing. We’re probably looking at several months of elevated gaming-related demand once prices fall low enough to trigger it.

Personally, I can’t wait. I’ve never had an intrinsic problem with cryptocurrencies, but the way they’ve warped the PC gaming market for the past five years is anything but great. As of July 20, the GPU market will have been overheated for half the time (~31 out of 62 months) since Pascal launched. One doesn’t have to blame any specific company to greatly dislike the situation, and the high price of GPU upgrades has been a deterrent to anyone hoping to upgrade during the pandemic. Lower prices can’t come quickly enough.

Now Read:

July 7th 2021, 9:39 am

Ingenuity Mars Helicopter Completes Ninth and Toughest Flight

ExtremeTech

The Ingenuity helicopter is purpose-built to make history, so it should come as no surprise that the plucky little drone has just completed its ninth flight on Mars, and this was the most impressive one yet. Ingenuity flew higher, faster, and farther than it has before, and it’s still raring to go for another trip. 

The Ingenuity team released some details of what they intended for the ninth flight back on July 2. The flight took place on July 5th. While we don’t have full details on the helicopter’s activities, we know JPL sees the flight as yet another smashing success. We also have yet another snapshot of the drone’s shadow on the surface. Ingenuity only has downward-facing cameras because it was intended purely as a technology demonstration.

According to a tweet posted yesterday after the operation ended, Ingenuity spent 166.4 seconds in the air, and it reached a speed of 5 m/s. That might not sound very fast for a helicopter, but this is a helicopter on another planet. Rovers like Perseverance and its older sibling Curiosity can only crawl along a few meters at a time because operators here on Earth need to make sure they don’t roll over something dangerous. Ingenuity has the power to get where it’s going fast.

NASA announced several weeks back that it was extending Ingenuity’s technology demonstration mission, which was only supposed to last a month. The new “operations demonstration” phase has seen the helicopter begin making one-way flights, covering more distance, and moving more quickly. During a previous flight, a computer glitch caused some wobbling and a temporary loss of location tracking, but the robot’s generous error margins helped it set down safely. During this flight, Ingenuity passed over a sandy area known as Séítah. This was a challenge for the robot, as its navigation algorithms were designed for flat, rocky terrain, not rolling dunes. This time, it didn’t suffer any such issues.

As successful as Ingenuity is, it probably won’t live much longer on the red planet. NASA designed the helicopter to be light and efficient with the help of off-the-shelf components, such as a Qualcomm Snapdragon 801 smartphone processor. However, these parts are not hardened against the harsh conditions on Mars. It’s unlikely the tiny solar-powered helicopter will survive the upcoming Martian winter. However, it’s already made history, and its descendants will no doubt do even more. 

Now read:

July 7th 2021, 8:42 am

Stephen Hawking’s Black Hole Theorem Confirmed by Gravitational Waves

ExtremeTech

Black holes can be perplexing and counterintuitive because they exist at the very edge of our current understanding of physics. Even as they’re warping the fabric of the universe, black holes have to obey certain rules. For the first time, we have direct confirmation of a vitally important property of these dead stars. Researchers from MIT and other institutions have confirmed Stephen Hawking’s black hole area theorem 50 years after the late scientist proposed it. 

In 1971, the same year that astronomers spotted the first likely black hole in Cygnus X-1, Stephen Hawking devised his area theorem. According to the theorem, the area of a black hole’s event horizon can never get smaller. Hawking was able to show this worked mathematically, but many were skeptical of the apparent parallels with the second law of thermodynamics, which says entropy (or disorder) within an object will never decrease. 

The area theorem could therefore mean black holes can emit thermal energy, which is very much the opposite of what you’d expect for a gaping gravitational maw that consumes anything that ventures too close. It wasn’t until a few years later that Hawking brought the two ideas together when he proposed Hawking radiation, which says that black holes could emit tiny amounts of radiation as a consequence of their quantum properties. This could have major implications in physics, and it all flowed from the idea that event horizons can’t shrink, but we didn’t have any experimental confirmation until now.
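In symbols, the parallel that troubled the skeptics is exact: Hawking’s theorem says the horizon area can never decrease, just as the second law says entropy can’t, and the Bekenstein-Hawking formula later tied the two quantities together directly:

```latex
\delta A \;\ge\; 0
\qquad\text{and}\qquad
S_{\mathrm{BH}} \;=\; \frac{k_B\, c^3\, A}{4\, G\, \hbar}
```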

The Laser Interferometer Gravitational-wave Observatory (LIGO) began scanning the heavens for signs of gravitational waves in 2015. The first signal it spotted was GW150914, a black hole that used to be two black holes. When LIGO came online, the team didn’t have the ability to tease out event horizon measurements, but that has changed over the last several years. 

The M87 supermassive black hole imaged in 2019. GW150914 is much smaller, but this is the only real picture of a black hole we have.

The researchers, led by MIT fellow Maximiliano Isi, revisited that first signal to see if they could prove the area theorem. If Hawking was right, the black hole that was left after the collision should have an event horizon that is at least as big as the total area of both precursor black holes. 

Using new techniques, Isi’s team was able to pick out frequencies from the immediate aftermath of the collision that they could use to calculate the final object’s mass and spin. A black hole’s mass and spin are directly related to the event horizon area, so the team was able to measure GW150914 before, during, and after the peak in gravitational waves. Before the merger, the two black holes had a combined event horizon area of 235,000 square kilometers. After the collision, the new, larger black hole had an event horizon area of 367,000 square kilometers.
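You can roughly reproduce those figures from the published GW150914 estimates (component masses of about 36 and 29 solar masses, and a roughly 62-solar-mass remnant spinning at χ ≈ 0.67) with the Kerr horizon-area formula. Treating the precursors as non-spinning is a simplification, but it lands close to the quoted numbers:

```python
import math

G, C, M_SUN = 6.674e-11, 2.998e8, 1.989e30  # SI units

def horizon_area_km2(mass_solar: float, chi: float = 0.0) -> float:
    """Kerr horizon area: A = 8*pi*(G*M/c^2)^2 * (1 + sqrt(1 - chi^2))."""
    r_g = G * mass_solar * M_SUN / C**2 / 1000.0  # gravitational radius in km
    return 8 * math.pi * r_g**2 * (1.0 + math.sqrt(1.0 - chi**2))

before = horizon_area_km2(36) + horizon_area_km2(29)  # ~234,000 km^2
after = horizon_area_km2(62, chi=0.67)                # ~367,000 km^2
print(f"{before:,.0f} km^2 -> {after:,.0f} km^2; grew: {after >= before}")
```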

With a larger event horizon, GW150914 confirms the Hawking area theorem. It’s an impressive extrapolation from gravitational-wave data, and LIGO is only our first attempt at analyzing this phenomenon.

Now read:

July 6th 2021, 7:26 pm

GPU Prices May Finally Be Headed in the Right Direction

ExtremeTech

Nine months after the latest cryptocurrency boom kicked off in earnest, GPU prices may finally be headed in the right direction. A new report from 3DCenter charts GPU prices in Germany and Austria over the past few months and finds evidence that new cards are now selling for much less above MSRP than they had previously. Availability has also improved in those two markets.

Image by 3DCenter.de

We aren’t seeing any signs of price drops in the US yet, so you don’t need to start hitting refresh on Newegg. But the decline in elevated prices is clearly significant. There are a few reasons why it’s happening now. First, Nvidia’s mining cards have reportedly sold fairly well, earning $155 million from February to May, likely absorbing some amount of pressure that would have otherwise fallen on the GPU market. Second, the cryptocurrency market has slumped following new waves of regulations in China.

As of last year, 65 percent of all BTC mining was thought to occur in China, which gives some perspective on just how large the Chinese cryptocurrency market is. The market is now being flooded with cheap, used cards. This will put downward pressure on prices for new cards, but caveat emptor is very much in effect.

Mining on consumer GPUs can shorten their lifespan. Consumer cards aren’t tested or expected to run 24/7, and the mining rigs employed by a lot of people don’t offer high-end water-cooling or any equivalent that could help offset the potential for long-term heat damage. Buying a used consumer card that’s been used for gaming is normally pretty safe, so long as you can trust the seller not to have overclocked or mistreated the card. Buying a card from a mining rig is not as safe, and we don’t recommend it.

If you are thinking about buying a GPU despite the current high prices, our advice is to hold off if you possibly can. If prices are starting to drop around the world, we should see similar decreases in the United States. We can’t give you an exact time frame, obviously, but 1-4 weeks seems reasonable.

If GPU prices do drop, it’ll go a long way toward restoring some sanity in the DIY PC market. At the beginning of the year, it was also hard to find AMD and sometimes even Intel CPUs for MSRP, though Intel chips were never as scarce as their AMD equivalents. Prices and availability on the CPU side of things have improved in the last seven months, and you can now find most other hardware at reasonable prices; the GPU remains the exception.

Just because GPU prices are moving back towards MSRP does not automatically mean they’ll reach it any time soon. It could take a year or more for prices to recover. Much depends on what plans Nvidia and AMD have laid for GPU manufacturing in the next six months and how demand evolves across the globe.

Now Read:

July 6th 2021, 7:26 pm

ET Deals: Over $400 off Dell Alienware M15 R3 Nvidia RTX 2070 4K OLED Gaming Laptop, Apple iPad Mini

ExtremeTech

Today you can save over $400 on a new Dell Alienware M15 R3 that ships with a 4K OLED display and has all the hardware you need to run games at blistering speed with high-quality graphics settings.

Dell Alienware M15 R3 Intel Core i7-10750H 15.6-Inch 4K OLED Gaming Laptop w/ Nvidia GeForce RTX 2070, 16GB DDR4 RAM and 2x256GB M.2 PCI-E SSDs in RAID-0 ($1,869.99)

If you want a fast notebook with plenty of performance for running the latest games, you may want to consider Dell’s Alienware M15 R3. This system was literally built for gaming and it features a fast six-core processor, an Nvidia GeForce RTX 2070 GPU, and a high-quality 4K OLED display that has excellent color accuracy and Tobii eye-tracking technology built-in. The system also has two 256GB NVMe SSDs in RAID-0, which gives you plenty of space for everyday use and faster transfer speeds. You can get this system from Dell marked down from $2,279.99 to just $1,869.99.

Apple iPad Mini 64GB 7.9-Inch Wi-Fi Tablet ($344.99)

If you’re looking for a small tablet for travel, Apple’s compact iPad Mini may be the right choice for you. This tablet measures just 7.9 inches diagonally, and it has plenty of performance for gaming and multitasking courtesy of the A12 Bionic processor embedded inside. Usually priced at $399.00, you can get this tablet now for $344.99 from Amazon. Final price is shown at checkout.

Dell S2721QS 27-Inch 4K IPS Monitor ($299.99)

Dell’s S2721QS is a 27-inch monitor that sports a 4K IPS panel with HDR and FreeSync support. The monitor can also be used for detailed video editing work as it covers 99 percent of the sRGB color gamut, and it also has a built-in pair of 3W speakers. Currently Dell is selling these monitors marked down from $489.99 to $299.99.

Western Digital My Passport 1TB External SSD ($147.49)

This external drive utilizes SSD technology to enable relatively fast data transfer speeds of up to 1,050MB/s over a USB 3.2 Gen 2 connection. It can also hold up to 1TB of data and it has built-in security features including the ability to set a password on the drive and 256-bit AES hardware encryption. The drive also has a durable metal enclosure as well as a small physical footprint and can be picked up from Amazon marked down from $199.99 to just $147.49.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 6th 2021, 7:26 pm

Nintendo Announces Updated Switch With OLED Screen

ExtremeTech

Rumors of a new, high-end Nintendo Switch model have been swirling almost since the moment the original was released in 2017. In 2019, we got the Switch Lite, but the “Pro” redesign was nowhere to be seen. Nintendo has finally taken the wraps off the improved Switch, and it’s a bit less “pro” than we expected. The updated Switch has a better dock, improved audio, and a bigger OLED screen to replace the LCD of the original. It’s mostly the same on the inside, though.

If you were hoping for 4K Nintendo gaming, you’re going to have to wait for another console generation. All the rumors and speculation about AI upscaling were for naught, as the Switch OLED uses the same Tegra X1 chip with support for 720p in handheld mode and 1080p in docked mode. The console doesn’t even have a new name—it’s still just the Nintendo Switch, but Nintendo tacked on “OLED model” in parentheses to keep things straight. It does come with 64GB of internal storage, though, double the 32GB in the original.

The original Switch had a 6.2-inch 720p LCD screen, and the new one steps up to a 7-inch 720p OLED panel. The increase in size without a bump in resolution could make the display look less sharp, especially if the OLED uses a PenTile subpixel arrangement like many smartphones do. The overall design of the device hasn’t changed much. It still has the removable Joy-Cons, and they look unchanged. That means you should be able to use the ones you already have interchangeably with the new console. It would be nice if Nintendo fixed the drifting issues in the controllers that come with the new console, though. Please?
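The pixel-density arithmetic behind that concern is simple enough to check yourself:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a rectangular panel."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 720, 6.2)))  # original Switch LCD: ~237 ppi
print(round(ppi(1280, 720, 7.0)))  # Switch OLED: ~210 ppi
```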

Using the OLED Switch in tabletop mode will be much less aggravating. Not only do you get better speakers on the device, but the kickstand has also evolved from a tiny piece of breakable plastic to a 2-in-1-laptop-style hinge that spans the entire back. The full-width kickstand is adjustable so you can find the most comfortable angle to play. The last notable change is not in the Switch itself but in the dock. In addition to the USB ports and HDMI, the new dock has an Ethernet port for wired connectivity. 

If you can’t wait to upgrade your Switch experience, the new console can be yours on October 8th of this year. It will retail for $350, and if recent history has taught us anything, it will probably be hard to get one for MSRP.

Now read:

July 6th 2021, 7:26 pm

Server Supplies Tighten Thanks to Silicon Shortages, Tight CPU Supply

ExtremeTech

When Intel hit a rough patch a few years back and couldn’t supply enough CPUs to meet demand, the company chose to take a market share hit in the low end of the market to protect its data center business as much as possible. When AMD was capacity constrained earlier this year, the company announced it would do the same thing. It makes sense for both companies to prioritize the shipments of their highest-end, most valuable hardware.

Unfortunately, it seems such prioritization is no longer sufficient to meet demand. According to DigiTimes via THG, lead times are worsening and have risen from 52 to as high as 70 weeks. That’s very high, even given the long production cycles required to build chips. CPU supplies from AMD and Intel are also said to be quite tight, but it’s not clear if the bottleneck is in CPUs, low-level server ICs, or both. There are references to both issues, but the low-level server IC problem is getting most of the attention. Other companies such as Mitac are supposedly unable to fill 20-30 percent of orders due to a shortage in server ICs.

Image via Seeking Alpha

The semiconductor shortage continues to ricochet across the industry. Auto sales fell badly in June 2021 compared with June 2019, down 14.2 percent. None of this has prevented auto manufacturers from continuing to turn a profit — like AMD and Intel, auto companies are prioritizing the highest-end vehicles and components they have. Buyers who need a car (or a computer) are willing to pay top dollar.

One underreported facet of the semiconductor shortage is how much money all of the companies involved are making. Nvidia, AMD, Intel, and Apple have brought in a great deal of revenue through these shortages and will continue to do so. Nothing wrong with manufacturing an in-demand product, and the chip manufacturers aren’t the ones making money on GPU shortages, but all of the companies in question are doing very well. This applies to car manufacturers as well, at least to some extent.

Overall, the pandemic has been a brutal lesson in the perils of just-in-time manufacturing. Toyota’s sales have grown substantially in part because it requires suppliers to maintain chips and other components in reserve and pays them to do so. A recent Time story on the semiconductor shortage mostly retreads ground ET readers will find familiar, but it includes some interesting information on how GlobalFoundries has changed its own manufacturing plans to prioritize auto manufacturers during the shortage. There’s been a great deal of behind-the-scenes negotiating and order-shifting already as companies have tried to adjust to the unusual situation.

Tight server supplies could blunt AMD’s progress in taking market share in that space. In Q1, AMD traded market share in desktops and laptops for servers and did quite well in that market. We won’t have Q2 data for a little while yet, but it’ll be interesting to see if the launch of Milan (AMD) and Ice Lake SP (Intel) shifted market share between the two companies.

Top image credit: TSMC

Now Read:

July 6th 2021, 7:26 pm

These 25 VPNs, Apps, and Software Are All On Sale Right Now

ExtremeTech

Using essential software is a fact of life and work, but it’s not always easy to shell out full price for everything you need. These 25 offers on apps and software can help you stay up to date while saving a great deal of money.

Surfshark VPN: 3-Yr Subscription, on sale for $67.20 when you use coupon SUMMER20

Keep your family and your data safe online with this popular virtual private network (VPN) that can keep your information secure and mask your location. Two-year subscriptions to Surfshark VPN are also available for $45.60 when you use the coupon SUMMER20 at checkout.

Tweet Ninja Twitter Automation Solo Plan: Lifetime Subscription, on sale for $39.20 when you use coupon SUMMER20

Automate your Twitter presence with this useful tool that makes the social app much more hands-off. It can retweet or follow relevant accounts for you, ensuring your account is always growing and never idle.

Scopio Authentic Stock Photography: Standard Lifetime Subscription, on sale for $29 (99 percent off)

If you make any kind of creative project, you’ll know how difficult it is to source striking, original imagery. Scopio is a royalty-free image library containing more than 400,000 images for your personal or commercial use.

Mashvisor: Lifetime Subscription, on sale for $39.99 (97 percent off)

Mashvisor analyzes real estate data from across the country to help you make more informed investment decisions. Find investment properties and optimize their performance with this innovative software.

Beelinguapp Language Learning App: Lifetime Subscription, on sale for $39.99 (60 percent off)

Learn and practice 14 included languages using audiobooks, and refine your listening and language skills. Simply sign in with your Google or Facebook account and you’ll be able to read along to books as a native speaker narrates.

LingvaNex Translator: Lifetime Subscription (Desktop and Mobile Bundle), on sale for $79.99 (80 percent off)

This easy translation app works on text, voice, images, websites, and documents, and features more than 112 languages. It works on Windows, macOS, iOS, and Android, and also has a phrasebook and memory cards feature to help you learn and practice your language(s) of choice.

BetterMe Home Workout & Diet: Lifetime Subscription, on sale for $39.99 (96 percent off)

Access enough workouts for a lifetime with BetterMe, featuring targeted workout and meal plans to help you achieve your body goals. It also includes a built-in community with daily articles, FAQs, and a step counter to keep you on track.

Degoo Premium: Lifetime 10TB Backup Plan, on sale for $89.40 when you use coupon SUMMER40

Keep all your files easily backed up and never worry about losing data. This subscription allows you to back up 10TB of data under ultra-secure encryption – more space than Dropbox, OneDrive, and Google Drive combined.

Hushed Private Phone Line: Lifetime Subscription, on sale for $16 when you use coupon SUMMER20

Keep your real phone number hidden while you make calls or texts with this app and phone plan. It includes a second phone number (you can choose the area code) and 6,000 texts or 1,000 phone minutes per year.

iMazing iOS Device Manager, on sale for $16 when you use coupon SUMMER20

More easily manage the data stored on your Apple devices, including iPhone, iPad, and iPod, and copy data between them and your computer. The iMazing iOS Device Manager is also available for three devices at $20 or five devices at $24 when you use coupon SUMMER20 on either subscription.

CuriosityStream HD Plan: Lifetime Subscription, on sale for $159 (36 percent off)

Get access to thousands of the best documentaries from around the world with this unique streaming service. See productions by respected experts like David Attenborough, Michio Kaku, and Brian Greene, many of which are in HD and 4K.

MyDraw Advanced Diagramming Software: Lifetime License, on sale for $19.99 (71 percent off)

Create attractive and attention-grabbing charts, graphs, diagrams, models, floor plans, maps, and more with this innovative program. It can connect to Excel, and it includes a range of templates to help you get used to the program.

RoboKiller Spam Call & Text Blocker: 1-Year Subscription, on sale for $29.99 (24 percent off)

No one likes spam calls, so get RoboKiller to block them for you. It can prevent 99 percent of them from getting through to you and also features a block/allow list and even SMS spam protection.

Camtasia 2021 + One Year of Maintenance, on sale for $199 (33 percent off)

More than 34 million users around the world create videos with Camtasia. With this popular software you can screen record, work with included video templates, add effects and transitions, and even choose from included royalty-free audio and sound effects.

Lightkey Pro Text Prediction Software: Lifetime Subscription, on sale for $49.99 (70 percent off)

Type quicker and more accurately with this AI-powered text prediction software. It can also correct your spelling and grammar for more professional writing output.

EasySplitter Pro Vocal Remover: Lifetime Subscription, on sale for $39.99 (93 percent off)

Use this innovative technology to upload tracks and break them down into their parts. It will allow you to remove vocals to create easy backing tracks or even split off instruments, bass, and percussion.

Matt’s Flights Premium Plan: 1-Yr Subscription, on sale for $29.99 (69 percent off)

Now that we can contemplate travel again, it’s time to book some flights. With Matt’s help you’ll be able to find the best deals, which means more money for travel.

Setapp: 1-Year Subscription, on sale for $55.20 when you use coupon SUMMER20

Streamline your productivity with access to more than 210 curated apps especially for Mac. The library contains lifestyle, productivity, task-management, creativity, finance, and developer-tool apps, and more.

Blinkist Premium: 2-Year Subscription, on sale for $80 when you use coupon BLINK20

Blinkist breaks down complex books into digestible 15-minute (or less) summaries. Fit reading into your day again with interesting and compelling audio and text explainers.

The Bestselling ProWritingAid Lifetime Subscription Bundle, on sale for $160 when you use coupon SUMMER20

Improve your writing style and quality with ProWritingAid, a multi-function writing aid that includes a grammar checker, style editor, and writing mentor. In this bundle, you’ll also get access to ProWritingAid Academy for self-paced video courses that will also help you improve.

The World Traveler Bundle ft. Rosetta Stone Lifetime Subscription, on sale for $159.20 when you use coupon TRAVEL20

This bundle includes three great services that’ll help you travel further, cheaper, and better. The bundle includes a lifetime subscription to Rosetta Stone for all your language needs, a three-year subscription to Matt’s Flights for discount airline deals, and the “Travel Hacker Bundle,” which can teach you to save money, become a digital nomad, and travel for less.

Tello Economy Prepaid 12-Month Plan: Unlimited Talk/Text + 1GB LTE Data + Free SIM, on sale for $79 (34 percent off)

Get a 12-month prepaid plan with Tello for a reduced rate, no contract, and no extra fees. You’ll get unlimited talk and text plus 1GB of LTE data per month, for the next 12 months.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 6th 2021, 7:26 pm

Why Is Microsoft Launching Windows 11 Now?

ExtremeTech

An interesting facet of Windows 11 that we haven’t touched on yet at ExtremeTech is that from the consumer perspective, this isn’t the best time for Microsoft to launch a new OS. The relationship between Microsoft OS launches and upswings in consumer hardware adoption appears to be small, especially in the past decade. But we’re currently in the middle of an unprecedented silicon shortage. Any additional demand sparked by Windows 11 will stress the market more than an equivalent uptick during more normal times.

From Microsoft’s perspective, this may not be a problem at all. It may actually be the reason the company is launching the OS when it is in the first place.

The Windows Migration Problem

Microsoft has struggled to move users off of older versions of Windows for at least the past 15 years. In the beginning, the rapid pace of improvement in hardware and software made upgrading from previous products a pretty easy sell. Whether you liked Windows or hated it, there was no arguing that Windows 95 was radically different than Windows 3.1. When people upgraded to XP off Win 9x, they often did so to improve basic functionality and system stability.

Today, people keep computers for far longer than they once did and are therefore less likely to acquire a new OS upon system purchase. Microsoft’s first solution to this, deployed with Windows 10, was to make upgrading to the new OS free and easy for pretty much anyone, even users with old machines, and to relentlessly nag, gaslight, and needle anyone who didn’t jump on board. Objectively, the strategy worked pretty well, even if we detested the nagware side of it. Six years after launch, the vast majority of the Windows world is running on Windows 10.

But the past six years have also seen a considerable evolution in security threats. Ransomware is now a major problem across many different industries. We’ve seen the rise of more sophisticated malware campaigns and better infiltration software. Windows 11’s security requirements go beyond TPM 2.0 — Microsoft is still deciding exactly what they are — but the company is very serious about requiring stronger security standards at the hardware level.

Of Processors and Pandemics

In the past, Microsoft could depend on the x86 manufacturers to introduce faster CPUs on a regular cadence. It’s difficult to explain (or remember) just how fast this cadence actually was. In late 1995 and early 1996, the fastest CPU you could buy was either a Pentium or Pentium Pro 166. Six years later, Intel was knocking on the door of 2GHz with the Northwood P4. While it’s true that even the Northwood P4 was less efficient than the P3, it would still have been more efficient than the original Pentium. Knock 400MHz off the comparison to be rude, and that’s still an 8.43x clock improvement in a bit over six years, not counting innovations like a full-speed on-die cache or the then-ongoing adoption of performance-boosting SIMD instructions via SSE2.
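Put in compound-growth terms, using the same deliberately rude numbers:

```python
# ~166MHz to a 1.4GHz-equivalent in about six years, compounded annually:
ratio, years = 1400 / 166, 6
print(round(ratio ** (1 / years), 2))  # ~1.43, i.e. ~43 percent more clock/year
```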

Improvements come much more slowly now, and PCs now live much longer. The wear-and-tear laptops inevitably suffer ensures they’ll always be replaced more frequently than desktops, but PC desktop replacement cycles have moved from 2-3 years to five years or more. I once heard a company rep confidently refer to “the four-year PC replacement cycle.” Twelve months later, when being briefed by the same company on its new products, the representative mentioned “the five-year PC replacement cycle.”

The story of the PC market from 2010-2020 is almost entirely a negative one, as far as hardware sales are concerned. Ultrabooks may have helped raise PC average selling prices, and they definitely offer a more up-market PC experience than was generally available 10 years ago, but PC sales declined year on year for most of a decade, from a high of 365 million units in 2011 to just 263 million units in 2019. In 2020, thanks to the pandemic, PC sales grew to 275 million units without Chromebooks, or approximately 302 million units if Chromebooks are included. But either way, sales of Windows PCs grew for the first time in years.

PC shipments, according to Canalys. These figures appear to include Chromebooks.

Right now, the PC market is expected to remain strong through at least the end of the year and possibly into 2022. Any Windows upgrade cycle that Microsoft launches now risks exacerbating demand issues. But the fact that people are upgrading now also represents an opportunity. The computers people are upgrading to, generally speaking, support more advanced security standards than machines bought from 2010-2016.

If Microsoft wants to push the market forward and adopt new security standards, kicking a new version of Windows out the door is probably the best way to do that. In doing so, Microsoft is returning to an older tactic. It could have done what Apple does and increment OS versions without shipping an entirely new product, but the Windows developer has historically used new versions to mark major changes in hardware support.

Launching Windows 11 in 2021 allows Microsoft to take advantage of the fact that PC buyers are upgrading from old machines at a higher rate. It gives the company the best chance of baking in certain security features as an expected baseline on the widest range of PCs. In its messaging, Microsoft has emphasized that Windows 10 will be supported through 2025, and we suspect that’s partly to take the sting out of this shift.

Image by Microsoft

Our guess is that Microsoft hopes to use the unexpected surge in PC demand to drive a quicker shift towards better security standards than it could otherwise have achieved, and is launching Windows 11 relatively soon for that reason. This also dovetails with Microsoft’s emphasis on improving security in hardware, through avenues like its Pluton security processor.

For consumers, Windows 11 doesn’t seem to pack much in the “killer feature” department, though the new icons are nice and the improvement to hybrid computing could also be useful. From Microsoft’s perspective, however, there may not be a better time. The PC market is currently enjoying its first bona fide demand boom in a decade, and it took a global pandemic to deliver it. It’s a bad idea to bet on that kind of lightning striking twice.

The answer to the question we raise in the title, we suspect, is: “Because it doesn’t think it has much choice.” If you want to drive new security standards into the market, taking advantage of a major PC buying boom is the best way to do that.

Now Read:

July 6th 2021, 7:26 pm

ET Independence Day Deals: Up to 47 Percent Off Dell Alienware Gaming PCs and Monitors

ExtremeTech

Celebrate this Independence Day with discounts on new high-end devices. Dell has marked down many of its high-end Alienware products, including several gaming laptops, desktops, and gaming monitors. Get yourself a new high-end gaming machine to help make this Independence Day a memorable one.

Dell Alienware M15 R5 Ryzen Edition AMD Ryzen R7 5800H 15.6-Inch 1080p 165Hz Gaming Laptop w/ Nvidia GeForce RTX 3060 GPU, 16GB DDR4 RAM and 512GB PCIe SSD ($1,299.99)

The new Dell Alienware M15 R5 features an updated thermal solution that helps you get the most out of your hardware. The system has powerful components, including an AMD Ryzen R7 5800H octa-core processor and an Nvidia GeForce RTX 3060 graphics chip that can run games fluidly on the notebook’s 165Hz 1080p display. It also has an RGB LED keyboard, 16GB of RAM, and a 512GB PCIe SSD. This new system hasn’t been out long, but you can get one now from Dell marked down from $1,649.99 to just $1,299.99.

Dell G15 Intel Core i7-10870H 1080p 15.6-Inch Gaming Laptop w/ Nvidia GeForce RTX 3060, 16GB DDR4 RAM and 512GB NVMe SSD ($1,149.99)

Dell’s new G15 gaming laptop has an edgy aesthetic design and comes equipped with powerful hardware for running the latest games. This particular model is equipped with an Intel Core i7-10870H processor as well as an Nvidia GeForce RTX 3060 graphics chip that can run games with high settings. The notebook also has a 120Hz display that helps to make gaming on the system all that much more enjoyable by making games feel faster and more responsive. Today you can get this system from Dell marked down from $1,399.99 to just $1,149.99.

Dell Alienware Aurora Ryzen Edition AMD Ryzen 7 5800 Gaming Desktop w/ Nvidia GeForce RTX 2060 Super GPU, 32GB DDR4 RAM and 1TB NVMe SSD ($1,499.99)

This Alienware desktop features an edgy, rounded design and powerful gaming hardware capable of running most current AAA titles with maxed out graphics settings. In addition to looking cool, this system was also designed to provide improved airflow over the older Aurora desktops, which means the hardware inside will also run cooler as well. For a limited time you can get this system from Dell marked down from $1,869.99 to $1,499.99.

Dell Alienware AW2521HFL 240Hz 1080p 24.5-Inch Gaming Monitor ($269.99)

Enjoy your games to the fullest with a blazing 240Hz monitor! In addition to its extreme refresh rate, this display features support for both FreeSync and G-Sync and a fast 1ms response time giving you a highly responsive gaming experience. Right now you can get this display from Dell marked down from $509.99 to $269.99.

Featured Deals:

Dell Gaming Laptops

Dell Gaming Desktops

Dell Gaming Monitors and Office Displays

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

July 1st 2021, 6:48 pm