
ET Deals: $30 off Echo Show 8 Out Tomorrow, OptiPlex 3070 Desktops Under $600, Disney+ with Hulu and

ExtremeTech

Amazon’s new Echo Show 8 is set to launch tomorrow, but you can pre-order it now and save $30.

Amazon Echo Show 8 ($99.99)

Amazon’s Echo Show 8 features an 8-inch HD display and is compatible with a wide range of Amazon- and Alexa-enabled services. It can work as a display for home security devices like Ring’s video doorbell, and it can be used for calling people and numerous other functions. The Echo Show 8 officially launches on November 21, but you can pre-order it now and get it marked down from $129.99 to $99.99.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 20th 2019, 5:48 pm

AMD Promises New Architecture for Zen 3, Adopts Intel Tick-Tock Model

ExtremeTech

Supercomputing 19 (SC’19) has been in full swing this week, and AMD has made a number of high-profile announcements at the event. There are new Epyc deals with the EU and the San Diego Supercomputer Center; Amazon is planning to deliver Rome-based cloud computing instances; a pair of Microsoft Azure instances intended for HPC workloads are now available in preview; and a new version of ROCm, AMD’s Radeon Open Compute initiative, will roll out soon. The company also notched the first TOP500 win for Rome, using the AMD Epyc 7H12 CPU.

All in all, it’s a successful show for a company that didn’t have much to show at all just a few short years ago — but it’s comments by Forrest Norrod, GM of AMD’s Datacenter and Embedded Solutions Business Group, that catch the eye. According to Norrod, AMD’s simultaneous efforts in CPU and GPU will continue to grow in 2020. AMD has plans to support high-speed CPU-GPU pairings via Infinity Link in future server chips, and it wants to support other standards, like the Intel-backed CXL, as well. According to The Street, Zen 3 won’t be an extension of the previous Zen architecture like Zen 2 was.

The framing of the quote is a bit unclear, however. The Street writes: “Norrod observed that — unlike Zen 2, which was more of an evolution of the Zen microarchitecture that powers first-gen Epyc CPUs — Zen 3 will be based on a completely new architecture.” In the very next paragraph, however, Norrod is quoted as saying that Zen 3 will deliver performance gains “right in line with what you would expect from an entirely new architecture.”

Epyc 2 improvements

AMD, according to Norrod, is confident in its ability to drive “significant IPC gains” each generation. The Street reports that AMD plans to rely on the tick-tock cadence that Intel popularized for a decade, in which a “tick” represents deploying an existing architecture on a new process, while a “tock” refers to a new architecture on an existing process. Rome is a tick, despite the improvements from Zen to Zen 2; Milan will be a tock; and Genoa will be a tick again.

There are some oddities in this framing. The first part of the paraphrased quote implies that Zen 3 is a new architecture, while the second statement — which does include a direct quote — says that Zen 3’s performance improvements are right in line with what you would expect from a new architecture. My take is that the word “architecture” is playing two roles here. There are major architectural shifts, like the jump from Bulldozer to Zen, and smaller updates that improve a CPU’s underlying performance but don’t represent a fundamental change to how it operates. AMD’s tick-tock cadence also seems to be a bit different from Intel’s, in that it may include more room for actual evolution of the CPU architecture, even during “ticks,” given how much Rome improved on Naples while shifting from 14nm to 7nm.

Companies don’t actually deploy all-new architectures very often. Intel’s Ice Lake is an evolution of Skylake, which itself can be traced back to either Nehalem (if you date from Intel’s adoption of features like an integrated memory controller) or to Sandy Bridge (if you want to trace lineage using features like an op cache). Of course, in between these major releases, there are plenty of evolutionary changes that also get described as architectural updates. There are good reasons for companies to try to standardize on architectural features — it helps maintain good performance and backward compatibility over the long term.

I suspect that what Norrod is saying here is that AMD’s Zen 3 update will be big enough to qualify as a major architectural overhaul. It may include more significant changes to the underlying Zen design than AMD has made to date. By the time Zen 3 debuts, AMD will have had Ryzen in-market for over three years, and that’s long enough to begin folding ideas the company had after Zen launched into Zen 3’s core design. I seriously doubt the company would jettison the Zen architecture altogether, but it’s possible that significant improvements are planned.

As for rumors of improved clocks on 7nm+, TSMC hasn’t given any guidance pointing to dramatic improvements from EUV, so we’ll have to see where that takes us. This is also the first time the foundry has built big-core x86 CPUs at these TDPs, so we can’t even look to historical scaling for a sense of whether AMD will be able to bring clock speeds up.

Each generation of Zen has delivered significant IPC improvements, and AMD has been hitting its IPC targets. Norrod isn’t giving details, but he seems to be setting the stage for further uplift in 2020 on the 7nm+ node. After a year as big as 2019 has been for AMD, my expectations for further improvements in 2020 were modest, but AMD seems to think it can continue delivering double-digit gains year-on-year. We haven’t said much about Intel in all this — Ice Lake delivered significant IPC improvements over Skylake (~1.18x), but traded off frequency against its older cousin. Intel’s 10nm is currently only shipping in mobile, so we can’t comment on how future desktop products may evolve.

Now Read:

November 20th 2019, 5:48 pm

Doctors Are Testing Human Suspended Animation for the First Time

ExtremeTech

The stasis/suspended animation chambers, in Alien

Science fiction is full of examples of people entering suspended animation or hibernation during a long space journey, but that might not be fiction for much longer. Doctors at the University of Maryland Medical Center are putting people in suspended animation for the first time. There’s no spaceflight involved, though. Their use case is much more down-to-earth: saving trauma victims in the emergency room. 

The hospital has assembled a team to conduct the suspended animation experiment, but the rules governing the testing are understandably strict. Doctors aren’t just asking people off the street to go into medical hibernation, but people who arrive at the hospital with serious trauma like a gunshot or stab wound might undergo the procedure. 

The process, known as emergency preservation and resuscitation (EPR), involves rapidly cooling the body to about 50 degrees Fahrenheit (10 degrees Celsius) by replacing the patient’s blood with cold saline. At a normal body temperature of roughly 98.6 Fahrenheit (37 Celsius), your cells need a regular supply of oxygen to continue functioning. Five minutes of oxygen deprivation is enough to cause irreparable brain damage, but cooling the body slows or halts metabolic activity in the cells, stops the heart, and preserves tissues. 

Only patients who arrive at the hospital with acute trauma, including the loss of half their blood volume and cardiac arrest, can undergo the procedure. Under normal conditions, the chances of survival in this situation are less than five percent. The FDA has given the University of Maryland Medical Center special permission to conduct this study because there’s no other viable treatment for these people. With EPR, the surgical team has two hours after the patient’s blood is drained to repair the damage. At that point, the patient’s blood is returned and the heart restarted.

We don’t know how well EPR works yet, but the team will eventually release a paper detailing the results. The plan is to compare ten patients who underwent EPR with ten who had similar injuries but could not get EPR because the team was not in the hospital at the time. If EPR can boost that five percent survival rate, it could become a new standard treatment for severe trauma. As for hibernating while traveling to other worlds, that’s still science fiction. Cells can experience so-called “reperfusion damage” when reoxygenated, and the damage gets more severe the longer a patient was suspended. It may be possible to develop drugs that limit reperfusion injuries, but first, we have to make sure EPR works on the ground.

Top image credit: Alien

Now read:

November 20th 2019, 3:48 pm

Pirelli Designs 5G ‘Cyber Tire’ That Reports on Road Conditions

ExtremeTech

Carriers around the world are working to roll out 5G networks, and they hope to connect not just phones but also smart home devices, wearables, and cars. Your next car might have its own 5G connection, and tire-maker Pirelli wants to take advantage of that to make roads safer. It has developed a new sensor-studded tire that reports on road conditions and shares that data with other cars over 5G. 

The idea is that a high-tech tire on your car would detect potentially hazardous road conditions, and beam an alert to nearby vehicles. In a demo, Pirelli’s “Cyber Tire” detected a potential hydroplaning risk and transmitted it to cars approaching that spot in the road. This is a type of vehicle-to-everything (V2X) technology, a catch-all term for automotive tech that communicates via wireless infrastructure. Pirelli says the key to making this work is 5G. 

You can get 5G data service in a few places with select phones and mobile hotspots, and the experience isn’t dramatically different from 4G LTE. In fact, it might be worse at certain things. Most US 5G networks are running on millimeter-wave frequencies, which are many gigahertz higher than 4G. These signals have high bandwidth but poor range. Meanwhile, there are just a few “sub-6” bands available for 5G, like the one in use on Sprint. These are only a little faster than 4G, but they have similar range.

For carriers, 5G is all about network efficiency. These networks can connect more devices, and they can customize 5G for specific applications. So, you might need very high bandwidth for one type of device, but ultra-low latency might be more important for another. The latter is probably more important for Pirelli’s Cyber Tires. They probably don’t need a lot of bandwidth to transmit location-aware alerts to nearby cars, but timing is key. You would, however, need a lot of smart tires on the road before there’s enough data for it to be useful. 

If this technology appears in cars over the next few years, it could help to alert drivers with annoying flashing dashboard lights or enable select driving features automatically. In the somewhat distant future, this data could feed into self-driving car systems that don’t require human input at all. However, 5G and autonomous driving are still a long way from being ubiquitous. It could be a while before your car steers itself around potholes.

Now read:

November 20th 2019, 12:59 pm

This Is What the PlayStation 5 Controller Might Look Like

ExtremeTech

Sony is planning to launch the PlayStation 5 in time for the holiday season next year, but we know almost nothing about the console. The company has dropped a tidbit here and there, but it’s talked mostly about the controller. Now, we know what that controller might look like thanks to a Japanese patent filing. There are no huge surprises here, but there are some mysteries. 

If you go back a few generations, Sony’s DualShock game controllers used to have a very distinctive shape with longer, tube-like grips that extended down from the button clusters. The DualShock 4 smoothed out that design and flattened it a bit, making the controller closer in shape to the Xbox controller. 

The alleged DualShock 5 for the PS5 looks very similar to the DualShock 4, but it’s a bit wider and more rounded. The DualShock 5 still has the symmetrical thumbsticks, but they protrude less from the body of the controller. The touchpad is still there in the center, but the light bar appears to be gone. There’s also a USB Type-C port on the top for charging (thank goodness). At the bottom of the controller, the patent drawings show a rectangular inlay that wraps around the edge of the controller — we have no idea what this could be for. 

The slightly bulkier frame of the DualShock 5 could be necessary to enable some of Sony’s new features. The company has already explained that it will use a more precise “haptic” vibration system in the DualShock 5 instead of the traditional rumble system that makes the whole controller vibrate. That’ll surely take up more space, as will the new “adaptive triggers.” According to Sony, developers will be able to specify levels of resistance for the L2 and R2 buttons to simulate different activities. 

Nothing is final yet, but we’d put good money on the DualShock 5 looking very much like these drawings. Images and sketches of a purported PS5 console have leaked as well, but these appear to be a developer kit. It’s a V-shaped, vented monstrosity. The final hardware will probably look at least a little more elegant. 

You can expect Sony to show up at E3 next year with a lot more information about its upcoming console. Until then, all we can do is speculate.

Now read:

November 20th 2019, 10:24 am

‘Dream Chaser’ Space Plane Features Fancy Space Trash Can

ExtremeTech

All the current launch platforms certified to make supply runs to the International Space Station (ISS) use a parachute to return to Earth after each mission, although SpaceX has designs on propulsive landings with its Dragon capsule. Sierra Nevada Corporation has something else in mind with its uncrewed Dream Chaser spacecraft: a fully reusable space plane that can carry a secondary, non-reusable cargo module.

Sierra Nevada Corporation has talked about the Dream Chaser in the past, but it just revealed new details about the vehicle at a press conference at NASA’s Kennedy Space Center in Florida. The star of the show is the new cargo module, which the company calls “Shooting Star.” That module will nest inside the Dream Chaser during launch and station approach. Then, it can autonomously dock with the ISS and pick up waste material. Yes, it takes out the trash, making it probably the fanciest rubbish bin in the world. 

Currently, the ISS crew loads important scientific materials onto cargo modules for the return to Earth, but those modules also carry some waste materials. Shooting Star would pick up the trash and then guide itself into the atmosphere, where it would burn up. Meanwhile, the Dream Chaser can go on with its mission and reserve its own space for materials ground teams actually want rather than garbage. The spacecraft will land after each mission at the Kennedy Space Center’s Shuttle Landing Facility.

Behold, the fanciest space trash can in the world.

The Dream Chaser sports a few more abilities existing cargo vehicles lack. For example, it can become a short-term orbital platform with an optional inflatable module and additional power capacity. It has also been designed with NASA’s Lunar Gateway in mind. It will be able to dock with the station (with the help of an added satellite bus), allowing it to play a part in the Artemis program to return humans to the lunar surface. It will have to bid on NASA contracts before that happens, though. 

Sierra Nevada Corporation is scheduled to send the Dream Chaser on its first of at least six missions to the ISS in 2021. The company doesn’t have to worry about designing a rocket to launch the Dream Chaser, though. It’s partnered with United Launch Alliance to send the Dream Chaser into space atop Vulcan rockets. That rocket is expected to launch for the first time in mid-2021. 

Now read:

November 20th 2019, 8:25 am

AMD Launches Budget $49 Athlon 3000G With Overclocking Support

ExtremeTech

AMD’s new $50 Athlon 3000G is shipping now, with overclocking options and an integrated GPU that AMD believes give it a leg up against Intel in the budget segment. We mostly review higher-end products here at ET, but it’s important to keep an eye on the performance of more modest parts for budget builds. Whether the Athlon 3000G is a better chip than the Pentium G5400 depends on what kind of workloads you intend to throw at the system.

First, one important limitation. AMD doesn’t appear to have communicated this in its official PR when it announced the chip roughly two weeks ago, but the Athlon 3000G isn’t guaranteed to be supported in X570 motherboards. AMD is leaving this up to its partners. This seems unlikely to be a major problem — the X570 is a high-end chipset intended for high-end systems, while AMD’s B350, B450, or X470 chipsets would be better fits for this kind of product.

The Athlon 3000G is a 3.5GHz CPU with two cores, four threads, and a 4MB L3 cache. It’s based on AMD’s Zen+ CPU core, built on GlobalFoundries’ 12nm process, and pairs with a Vega 3 GPU (192 stream processors). It’s unlocked for overclocking and supports up to DDR4-2933 with a 35W TDP. AMD ships the CPU with a cooler rated for 65W. According to TechPowerUp, which put the CPU through its paces, that level of cooler enables plenty of overclocking. They were able to get their chip up to 4GHz, which sounds about right for budget 12nm GF silicon. You might get a bit higher if you win the silicon lottery, but a 1.14x overclock on a $50 chip is a pretty solid deal. Compared with the Athlon 200GE, which this chip replaces, the 3000G packs an extra 300MHz of CPU clock and 100MHz of GPU clock. It’s intended to compete against Intel chips like the Pentium G5400, which runs at 3.7GHz and costs $60, with a 58W TDP.
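
For readers who want to check the math on that overclocking figure, here is a quick Python sketch using only the clocks quoted above. It is a simple ratio, not TechPowerUp’s methodology, and real-world results will vary with silicon quality:

```python
# Back-of-the-envelope check of the overclocking headroom quoted above.
# Clock figures come from the article; real-world results vary with silicon quality.
base_clock_ghz = 3.5   # Athlon 3000G stock clock
oc_clock_ghz = 4.0     # clock TechPowerUp reached on the stock 65W cooler

speedup = oc_clock_ghz / base_clock_ghz
print(f"Overclock ratio: {speedup:.2f}x")   # ~1.14x, matching the figure above
```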

TechPowerUp compared the 3000G against the G5600, which is clocked about 5 percent faster than the G5400. That clock speed difference is enough to nudge several tests toward Intel that might otherwise have broken for AMD, but it’s not a huge gap. Overclocking potential on the GPU was particularly good: the GPU responded well to core clock increases, and a Vega 3 configuration is small enough that relying on desktop memory didn’t leave the platform overly bandwidth-bound. The downside, of course, is that the reason it isn’t bandwidth-bound is that it isn’t all that powerful in the first place.

Image by TechPowerUp

Some games, like BFV, run tolerably at 720p at minimal detail levels. The GPU comparison between the Vega 3 and Intel UHD 630 is easily the APU’s strongest performance area. Others show the CPU bracketing the performance of the G5600, with the un-overclocked Athlon 3000G being a bit slower than the G5600 and the overclocked variant narrowly faster:

Image by TechPowerUp

This is a fairly common outcome in this comparison. According to TechPowerUp, the G5600 is 1.22x faster than the non-overclocked 3000G, but that’s a claim that actually needed more explanation than it got within the article. Full quote below:

Overall performance is roughly 15% behind the Ryzen 3 1200 (which is a quad-core processor without SMT). Intel’s Pentium G5600 comes out 22% ahead, but does cost significantly more due to supply shortages and mark-ups. We didn’t test the Pentium G5400, but I would estimate it to be slightly slower than the 3000G.

It’s not clear at all why the G5400 would be “slightly slower” than the 3000G if the G5600 is 1.22x faster. This implies a >1.22x performance difference between two CPUs whose only difference is a small clock gap. I’d recommend checking TPU’s extensive list of benchmarks for workload-specific results because some of the tests with the largest gaps between Intel and AMD are also tests that may be less applicable to common workloads for a machine of this type. There’s nothing wrong with running atypical workloads on systems — I touched on this a bit in my article a few months back on real-world benchmarking — but it’s always a good idea to check performance in specific areas. Which chip suits you best may depend on the specific workloads you intend to run.
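
To make that inconsistency concrete, here is a rough Python estimate that assumes performance scales approximately with clock speed between the two Pentiums. That scaling assumption is mine; the 1.22x figure is TechPowerUp’s, and the clocks are Intel’s published specs:

```python
# Rough consistency check of the quoted numbers, assuming performance scales
# approximately with clock speed between the two Pentiums (a simplification).
g5600_vs_3000g = 1.22                          # TPU: G5600 is 22% faster than the stock 3000G
g5600_clock_ghz, g5400_clock_ghz = 3.9, 3.7    # Intel's published base clocks

# If the G5400 gives up only its clock deficit relative to the G5600...
g5400_vs_3000g = g5600_vs_3000g * (g5400_clock_ghz / g5600_clock_ghz)
print(f"Estimated G5400 vs. 3000G: {g5400_vs_3000g:.2f}x")   # ~1.16x faster, not slower,
# which is hard to square with the claim that the G5400 would be "slightly slower."
```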

The strongest use-case for a CPU like this is if you need a little bit of multimedia or gaming horsepower but aren’t trying to build a major performance rig (or a system that will ever be mistaken for one). Added bonus points for the overclocking headroom — an extra 10 to 15 percent clock isn’t going to rewrite the record books, but squeezing a little bit of free performance out of CPUs is always fun. TechPowerUp gives the 3000G a nod overall, but recommends the i3-9100KF as an alternative for gamers who need ultra-affordable components and want to maximize cost-savings.

Now Read:

November 20th 2019, 8:25 am

NASA Confirms Water Vapor Erupting from Europa

ExtremeTech

Jupiter’s moon Europa has been the subject of intense study ever since the Voyager probes sent back images of its cracked, icy surface. There’s a strong possibility Europa has a subsurface liquid ocean. Some observations have shown geysers erupting from the moon, but until now we haven’t been able to verify that they contain water. NASA’s Goddard Space Flight Center has confirmed water vapor bursting forth from the moon’s surface.

Europa is the smallest of the “Galilean” moons — those discovered by Galileo in the 17th century. It wasn’t until the 20th century that we got our first close-up view of the moon, and we were in for a surprise. Europa is covered in “lineae,” the reddish streaks featuring prominently in all the images you’ve seen of the moon. Scientists believe these are cracks in the moon’s icy crust, through which water leaks out and re-freezes. The constant remodeling of the surface means Europa is the smoothest object in the solar system, with no mountains or plateaus visible, although it might have icy spikes around the equator.

Past observations from the Hubble telescope have shown liquid plumes erupting from Europa, similar to what we see on Saturn’s moon Enceladus. If Europa has a liquid water ocean under all that ice, you’d naturally expect the geysers to consist of water. Europa isn’t as active as Enceladus, but researchers scanned the moon 17 times between February 2016 and May 2017 hoping to catch sight of water. On one day, April 26, 2016, the W. M. Keck Observatory in Hawaii spotted the spectral fingerprint of water vapor rising above Europa. The detection amounted to around 2,000 metric tons of water, which is enough to fill an Olympic-sized swimming pool.
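
As a rough sanity check on that comparison, here is a quick Python sketch. The pool dimensions below are the nominal 50 x 25 x 2 meter minimums, which are my assumption rather than a figure from NASA:

```python
# Sanity check: how does 2,000 metric tons of water compare to an Olympic pool?
pool_volume_m3 = 50 * 25 * 2     # nominal Olympic pool: 50 m x 25 m x 2 m deep
water_tonnes = 2000              # mass reported from the Keck detection
water_volume_m3 = water_tonnes   # 1 metric ton of water occupies ~1 cubic meter

print(f"Fraction of an Olympic pool: {water_volume_m3 / pool_volume_m3:.0%}")   # ~80%
```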

The confirmation of water vapor around Europa is important, but the team notes that there may be less water escaping from the moon than expected. Plumes may be rare and highly localized, suggesting that few large breaches open in the ice layer. The only way to know for sure what’s going on is to study Europa up close.

NASA plans to move forward with the Europa Clipper mission in 2025, which will be the first important planetary mission launched with the Space Launch System (SLS). Europa Clipper will conduct 45 flybys of the moon, using a suite of cameras, spectrometers, and radar to unravel the mysteries of this celestial body. If Europa does have a reservoir of liquid water inside, there’s a chance it could also host alien life. 

Now read:

November 19th 2019, 5:11 pm

Bill Gates-Backed Solar Startup Aims to Replace Fossil Fuels in Heavy Industry

ExtremeTech

One of the most difficult problems facing first-world countries as they search for more environmentally sustainable manufacturing methods is the issue of fossil fuel use in heavy industry. According to a Bill Gates-backed startup, Heliogen, it’s solved a major problem with industrial manufacturing and developed solar technology that can replace the conventional fossil fuels often used in these types of manufacturing.

Vox published a recent article on the problem of industrial heat, and I’d recommend giving it a look as an overview of the problem as it exists today. Heavy industry accounts for roughly 22 percent of global CO2 emissions. About 10 percent of global emissions come from the combustion required to manufacture goods like steel and cement. Cars and planes, in contrast, account for about 6 percent and 2 percent of global emission totals, respectively. Importantly, however — and Vox spends a fair bit of time on this issue — there is no known path forward for decarbonizing industrial heat.

Well. Not until now. Heliogen claims it can generate far more heat from solar operations than previous efforts have managed. The startup says it uses an array of AI-steered mirrors to concentrate sunlight. In and of itself, that’s nothing new — concentrating solar plants like Ivanpah have been operating for several years. Early problems with the plant also functioning as a sort of bird death ray have been ameliorated by adjustments to mirror positioning.

What’s special about Heliogen’s technology isn’t the broad strokes of the method; it’s the output temperature. Ivanpah runs at a temperature of 500C. Heliogen has developed a solar concentration technology capable of reaching 1,000C. And 1,000C is hot enough to drive a number of industrial processes, including cement production. The company’s long-term roadmap calls for commercializing 1,500C solar, which would allow for hydrogen and syngas production.

“We are rolling out technology that can beat the price of fossil fuels and also not make the CO2 emissions,” Bill Gross, Heliogen’s founder and CEO, told CNN Business. “And that’s really the holy grail.”

As a startup, Heliogen obviously isn’t handing out price tags for its systems just yet, but the ability to build industrial processing centers that rely on solar power for heat generation could be critical to meeting the demand for building materials like concrete and steel. Delivering these improvements at costs below those of traditional fossil fuels would make the transition to a greener power source an easy one. The technology could find a particular home in places like China, where air pollution from burning coal has been especially harmful and the country has prioritized new power sources like solar, wind, and natural gas over continued reliance on coal.

Heliogen claims to have used AI to discover exactly how to align its mirrors for maximum temperatures, and has stated that it generates so much excess heat, it might be able to harness the same process to create hydrogen at scale. While hydrogen can theoretically be produced via electrolysis (and electrolysis can be powered by renewable energy), the current hydrogen economy runs on hydrogen produced by steam reforming of natural gas, which doesn’t count as “green” in any context.

As for how to keep plants running when the sun isn’t shining, Heliogen has stated it will rely on storage systems of an unspecified type. The company has said it will announce its first customers soon, and that its use of AI and software constitutes a core business advantage that makes the underlying technology affordable (along with the decreased reliance on fossil fuels and their associated costs).

A discovery like this could be huge if the technology scales. That Bill Gates has emerged as a backer is another sign Heliogen’s technology may be the real deal. Given that cement production alone accounts for 7 percent of global CO2 emissions, reducing industrial energy usage by building concentrated solar would make a meaningful dent in worldwide GHG emissions.

Now Read:

November 19th 2019, 4:13 pm

ET Dell Early Black Friday Deals: Inspiron and Vostro Desktops under $400, UltraSharp Monitors start

ExtremeTech

Today you can get a high-end curved display from Dell that features an ultrawide 3,440×1,440 resolution. Not only is the display heavily discounted, but it comes bundled with a $200 gift card.

Dell UltraSharp U3415W 34-Inch Curved 3,440×1,440 Monitor + $200 Gift Card ($699.99)

Dell’s high-end UltraSharp U3415W display features an ultra-wide curved IPS panel with a resolution of 3,440×1,440. The display also has excellent color support, covering 91 percent of the CIE1976 color gamut, and it’s currently marked down from $949.99 to just $699.99 at Dell. To sweeten the deal, Dell is also offering it with a $200 gift card.

Dell Inspiron 3671 Intel Core i5-9400 Desktop w/ 8GB DDR4 RAM and 1TB HDD ($369.99)

If you would prefer a desktop for work or home use, this system offers solid performance for everyday tasks. The Intel Core i5-9400 processor that comes in this system has six CPU cores clocked at up to 4.1GHz. The system also has 8GB of DDR4 RAM and a DVD-RW optical drive. For a limited time, you can get it marked down from $599.99 to $369.99 when you opt to purchase it with a financing option from Dell.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 19th 2019, 3:41 pm

Valve Ignites Entire Internet With Single Tweet About Half-Life

ExtremeTech

Last night, Valve announced it would finally pick up the tattered threads of its own longest-running game franchise — at least, if you squint a lot and are willing to be hopeful about the future. Half-Life: Alyx, a new VR flagship title from Valve, will be unveiled on Thursday at 10 AM Pacific / 1 PM Eastern. The company announced the news in a tweet.

Half-Life: Alyx is designed for PC VR systems from companies like Oculus and HTC, though presumably Windows Mixed Reality headsets and Valve’s own Index will both be supported. This is the VR flagship title that’s been teased and rumored for several years at Valve; Ars reports that this is the VR game we were supposed to get in 2019, with the delay chalked up to infamous “Valve Time.” The game is a prequel, not a sequel — the long-desired, long-rumored Half-Life 3 remains a dead letter.

Under most circumstances, discovering that HLA is a prequel might kill a good portion of my interest; prequels don’t exactly have the best track record of fleshing out worlds or stories in the most compelling fashion. In this case, however, my interest is piqued for several reasons. First, we only know what happened in-between the events of Half-Life and Half-Life 2 in the broadest sense. Half-Life may have occurred on May 16, 2003 (the exact date is unknown, see this discussion of the timeline for details) and Gordon was in stasis for nearly two decades. HL2, therefore, occurs somewhere between 2015 and 2029, depending on exactly how long Gordon was on ice. We know that during this time, the Combine conquered Earth (during the Seven Hour War) and the initial human / Vortigaunt resistance formed.

Alyx Vance, with robot companion Dog. Image by the Half-Life Wikia

What could Half-Life: Alyx explore? The creation of Dog, her robot sidekick from Half-Life 2, the early years of the human resistance, and the thrilling transformation of Barney from security guard to intrepid resistance fighter (Half-Life: Barney doesn’t have the same ring, no matter how you slice it). If we really want to get crazy, we could imagine potential Portal crossovers (at least one that clarifies a bit more of Aperture Science’s impact on the world) or an answer to what happened to Adrian Shephard after the events of Half-Life: Opposing Force. Valve has never clarified what the G-Man did with Shephard, so this plot point could theoretically be revisited.

HLA could sport its own unique interface tool for changing how players interact with the game world, according to Ars. If you’ve played HL2, you’re likely aware that the game features a unique weapon that allows you to manipulate the environment: the Gravity Gun. At the time, using the Gravity Gun to manipulate the world around you felt borderline revolutionary, though not many titles have chosen to continue down the path of emphasizing physical interactions and real-world object manipulation in the same fashion.

According to multiple sources, the Gravity Gun of HL2 will be followed by a new type of object manipulation in HLA, using the so-called “Grabbity Gloves,” a name I find simultaneously brilliant and annoying. Supposedly these are used for “point-and-snag” object manipulation, rather than requiring players to walk over and physically pick up objects off the floor. These may favor the controllers used on the Valve Index, which leave your hands free to move. There are rumors that the game could basically be a master-class in how to build environments for VR players.

VR has been looking for a killer app for a long time. It seems absurd to think that Valve — a company known far more for selling games than for building anything recent at this point — could deliver the kind of killer app that everyone’s been hoping for. This could also be the first ground-up use of the Source 2 engine for a new FPS; thus far, the engine has only been used for Dota-related titles and for Artifact. As for whether the game will be good, or a worthwhile entry in one of the most famous unfinished franchises in gaming, it’s probably best not to speculate. Of all the questions in play, that one seems the most difficult to answer.

Now Read:

November 19th 2019, 2:56 pm

Logitech’s New Adaptive Gaming Kit Builds on Microsoft Xbox Accessibility

ExtremeTech

Last year, Microsoft launched its Xbox Adaptive Controller. The peripheral, which is designed for gamers with various accessibility challenges, was intended to serve as a central hub for the specialized components some gamers need to interact with titles more effectively. While the XAC was hailed for being affordable and innovative, it only provided two buttons and a D-pad, with 19 separate ports for other buttons, dials, and peripherals to connect.

Microsoft’s rationale for the design was simple: It isn’t possible to build a controller (or even a handful of controllers) that could cover every conceivable use-case that various people require in order to play. Rather than trying to build peripherals for specific types of impairment, why not build a common platform that gamers could use to construct their own controllers?

The XAC, however, isn’t a complete platform in its own right. When we talked to Logitech, the company stressed that it wasn’t really enough to buy an Xbox Adaptive Controller and start playing; players need to purchase additional buttons and devices, which can run $40–$75 per individual button. Logitech has developed its own add-on kit for the XAC, the Logitech Adaptive Gaming Kit. The $99 kit ships with 12 discrete buttons that plug into the XAC’s 3.5mm jacks. The 12 buttons are:

3x round buttons (2.6-inch diameter)
3x round buttons (1.4-inch diameter)
4x touch buttons
2x variable triggers


Twelve buttons for $99 is an excellent deal if Logitech’s quality is as good as advertised. The kit also comes with two configurable game mats with a hook-and-loop system for mounting the buttons on a board. According to Logitech, even the packaging design is specifically intended to be accessible, with exterior tape that pulls away in a single easy motion and no heat-sealed plastic packaging inside the box to make it difficult for users to assemble the hardware. Labels are included for each button so that gamers can mark them according to the controller button they replace. Even the color of the mats was a deliberate choice, to make certain that gamers with low vision can properly distinguish the buttons from the background of the mounting mat. Logitech has also published a video about the design and engineering work that went into the Adaptive Gaming Kit.

Logitech spent several years on the project, paring away at margins and working to bring down the overall product price. During our discussion with the company, Logitech Product Manager Mark Starrett walked us through various aspects of how the buttons were designed, including the work that went into ensuring they press equally well from any angle or when hit off-center, and how variable triggers were important to bringing racing games to the Xbox Adaptive Controller.

I don’t want to pass judgment on a product I haven’t tested, but if Logitech’s hardware lives up to its billing, the company has done some real good here — and hopefully helped make gaming more accessible to a group of people who wouldn’t otherwise have been able to afford it.

Now Read:

November 19th 2019, 11:06 am

Google Has Added 10 More Stadia Games in Time for Launch

ExtremeTech

As recently as last week, Google’s position was that it would launch Stadia with a carefully curated list of just 12 games. The list did include some big names like Destiny 2, Red Dead Redemption 2, and Mortal Kombat 11, but quality isn’t the only consideration. Gamers on any other platform have their pick of many more titles, so Google is changing course. It has almost doubled the number of launch games, adding ten more titles to the lineup just in time for public availability.

Here’s the full list of Stadia launch titles, newly added games in bold. The new games fill in some gaps in the store, but we’re still only talking about 22 titles total. 

Notably, some of the new launch games are of the “simulation” variety. While our experience with Stadia’s performance was very positive, it’s always possible the service will struggle to keep up with the flood of new users on launch day. A simulation game will play fine even if Stadia lags a few frames. The shooters on the platform won’t be so forgiving, and there are several more of those in the new batch.

Some of the new games were not slated for launch until 2020, based on Google’s rough timeline last week. Google’s Phil Harrison suggests that’s because Google was too conservative in its estimates and its partners managed to add Stadia support sooner than expected. That’s not the sort of turnaround that happens in the space of a few days, though. More likely, Google saw the response to its scant initial offerings and opted to do away with its artificially drawn-out release schedule.

All 22 Stadia launch titles are available for purchase today. Almost none of them were available during our review period. Currently, Stadia is only open to those who ordered the Founder’s Edition kit, but some of those won’t ship until later this month or early next. Founders all get three months of Stadia Pro, but you don’t lose access to games when that expires. You can still play Stadia at 1080p or upgrade to a Pro subscription for $10 per month. 

Now read:

November 19th 2019, 10:06 am

Hands on With AMD’s New Radeon Pro W5700 Workstation GPU

ExtremeTech

While gaming GPUs get most of the attention, workstation graphics cards are essential for many kinds of engineering, scientific, AI, and multimedia tasks. So I was eager to get a chance to test out a pre-release version of AMD’s new Radeon Pro W5700 workstation GPU ($799).

Radeon Pro W5700 By the Numbers

First, it was a relief to get a new GPU that sticks to a power envelope similar to prior versions. In my case, it was simple to swap it in for an Nvidia GTX 1080, because it doesn’t require more power and uses similar PCIe power plugs (8 + 6 pins versus the 8 + 8 pins on my EVGA 1080 FTW). To fit connectors for six monitors, the back of the card has five mini-DisplayPort 1.4 ports and one USB-C connector. Fortunately, it also comes with a couple of DisplayPort dongles and a DVI dongle. The mini-DisplayPort ports support DSC, so they can drive displays at up to 8K and 60Hz. The USB-C port can also supply up to 15 watts of power.

Inside, the “Navi” GPU is built on AMD’s new RDNA architecture using a state-of-the-art 7nm process to cram its 10.3 billion transistors onto a 251mm² die. The GPU has 36 compute units, 2,304 stream processors, and a base clock of 1183MHz, along with 8GB of GDDR6 RAM. As far as performance goes, AMD claims up to 0.56 TFLOPS at FP64, 8.89 at FP32, and 17.8 at FP16. Those specs put it behind the WX 8200, except for an improved fill rate and reduced power consumption (205 watts maximum versus 230 watts). However, the W5700 benefits from AMD’s new RDNA architecture, which the company says allows it to execute 25 percent more instructions per clock cycle. The W5700 is also ready for PCIe 4.0, with a theoretical bandwidth of up to 24.6GBps available.
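
Those throughput figures hang together in the usual way for a Navi-based part: peak FP32 is the stream processor count times two FLOPs per clock (a fused multiply-add) times the clock, FP16 doubles that, and FP64 runs at a 1/16 rate. A quick Python sketch follows; note that the boost clock below is inferred from the 8.89 TFLOPS figure rather than taken from a spec quoted here:

```python
# How AMD's quoted peak-throughput numbers relate to the shader count and clock.
stream_processors = 2304
fp32_tflops = 8.89                      # AMD's quoted peak FP32 throughput

# Peak FP32 = SPs x 2 FLOPs/clock (FMA) x clock, so the quoted figure implies:
implied_boost_ghz = fp32_tflops * 1e12 / (stream_processors * 2) / 1e9
fp16_tflops = fp32_tflops * 2           # FP16 runs at double rate
fp64_tflops = fp32_tflops / 16          # FP64 runs at 1/16 rate

print(f"Implied boost clock: {implied_boost_ghz:.2f} GHz")                 # ~1.93 GHz
print(f"FP16: {fp16_tflops:.1f} TFLOPS, FP64: {fp64_tflops:.2f} TFLOPS")   # 17.8 / 0.56
```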

Hardware encoding and decoding support is integrated through the Radeon Media Engine and includes encoding for H.264 and H.265, as well as decoding for those codecs and VP9.

Working With Radeon Software

I’m usually pretty skeptical of the utilities bundled with GPUs, but AMD includes some interesting ones with the W5700. First is ReLive, which allows the streaming of VR content to a wireless VR headset. I didn’t have one around to test personally, but I can see how it would provide a nice alternative to the thicket of wires that connect my Oculus now. There is also Image Boost, a clever way to increase the apparent sharpness of non-4K monitors. Turning it on lets you choose a virtual display resolution (up to 4K) and have the GPU rescale the results. I was testing on a system with one 4K and one 1920 x 1200 monitor, so I used the feature on the smaller monitor. After enabling it, I could more easily read small text on the display. Images did, in fact, appear somewhat sharper.

Radeon Pro Image Boost increases the apparent resolution of sub-4K displays

Running Workstation-class Applications

AMD provided us with several sets of relevant benchmark results for the W5700. The first set compares it to two older GPUs that it is designed to replace — Nvidia’s Quadro P4000 and AMD’s own Radeon Pro WX 7100 — and shows solid performance improvements across the board:

If your GPU is more than a year or two old there are some impressive performance gains from upgrading to a W5700

So, as you might expect, you can get more for your money now than you could two years ago. For those evaluating current model cards, AMD compared the W5700 with the similarly priced current model Nvidia Quadro RTX 4000 GPU. It certainly holds its own, but doesn’t blow it away:

AMD’s Radeon Pro W5700 goes toe-to-toe with Nvidia’s similarly-priced Quadro RTX 4000

One area where AMD believes the W5700 shines compared with the Quadro RTX 4000 is efficiency when both the CPU and GPU are in use. According to AMD’s benchmarks, when a CPU-intensive task like rendering runs alongside GPU work, the W5700 achieves a much higher frame rate. I look forward to testing that for myself in video processing (once Neat Video adds support for the W5700), where tasks like noise reduction can run on the GPU while encoding runs on the CPU.

One impressive result from W5700 benchmarks is improved efficiency when both GPU and CPU tasks are being run in parallel

The W5700 Is for Graphics, Not AI

AMD has done an excellent job of getting broad design-industry support for its ProRender software and the W5700, including headline applications like 3DS Max, Maya, Solidworks, Creo, Blender, Cinema4D, and others. So design tool users should feel right at home with a W5700, and be assured of getting excellent results for a value-priced workstation GPU.

The story is different for those who need GPGPU solutions for workloads like machine learning. It’s certainly not a secret that AMD has been playing catch-up in supporting the tools and frameworks that AI developers need. It’s been making progress, but a large number of projects are still built natively on Nvidia’s CUDA and won’t run on AMD’s GPUs. In some cases, there are OpenCL alternatives, especially for those running Linux. But for Windows users, the news isn’t as good. Google’s popular TensorFlow requires CUDA on Windows, for example. In my case, my neural network code is all built on either TensorFlow or Mathematica — which also requires CUDA — so I wasn’t able to benchmark the W5700’s training performance against my existing Nvidia GPUs.
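
If you’re not sure which side of that line your own machine falls on, one quick check is to ask TensorFlow what devices it can actually see. A minimal sketch follows; on a Windows box with an AMD card and no CUDA runtime, the GPU list simply comes back empty:

```python
# Quick check of whether TensorFlow can see a usable GPU on this machine.
# With no CUDA-capable device (e.g., an AMD card on Windows), the list is empty
# and training silently falls back to the CPU.
import tensorflow as tf

gpus = tf.config.experimental.list_physical_devices("GPU")
if gpus:
    print(f"TensorFlow sees {len(gpus)} GPU(s): {[gpu.name for gpu in gpus]}")
else:
    print("No CUDA-capable GPU visible; TensorFlow will run on the CPU.")
```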

When I asked AMD for advice, the company explained that its desktop GPUs are intended for graphics rather than machine learning tasks, and that for machine learning I should look to its datacenter offerings. I’m sure that makes sense strategically, but it means that if you want your workstation to do double duty, running traditional design tools alongside some machine learning projects, an AMD GPU is probably not a good choice.

Should You Upgrade to a Radeon Pro W5700?

If your productivity is bottlenecked by rendering performance and your GPU is more than a year or two old, you’ll see noticeable speedups by upgrading to a W5700. On the other hand, if you already have a current-model workstation GPU, the W5700 isn’t going to blow you away. In either case, if AI is an important part of your workload, be cautious about upgrading unless you can verify that your tools will run on the W5700. If you’re shopping for a workstation GPU, the $799 price tag of the W5700 puts it very close to Nvidia’s Quadro RTX 4000, and at least according to AMD’s benchmarks, it offers some advantages in floating-point and especially in combined CPU-plus-GPU performance.

Now Read:

November 19th 2019, 9:20 am

SC19: Intel Unveils New GPU Stack, oneAPI Development Effort

ExtremeTech

Intel made some significant announcements at Supercomputing 19 on Sunday, including new details on its Xe GPU architecture and a programming model it calls oneAPI. Both products are critical to the company’s future plans; Xe represents Intel’s first-ever push into data center GPUs and its first discrete GPU in nearly a decade. oneAPI is part of Intel’s effort to expand its total addressable market and to unify the compute space developers use to target its products.

The goal of oneAPI is to present a single unified development target for the four major types of workloads (scalar, vector, matrix, spatial) and the various components Intel manufactures (FPGAs, CPUs, GPUs, and other AI accelerators via products from companies like Movidius and Mobileye). One of the major goals of oneAPI is to abstract away the work of optimizing for any single specific architecture, allowing the developer to focus on writing code that runs on any underlying supported hardware.

The “write once, run anywhere” idea that Intel is going for with oneAPI is clearly reminiscent of Java, but there are some major differences between the two. Java compiles to bytecode and runs inside a JVM, while oneAPI is a set of libraries. Those libraries translate hardware-agnostic API calls into more specific low-level code that runs on whatever target hardware is present in the system. oneAPI isn’t completely without targeting — users are expected to define whether they’re writing code for an FPGA, CPU, or GPU, for example — but anything higher-level should be abstracted away.
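
As a loose analogy for that layering, here is a toy Python sketch of a single hardware-agnostic entry point dispatching to device-specific backends. To be clear, this is not oneAPI code; the registry and backend names are invented purely to illustrate the idea of abstracting the device away from the caller:

```python
# Toy illustration of the "single API, multiple backends" idea described above.
# This is NOT oneAPI; the registry and backends here are invented for the example.
import numpy as np

_BACKENDS = {}

def register(device):
    """Decorator that records a device-specific implementation."""
    def wrap(fn):
        _BACKENDS[device] = fn
        return fn
    return wrap

@register("cpu")
def _matmul_cpu(a, b):
    return np.dot(a, b)          # plain NumPy stands in for a tuned CPU kernel

@register("gpu")
def _matmul_gpu(a, b):
    # A real library would launch a GPU kernel here; we just reuse the CPU path.
    return np.dot(a, b)

def matmul(a, b, device="cpu"):
    """Hardware-agnostic entry point: callers never touch device-specific code."""
    return _BACKENDS[device](a, b)

print(matmul(np.eye(2), np.ones((2, 2)), device="gpu"))
```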

Ponte Vecchio: Intel’s First Data Center GPU

Intel also unveiled details on Ponte Vecchio, its first data center and HPC GPU. Ponte Vecchio is a medieval bridge in Florence; it isn’t clear why Intel picked this particular name, though the company may have opted for famous bridges as a codename source. ServeTheHome has extensive details on Ponte Vecchio, which is optimized more for compute workloads than for graphics. The design uses variable vector width and can handle both SIMT and SIMD data, offering top performance when both modes are used.

PV can scale to thousands of EUs (firmer figures were not offered) and supports data types like INT8, bfloat16, and FP16. Xe is said to offer a 40x increase in double-precision floating point per execution unit compared with Intel’s existing integrated graphics. Xe will use CXL for a coherent interconnect between CPU and GPU. The GPU also includes something called a “Rambo” cache connected to the XEMF (Xe Memory Fabric).

Image by ServeTheHome

Intel believes the cache is essential to its plan for improving performance when using large matrices. Intel’s new interconnects are both in play on this project, with EMIB used for HBM and Foveros used for Rambo. Ponte Vecchio will be built on Intel’s 7nm process. This may be the GPU that Intel expects to debut on that node when it’s ready for manufacturing.

OneAPI and Xe are both critical components of Intel’s broad future approach to computing. The company has articulated a multi-faceted future that leverages FPGAs, CPUs, GPUs, and other accelerators from the Loihi and NNP-I/NNP-T families to create an overall product ecosystem. We’ll start to see how those plays are coming together in 2020, as consumer Xe moves into production and next-generation products built on 10nm ship in greater volume.

Now Read:

November 19th 2019, 8:20 am

Ford Took Over Tesla’s Electric Avenue for Mustang Mach-E Introduction

ExtremeTech

Bill Ford Jr., right, at media scrum.

Can this be more than the merest of coincidences? (Probably. Yeah.) Ford held its biggest car introduction of 2019 Sunday night: the unveiling of the Mustang Mach-E battery-electric SUV. Hundreds of media, analysts, and bloggers gathered to see the car firsthand and hear Ford’s logic in transplanting the name of one of America’s most iconic sports cars onto an EV crossover — sporty, for sure, and fast, with one version rated for a Mustang-like 0-60 mph time of about 3.5 seconds. But a crossover.

Equally as interesting to some was Ford’s choice of the venue, Hawthorne Municipal Airport, four miles southeast of LAX. Why interesting: Because the 80-acre tract with its single short runway is home to Tesla Design studios and Elon Musk’s Space Exploration Technologies Corp., or SpaceX.

Southeast section of Hawthorne Municipal Airport. Ford rented an event space (under the Ford annotation) alongside the 4,956-foot runway for the Mach-E rollout. Look who the neighbors are: Tesla Design studio and SpaceX. The studio has been home to many Tesla product launches. (Image: Google)

Tesla over the years has used the design studio for its own vehicle introductions that include media, Tesla owners/intenders, and friends of the marque. Tesla CEO Musk unveiled the Model 3 there in March 2016.

This year, it was the site for the Tesla Model Y compact crossover/SUV rollout on March 14. Then on Friday, Nov. 1, Musk used the massive SpaceX facility to say he was taking a break from Twitter. The hiatus lasted until Monday.

There’s also a Tesla Supercharger stand, hard to find, tucked between the SpaceX building (X on the roof) and the Tesla Design building (the upside-down Tesla logo painted on the roof).

Ford CEO Jim Hackett with actor and one-time Ford plant employee Idris Elba, the show’s MC.

Ford didn’t have a lot to say about the location. At the press scrum after the formalities concluded, FoMoCo executive chairman Bill Ford Jr. was asked about the circumstances of the location, what with Ford being in Elon-land. Ford smiled and said, “That’s a coincidence.” And smiled some more.

The word around the airport was that Musk was not amused. Electrek quoted a pilot who flies in and out of KHHR as saying the airport was awash in stories about Musk’s disapproval. Musk has a private jet that is parked in a hangar adjacent to the 15 | 40 Productions event space that was snapped up by Ford for the big show Monday and media pre-briefings over the weekend. Although to be fair, the airport is so crowded that everything is next to everything at Hawthorne. When Ford ran test drives for the media Friday, the taxiway was closed down for a minute or two at a time so Ford could do 0-60 runs, and a line of slalom cones was placed in an alley between aircraft hangars.

Knowing Musk turned cranky because of his temporary neighbors from Dearborn no doubt made the cost of putting on a big show all the more delicious for Ford.

Now read:

November 19th 2019, 8:20 am

ET Deals: Samsung Early Black Friday Deals: 1TB Samsung 970 EVO SSD $150, 2TB Samsung 860 EVO SSD $2

ExtremeTech

Pick up one of Samsung’s 970 Evo 1TB SSDs and kiss your old HDD goodbye. With one or two of these large SSDs, who really needs to keep a hard drive?

Samsung 970 Evo 1TB M.2 NVMe SSD ($149.99)

Reading data at 3,500MB/s, this SSD pushes up against the limits of what an M.2 slot wired to four PCIe 3.0 lanes can deliver. With a total of 1TB of storage capacity, this drive removes any need for a second drive to store files, as it can hold more data than the average user typically needs. The drive is built using Samsung’s 3-bit MLC V-NAND, which offers excellent performance, and it’s rated to last for up to 1.5 million hours before failing. Right now you can get it from Amazon marked down from $169.99 to $149.99.
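
For context on that claim, a x4 PCIe 3.0 link tops out just under 4GB/s before protocol overhead, so 3,500MB/s sequential reads sit close to the practical ceiling. A quick sketch of the arithmetic:

```python
# Rough ceiling for an NVMe SSD on four PCIe 3.0 lanes.
# PCIe 3.0 signals at 8 GT/s with 128b/130b encoding; NVMe protocol overhead
# shaves off a bit more in practice.
transfer_rate_gts = 8.0
encoding_efficiency = 128 / 130
lanes = 4

link_ceiling_gbps = transfer_rate_gts * encoding_efficiency * lanes / 8   # GB/s
print(f"Raw x4 link ceiling: ~{link_ceiling_gbps:.2f} GB/s")              # ~3.94 GB/s
```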

Samsung The Frame QN55LS03R 55-Inch 4K QLED Smart TV ($1,097.99)

Samsung designed this TV to be more than just a TV. The product was designed to mimic the look of a picture frame, and it can easily be hung on the wall like a piece of art. When you aren’t watching television, a special art mode displays artwork on the screen, giving the appearance of a true piece of art hanging on your wall. Right now it’s marked down at Walmart from $1,999.00 to $1,097.99.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 18th 2019, 5:10 pm

Oculus Debuts PC Support for Quest Via USB-C Cable

ExtremeTech

A few months ago, Oculus declared that its Oculus Quest — the standalone VR headset intended for quick gaming sessions and based on the Snapdragon 835 — would be receiving support for PC VR when hooked to a compliant PC via a USB cable. The Oculus Quest can now be connected to a PC with a USB 3 Type-C cable, provided that the PC also supports at least USB 3.0. You may need a specific cable for this if you haven’t purchased one already; I’ve ordered one to compare and contrast the Quest against the first-generation Oculus Rift I also own.

At Oculus Connect this year, Oculus announced that it would provide wired USB-C support for the Quest, which previously relied solely on its Snapdragon 835 to provide the horsepower for running VR games. The Oculus Quest isn’t supposed to be as good a VR experience as the Rift S that Oculus also sells, so desktop owners supposedly won’t have anything to worry about as far as buyer’s remorse relative to the Quest. There are also some limits on Quest support during the beta period for the Oculus Link service.

The following screenshot is from the Oculus Quest site. The company recommends a Core i5-4590 or AMD Ryzen 5 1500X, with 8GB of RAM, Windows 10, and at least one USB 3.0 port.

Oculus Quest support matrix

The lack of official support for AMD GPUs means compatibility “isn’t guaranteed,” but it’s not clear if that means support flatly won’t work, or if the experience simply isn’t optimized. Given that “non-optimized” could be code for “laggy vomit simulator with uncomfortably realistic performance of core gaming functions following unexpected simulation termination,” we recommend being cautious when experimenting with unsupported GPUs. Oculus isn’t launching its formal, official Oculus Link Headset Cable yet — the $80 cable is supposed to provide a 5-meter length and a right-angle connector to minimize cord dangling. With that said, you’ll want to be careful when using the Quest like this — I’ve tried using my Quest in such fashion, and even with a right-angle cable for power, it’s easy to snag the cable when swinging the handheld controllers.

Artist’s depiction. Colored light trails sold separately. Living room decor will not transform to match image. Actual player not guaranteed to look this cool.

According to Oculus, Quest software sales have been much stronger than the Rift ecosystem’s, and providing this kind of value-added capability to the Quest could help the headset gain more market share. The overall user experience on Rift products has been fairly good, and it’s unlikely Oculus would enable this capability at all if it couldn’t deliver a suitable experience, but cable position may mean the Quest is a better fit for sit-down VR when tethered and for movement-based VR when in wireless mode. We’ll see how it compares with the Rift over a tether. Assuming Quest’s higher software sales ($20M in four months, versus $80M earned from the Rift in three years) translate into strong hardware shipments, Oculus could grow the footprint of “PC VR” significantly with a move like this.
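
To put those software revenue figures on a comparable footing, here are the monthly run rates they imply. This is a rough calculation from the numbers quoted above, nothing more:

```python
# Monthly software revenue run rates implied by the figures quoted above.
quest_monthly = 20_000_000 / 4     # $20M over four months
rift_monthly = 80_000_000 / 36     # $80M over roughly three years (36 months)

print(f"Quest: ~${quest_monthly / 1e6:.1f}M/month vs. Rift: ~${rift_monthly / 1e6:.1f}M/month")
# Roughly $5.0M/month versus $2.2M/month, which is what "much stronger" means here.
```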

We’ve talked before about whether PC gamers would invest in VR to drive the platform’s growth. It will be interesting to see if the Quest produces any sign of the reverse. In theory, a good enough experience on the standalone Quest or with another VR product that’s capable of PC tethering or standalone play could convince gamers to upgrade their PCs in order to play the larger and more complex VR titles that require the platform. In theory — given a whole lot of luck and some killer apps — virtual reality could one day serve to onboard people to PC gaming as well as the other way around.

Will it happen? I’m not taking bets. But it’s not the craziest idea. Upload VR has details on getting your Quest and PC software updated to support the experience.

Now Read:

November 18th 2019, 4:53 pm

This Simple Animation Platform Is Used By NASA, Google, & More

ExtremeTech

Animation makes everything more exciting — from Hollywood movies to marketing and design briefs. Of course, not just anybody can be an animator; it takes artistic talent and years of training. Or it takes a subscription to Animatron Studio.

Animatron Studio is used by the likes of NASA, Google, Facebook, Amazon, and Disney to create beautiful animated videos without extreme technical knowledge. This platform uses a WYSIWYG editor to make animation design simple, letting you create explainer videos, design HTML5 banners, make stunning presentations, and much more without ever drawing a line.

The library features thousands of free, pre-animated characters, backgrounds, and props to choose from, and it also lets you import, place, and edit your own graphics, photos, audio, and video content seamlessly. Everything is completely customizable and easily exported to HTML5, SVG, GIF, or video formats for universal playback. From mobile-friendly, entertaining ads to children’s educational content, there’s virtually nothing you can’t do with Animatron.

A one-year subscription to Animatron Studio’s Pro Plan typically retails for $360, but you can get one today for just $29.99.

November 18th 2019, 4:10 pm

Review: Google Stadia Might Be the Future of Gaming, but We’re Not There Yet

ExtremeTech

One company or another has been trying to sell consumers on cloud gaming for the last few years, but no one has made it stick. Not only do you need the clout to get game publishers on board, but you also need the network infrastructure to make the experience reliable for people all over the world. Google might have all the pieces in place to make Stadia work, but that doesn’t necessarily mean you should start buying your games on Stadia right now. You need a lot of bandwidth for the best Stadia experience, and hardware support will be skimpy at launch. Stadia has potential, but that’s what we’ve been saying about game streaming for nigh on a decade at this point. 

Getting Started

Eventually, Stadia will work on a variety of devices, including most smartphones and Chromecasts. However, the launch support is limited to just Google’s Pixel phones, Chrome browsers, and the Chromecast Ultra — specifically, the Chromecast Ultra that comes with Stadia. Google won’t roll out the Stadia update to other Chromecast units until a later date. Additional phone support is also coming at some vague future date.

You need the Stadia app on your phone to get started, even if you don’t plan to play games on the phone. From there, you configure the controller, which connects directly to the internet rather than going through your streaming device. You can also “pair” the controller with the Chromecast Ultra so it can launch Stadia with a button press. This is a clunky experience, though. It took me about five tries to get the devices linked. You should be able to link a controller simply by inputting a series of button presses shown on your Chromecast’s ambient screen. 

Stadia is tied to your Google account, and you can’t move your library to another login. So, make sure you choose your preferred account before you purchase anything. Google has connected Stadia to the same back-end as Google Play, so you earn Play Points when you buy games. Once you’re set up and buy some games (Destiny 2 and Samurai Shodown come with the Founder’s bundle), you can launch them from the mobile app or the TV by selecting a screen. Your controller should automatically connect to the service over Wi-Fi, so you don’t need to fumble with Bluetooth or additional wireless dongles.

Playing Games

I’ve tried most of the major game streaming services that have popped up over the years, and Stadia is the first one that sufficiently replicates a local gaming experience. Destiny 2 runs smoothly at 4k60 on Stadia with HDR enabled, which is truly impressive — my gaming desktop PC would probably struggle to do that. It’s easy to forget the game is being rendered on a distant server and streamed to your screen. I haven’t noticed any lag at all. 

There are a few clear advantages to having games render in the cloud. For example, Stadia keeps your game instances running for several minutes after you close the app. So, you can start a game on your TV or phone, and then switch seamlessly to the other and continue playing where you left off. The games are, of course, vastly more advanced than anything you could play on a low-power device like a phone or Chromebook, and the experience is almost identical to what you’d get on a powerful gaming PC. 

You need a USB-C cable to play Stadia games on a phone right now, which is very silly.

In its prettiest form, Stadia will absolutely devour your bandwidth. Google recommends 35Mbps down for 4k60 HDR gameplay, and that’s very close to what I’m seeing on my end. If you’ve got a monthly data cap, Stadia could blow right through it. After all, you are streaming 4K video every second you spend in a game even if it’s paused, but you can turn down the quality in the Stadia settings. 
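
For a rough sense of what that recommendation means for a data cap, here is a quick back-of-the-envelope sketch in Python. The 35Mbps figure is Google’s recommendation quoted above; the 1TB monthly cap is just a common example used for illustration, not anything Stadia-specific.

```python
# Rough Stadia data-usage math, assuming a steady 35Mbps stream.
# The 1TB monthly cap is a hypothetical example, not a Stadia requirement.
STREAM_MBPS = 35   # Google's recommended downstream for 4K60 HDR
CAP_GB = 1000      # example monthly data cap, in gigabytes

gb_per_hour = STREAM_MBPS / 8 * 3600 / 1000   # megabits/s -> megabytes/s -> GB/hour
hours_to_cap = CAP_GB / gb_per_hour

print(f"~{gb_per_hour:.1f} GB per hour of 4K gameplay")          # ~15.8 GB
print(f"~{hours_to_cap:.0f} hours of streaming to hit the cap")  # ~63 hours
```

At that rate, a couple of hours a night is enough to chew through a 1TB cap before the month is out.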

Mortal Kombat 11 on Stadia.

The Stadia controller is easily the equal of the Xbox One or PS4 versions. The thumbsticks have good resistance and smoothness, and the buttons are tactile but not too loud. I also enjoy the convenient screen capture button next to the main button cluster. I personally prefer the staggered thumbstick layout of the Xbox controller, but PS4 gamers will adapt to the symmetrical sticks on the Stadia controller easily. One strange foible: the controller doesn’t work wirelessly with a phone yet. You have to plug in a USB-C cable. 

Is Stadia Right for You?

If you’re reading this, the odds are you won’t want Stadia — at least not yet. The selection of games is limited, and the prices are usually higher than what you’d get buying from Steam or Amazon. It also requires a fast internet connection. The highest-quality streaming requires a Stadia Pro subscription, as well. 

Stadia performed extremely well for me during testing. The experience I’ve had with Stadia might not match what everyone gets after launch — there were very few people on the servers during the review period, so performance might degrade when the floodgates open. That said, Google is one of the few companies with the infrastructure to support thousands of cloud gaming connections. 

Kine is a quirky puzzle game available on Stadia at launch.

If Stadia can avoid growing pains as more players come online, I can safely say it will provide the best cloud gaming experience available. Once additional device support rolls out, Stadia could be an ideal way to play your games wherever you happen to be. Playing Mortal Kombat 11 on your phone is pretty undeniably cool. I could see Stadia being great for frequent travelers and those who don’t want to deal with maintaining a gaming PC. However, it won’t make you want to dump your console or gaming PC if you’ve already invested in the hardware.

Now read:

November 18th 2019, 4:10 pm

Ford Mach-E: It’s a Mustang, an EV (Yes), an SUV (Gasp), Very Quick, and the Future of Ford

ExtremeTech

LOS ANGELES — The 2021 Mustang Mach-E announced here Sunday night in Elon Musk’s backyard is a compact crossover that runs on electricity, for as much as 300 miles. There is no gasoline-engine option. More than that, Mach-E is Ford’s bet on the future shifting away from combustion engines, toward electrification, and toward driver assistance that becomes self-driving if the driver wants it and the technology is ready.

Mach-E is also the answer to the question of what Jim Hackett has been doing at Ford since taking over as CEO 30 months ago. Turns out he was rebuilding the company and chose to use the most beloved Ford name of the past half-century to be Ford’s stalking horse for electrification. It worked: There would not have been 500 journalists and fans in attendance if this was the 2021 Ford Escape Electric. Which is what this new vehicle might have been called before Hackett found Ford’s EV plans to be timid.

If this electrification push sounds like Ford is going all California and sucking up to environmentalists, relax. There will be plenty of combustion engine cars for decades to come, including traditional 2+2 Mustang sport coupes. Just not forever. And be warned that an electrified Ford F-150 is in the offing. Hashtag OMG.

What Exactly Is the Mach-E?

Here’s the nitty-gritty on the Ford Mustang Mach-E. It is a compact SUV able to fit four or five adults fairly comfortably in both rows of seats. It bears some Mustang styling cues from some angles. It. Does. Not. Replace. The. Mustang. 2+2. Sports. Car.

It is in prototype form now, solid and numerous enough to allow journalists to take part in test drives, but not numerous enough to let them drive on the test drives. The first versions arrive in fall 2020, and all five variants will be available by spring 2021. They will cost $45,000 to $65,000 plus options, every Mach-E will have significant driver assists, and some higher trims will have Level 2 self-driving on par with Nissan ProPilot Assist, possibly even on par with Cadillac Super Cruise, the industry standard.

There are two battery sizes initially, driving ranges of 200-300 miles, rear or all-wheel-drive, and a performance edition that will blow the socks off the vast majority of the 10 million Mustangs built since 1964. None of which will make traditionalist Mustang fanboys any happier — it’s a freakin’ SUV, many will say (Ford agrees) — but at least they can commiserate with the Corvette fans appalled to see their pride and joy will now be a mid-engine sports car. As if that’s a bad thing.

On a brief ride around Hawthorne Airport, the prototype rode well on city streets. There was an occasional harsh jolt when the Mach-E traversed a piece of less-than-smooth California highway. Ford probably should offer an adaptive suspension but will not. Inside the snug airport on a closed-off taxiway, the Mach-E charged through a slalom course with very little body roll, and a 0-60 mph run that wasn’t timed suggests Ford will be able to achieve sub-4-second results with the more powerful versions.

It is reasonably roomy in the back seat, about what you’d expect from the current generation of compact SUVs that don’t cramp second-row passengers. Here’s how the Mach-E compares. While a vehicle’s size class is usually based on cubic feet of passenger and luggage space, a vehicle that’s 180-190 inches long is generally seen as a compact vehicle.

Ford Mustang Mach-E: 186″ long (117″ wheelbase) x 74″ wide x 63″ high, 32.5 cu. ft. cargo space

Ford Mustang: 188″ L (107″ WB) x 75″ W x 55″ H, 13.5 cu. ft. trunk

Ford Escape: 181″ L (107″ WB) x 74″ W x 66″ H, 33.5 cu. ft. cargo

In 2003, the tears of Porsche traditionalists fell heavily and stained their tasseled loafers when the Porsche Cayenne landed here. SUVs are now two-thirds of Porsche’s US sales (sales shown for 2019’s first three quarters).

Will an Electric Mustang Ruin the Pony Car Mystique? (No)

Can Mustang fans gain solace from fans of competing sporting marques? Others have suffered imagined slights and survived: Porschephiles feared Armageddon in 2003 when the Cayenne SUV joined the family of Porsche sports cars. The Cayenne and Macan, the compact SUV that arrived in 2015, rank 1-2 in Porsche sales in the US while the cars run 3-4-5 (chart above). Without SUVs Porsche might be little more than an R&D company for other automakers.

At BMW, SUVs are 47 percent of sales through the first three quarters of 2019, up from 36 percent the year before, as sedan sales shrink in the US. And Chevrolet Corvette fans have their own crisis, that of the engine being moved from in front to midships, just behind the driver, for the C8 Corvette due early 2020.

No, none of these comparison cars are battery electric vehicles. But as those sporty brands pondered the future — in this case embracing SUVs and crossovers as legitimate upscale/sport vehicles — they decided the worst change may be the one where you stand still and watch the world sail past you.

Electrification won’t make a Mustang worse. It makes the Mustang faster. “We’re looking at mid-three-second 0-60 [mph] times,” said executive chairman Bill Ford Jr., speaking of the Mach-E Mustang, which looks like a Mustang from some angles and also bears some resemblance to a BMW X4/X6 or Mercedes-Benz GLC coupe/SUV. By the way, there could in the future be a traditional Mustang 2+2 coupe powered by electricity.

As for performance EVs, veteran journalists who’ve driven early Porsche Taycans, the $150,000 EV sports car, generally say it is the quickest, best-handling Porsche. Ever.

Originally, Ford planned to build a front-drive “compliance car.” Then-new CEO Jim Hackett pushed Ford to be bolder. Black lines were the original design. Red lines are the Mach-E: lower and longer hood, sleeker windshield, longer wheelbase. And plenty of room for people and cargo.

From Compliance Vehicle to Exciting EV

If this was a Ford Escape EV intro, The New York Times wouldn’t have put the story on Page 1 Monday.

Making this vehicle a Mustang is a stroke of genius by Ford management. (For those who disagree: that’s why we have the Disqus comments button below.) What became the Mustang Mach-E started out as, essentially, an electric Ford Escape, a “compliance vehicle” to keep Ford on the good side of environmentalists and government. Pretty exciting, huh? CEO Hackett early in his tenure reportedly decided an Escape EV wasn’t inspiring enough and pushed the company to shift direction well into the design process and create a sleeker battery-electric crossover/SUV that would carry the Mustang name. The Escape, for its part, has three versions in its new fourth generation: a gasoline combustion engine, a hybrid, and (for 2020) a plug-in hybrid with up to 30 miles of battery-electric range.

To get the car to market in 2020, engineers and designers on the project (called Team Edison) did more computer modeling and skipped some clay models (time-consuming and costly) until later in the design phase. To emulate the volume knob that would be bonded to the bottom of the 15.5-inch center stack display, they used pieces of cardboard and a Keurig single-serve coffee cup for an early mockup. (Ergonomics warning: The actual volume knob is just as slippery to the touch. Too many automakers have given up on grippy rubber-covered knobs.) From early sketches to clay models to corporate approval to ordering the production tooling, the period was about a year.

Ford’s research turned up buyer confusion: Many believed electric vehicles still need gasoline, a misunderstanding rooted in uncertainty over what separates a battery electric vehicle from a plug-in hybrid or a conventional hybrid. They doubted how well an EV would perform in snow or cold weather. They suspected automakers’ hearts weren’t 100 percent behind EVs and believed some efforts existed only to comply with rules and statutes. They feared being stranded, not knowing how to find charging stations, and waiting a long time to recharge the batteries. All of that led Ford to conclude it needed to work with a loved nameplate, which meant Mustang or F-150, and Mustang is what was chosen.

Did it work? Buyers will decide in a year. Meanwhile, the Mach-E intro as a marketing event was a smashing success.

Ford opened its configuration/order site on Sunday. A $500 deposit, refundable, holds your place in line.

The cockpit of the 2021 Mustang Mach-E.

Trim Lines and Features

These are the five trim lines, or model variants. Batteries are 75.7 kWh standard range (SR) or 98.8 kWh extended range (ER). All cars have a big electric motor in back; all-wheel-drive Mach-Es have a second, smaller motor up front. Power ranges from 255 to 459 hp, torque from 306 to 612 pound-feet. The range runs from 210 to 300 miles. Peak horsepower, torque, and acceleration all depend on whether you have rear- or all-wheel-drive, and even on the battery size. The bigger battery, with 12 rather than 10 battery modules, can deliver more peak power to the motors. Most trim lines will do 0-60 mph in the mid-5-second range with all-wheel-drive, about a second more with rear-drive only. The GT targets mid-3-second acceleration. Ford’s pricing notes potential tax savings, including the federal $7,500 tax credit, separately, whereas Tesla’s large-type pricing already deducts it.
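
As a rough sanity check on those figures, here is a quick efficiency estimate in Python. Pairing the standard-range battery with the 210-mile figure and the extended-range battery with the 300-mile figure is an assumption made for illustration, not Ford’s official spec sheet.

```python
# Back-of-the-envelope Mach-E efficiency from the numbers quoted above.
# The battery-to-range pairing is assumed for illustration only.
configs = {
    "75.7 kWh standard range": (75.7, 210),   # (kWh, claimed miles)
    "98.8 kWh extended range": (98.8, 300),
}

for name, (kwh, miles) in configs.items():
    print(f"{name}: ~{miles / kwh:.1f} miles per kWh")   # roughly 2.8 and 3.0
```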

Mach-E Select, $44,995 (including $1,100 shipping), due early 2021, with rear or all-wheel drive. DC fast charging is up to 115kW; the others charge at up to 150kW.

Mach-E Premium, $51,700, late 2020, and rear or all-wheel drive.

Mach-E First Edition, $61,000, all-wheel-drive only, with a limited edition Mach-E delivered in late 2020.

California Rt. 1 Edition, $53,500, early 2021, the extended-range battery only, rear-drive only, and equipped for a range of up to 300 miles.

Mach-E GT, $61,600, spring 2021 (the last Mach-E to ship), the highest performance Mach-E, all-wheel-drive only, the extended battery only, and capable of 0-60 mph acceleration of around 3.5 seconds. Information on packages will be released later. This is the one Mach-E that has acceleration on par with the higher-end combustion-engine Mustangs.

Every Mach-E comes with FordPass Connect (onboard telematics), Phone as a Key (PaaK), 15.5-inch touchscreen, a 10.2-inch LCD instrument panel, and Sync 4.0 infotainment. The cars will also have three personas — “whisper,” “engage” and “unbridled” — that use fewer or more graphics and bright colors, depending on the driver’s mood or driving style.

Evasive steering assist adds steering effort to the driver’s inputs in situations where sensors detect a possible collision that can be avoided by turning the wheel.

Solid Driver Assists, Level 2 Self-Driving

The Mach-E offers significant safety features in three packages under the Ford Co-Pilot360 name, which a) gives Ford a single distinguishing term for its advanced safety and driver-assist features and b) confuses the heck out of buyers because it all sounds alike. Here’s our attempt to make sense of it all. Basically, you get everything on the middle three trim lines (Premium, First Edition, and Route 1), and you don’t get the surround camera/parking assist Co-Pilot package on the cheapest (Select) or the sportiest/most costly (GT). The confusion aside, Ford provides an exceptional level of safety features standard.

Ford Co-Pilot360 2.0 is the basics: pre-collision assist with automatic emergency braking, pedestrian detection and braking, blind spot detection (Ford terminology: blind spot information system) with rear cross-traffic alert, lane departure warning, auto high beams, post-collision braking, reverse sensing (rear sonar), reverse brake assist, and rear-view camera (mandated by the feds).

Co-Pilot360 Assist 2.0 adds driver assists: Intelligent Adaptive Cruise Control that is full-range (stop-and-go) lane centering (that is, beyond lane departure warning and beyond lane keep assist), speed sign recognition, and evasive steering assist.

Ford Co-Pilot360 Technology has Active Park Assist 2.0 that finds and parallel parks (also backs into perpendicular-parking spaces) and a 360-degree camera system.

As self-driving software gets better, Ford says it may be possible to do over-the-air updates that give the car additional degrees of autonomy. Where the various Co-Pilots combine to keep the car centered in the lane and a fixed distance behind the vehicle ahead, as long as the driver’s hands are lightly on the wheel, more autonomy might be possible. The Mach-E already integrates an eye tracker that tells if the driver is watching the road. With that hardware in place, the driver might no longer need to keep hands on.

Ford includes a simple charging cable for 120/240 volts and will sell you a 48-amp / 240-volt Level 2 charger (photo) that Amazon will install.

Charging Options, Help From Your Phone

Ford put a lot of effort into making charging as effortless as possible, short of robotic arms that plug the charger in for you. The car comes with a Ford Mobile Charger for 120 and 240 volts that adds 3 or 22 miles of charge per hour, respectively. Ford also cut a deal with Amazon to supply and install a 240-volt Connected Charge Station that delivers up to 32 miles of charge per hour; this is the one you want for an overnight refill of the near-empty 98.8-kWh, 376-cell extended-range battery. Ford expects the majority of charging will be done at home.
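
Some rough math on those charging rates, a sketch assuming the figures above and a simple linear charge curve (real packs slow down near full, so treat these as optimistic estimates):

```python
# Rough refill times for a near-empty extended-range battery (300-mile figure),
# using the miles-of-range-per-hour rates quoted above and assuming a linear rate.
MILES_PER_HOUR = {
    "120V mobile charger": 3,
    "240V mobile charger": 22,
    "Connected Charge Station": 32,
}
RANGE_MILES = 300

for charger, rate in MILES_PER_HOUR.items():
    print(f"{charger}: ~{RANGE_MILES / rate:.0f} hours")   # ~100, ~14, ~9 hours
```

That roughly nine-hour figure for the Connected Charge Station is what makes the overnight-refill claim plausible.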

Additionally, a Ford app for your phone called Power My Trip shows how far you can go with the juice you have left, finds charging stations and charger types, and for long trips, maps out stopping points optimized so you have enough power to get to the next recommended station with power to spare. Then on the final leg home, it can recommend how much (how little) charge you need. Unless you have free charging at work, the cheapest charging is at home.

Ford also has set up what it calls the world’s largest charging network. Ford isn’t building stations the way that Tesla did. Rather, it cut deals with many of the largest charging networks, including Electrify America, to find available chargers and then set up a single billing process.

Bill Ford Jr., the company’s longtime champion of the environment: Electrification has come to the point that cars can be green and fun to drive.

Mach-E Puts Ford in a Good Place

55 years of Mustang. Now electrified.

Ford spent the last couple of years getting out of the sedan business, which may or may not be a mistake; millennials may be turning to sedans, if only because you do the opposite of what your SUV- or minivan-loving parents did. For now, though, Ford is looking good. The F-Series pickup is closing in on 1 million sales a year. Much of its SUV line has been recently refreshed. Sync no longer sucks. The Lincoln brand is finding its way with a focus on luxury rather than trying to outdo the Europeans on performance; while Lincoln is soaring, Cadillac is having challenges finding a formula that attracts buyers.

Support for change comes from both the CEO, Hackett, and the executive chairman, Bill Ford Jr., great-grandson of Henry, and the company’s longstanding supporter of cleaner cars and a cleaner environment. Speaking at a Detroit business event earlier this year, Bill Ford said, “When we first started talking about electrification, there was this thought that there had to be a trade-off. It was either going to be green and boring and no fun, or really exciting but burn a lot of fossil fuels. Electrification has come to the point that you can do both.”

The Mustang Mach-E may or may not win over long-time Mustang partisans. They’ll still have combustion-engine Mustang coupes for years to come. Now they’ll also have the option of a far roomier Mustang that carries five people and a lot of luggage, if they want another Mustang in the garage. Meanwhile, by linking electrification to one of its two most iconic models, Ford is changing its perception to that of a company that wants to sell electric cars that are fun to drive, roomy, and high performance.

Now read:

November 18th 2019, 1:37 pm

iFixit: Apple Finally Ditched Butterfly Switches in the New 16-inch MacBook Pro

ExtremeTech

Apple has long positioned itself as a seller of premium electronics, and it commands correspondingly high prices for its phones, laptops, and desktops. The MacBook Pro line has won fans, but many of those fans began questioning their allegiance to the brand a few years ago when Apple released laptops with the “butterfly” keyboard mechanism. After years of selling fragile, unpleasant keyboards, Apple has finally changed course with the new 16-inch MacBook Pro. iFixit has a teardown of the new MacBook keyboard that shows a completely different mechanism, one that should (we hope) alleviate the issues that have plagued Apple’s computers in recent years.

Rumors began percolating several months ago that Apple was finally ready to throw in the towel on the butterfly switch, which has caused the company no end of trouble since its introduction. The idea behind the butterfly switch was to retain a tactile typing experience while reducing travel, and thus making keyboards thinner. It succeeded at those things, but the travel was so low that a speck of dust could render a key inoperable if it got under the keycap.

Apple machines being Apple machines, the repair for a damaged butterfly switch was to completely remove the top of the case and replace the keyboard. That was a pricey repair, so Apple started offering an extended warranty on MacBook keyboards. Apple attempted to fix the butterfly switch with a membrane in the 2018 revision that could repel some debris. However, users still had issues because it only takes one bit of dust to break the keyboard. 

With the new 16-inch MacBook Pro, Apple has moved to a more traditional scissor mechanism. In fact, the keycaps from the old Magic Keyboard fit on the new MacBook Pro. The new switch consists of two plastic pieces that pivot in the middle. The keys have about 0.5mm more travel, which is significant for a keyboard. The keycaps themselves are 0.2mm thicker this time around. 

The new MacBook Pro hasn’t been around long enough for us to know for certain, but it looks like this will be the first MacBook Pro in years that won’t be taken out by a speck of dust. You will, of course, pay handsomely for the opportunity to get a MacBook with a functional keyboard. The 16-inch MacBook Pro starts at $2,400 with 16GB of RAM, 512GB of storage, and a 9th-gen Core i7 CPU.

Now read:

November 18th 2019, 11:07 am

Video Game Addiction in the World of Warcraft

ExtremeTech

Video game addiction is definitionally a controversial subject, given that there’s disagreement over whether such a thing even exists. If we separate the clinical concept from the colloquial usage of the term, we’re more likely to be able to come to general agreement. Everyone has known someone (assuming you haven’t been the someone) who, at one point or another, spent way too much time buried in a game and way too little engaged with the world around them.

Most recently, discussions on video game addiction and inappropriate player retention techniques have focused on issues like the use of microtransactions and loot crates. Players have a variety of concerns surrounding these issues, including the use of gambling mechanics to generate revenue and increase customer engagement. The question of whether MMORPGs are overly addictive is the sort of topic that was being debated more widely about a decade ago.

It felt nearly retro to see the headline “World of Warcraft Changed Video Games and Wrecked Lives” go by at Vice. It’s a topic with some personal resonance for me. The article describes what happened to several people who describe themselves or their loved ones as World of Warcraft addicts who played and engaged with the game to a much greater degree than was healthy.

Why Are MMOs Easy to Get ‘Stuck’ In?

Multiple people Vice spoke to identified World of Warcraft as offering a supportive community for various identity issues or life struggles they were going through at the time, even if they often felt that their own relationship with the game had fundamentally been an unhappy one. This dovetails with my own thinking. While I never allowed World of Warcraft to take over my own life, I played a great deal of the game during some tumultuous and difficult years. I participated in the PvP grind that the Vice article discusses and wear my “Commander” tag to this day. I saw people become colloquially “addicted” to WoW, in the sense that WoW became central to their lives. It’s not that everybody quits their job and becomes a full-time player, so much as being able to count on people to show up around 6 PM and hang around until 10-11 PM, 5-7 nights a week.

Vice’s article hints at part of the reason why this happens: community. Players in WoW self-sort themselves into guilds for the purposes of raiding endgame dungeons and (more rarely) for PvP. It’s not uncommon, at this point, for long-time WoW players to have real-world friendships that have transcended the game. While I am not in regular contact with the vast majority of people I played WoW with, I remain friends with a double-digit group of people that I met solely as a result of our mutual travels through Azeroth.

But WoW didn’t just offer a community. It offered a chance to succeed publicly, to be recognized for that achievement, and to feel as though you were making a positive contribution to something larger than yourself. Leading a group of 25-40 people through a series of choreographed fights while they variously alt-tab, argue, bio break, check Thottbot, check YouTube, get distracted, make food, kill random trash, and occasionally kill bosses felt like an achievement at the end of the night, especially if you’d refrained from throttling the guild leader after he speculated, in the middle of a raid, that we should just give all the caster DPS loot to mages by default.

It wasn’t always this bad. It just often felt that way. Image by BlizzPro.

Most of the addiction stories that Vice recounts are from earlier in WoW’s history, when the game required a significantly higher time commitment than it does today. That’s one reason why the second part of my WoW leveling comparison hasn’t appeared yet. It’s not that I haven’t logged time in-game; it’s that the amount of time required to level goes up substantially in Classic, while Retail remains a comparative sprint. One of the reasons people used to log more time in WoW is that WoW used to require it in order for you to be truly successful. In the Classic era, you couldn’t make enough gold from raiding to support the cost of raiding in an endgame progression guild — you had to play the game on top of that.

It’s this combination that I think made WoW (and MMOs in general) “addictive” in ways that a single-player RPG like Skyrim or Dragon Age: Origins really isn’t. First, you’re interacting and socializing with a group of friends you’ve generally chosen to play with. Second, you’re actually working at something that requires some dedication and commitment to succeed. You didn’t have to be a great player to wind up in endgame gear, but a good progression guild had standards and demanded that people show up on-time with buffs on and ready to play. Maybe your clever use of Divine Protection kept the main healer alive when a fear caught the main tank off-guard. Maybe you knew how to stance dance to maximize rage generation. Maybe you were the Druid with a fast-fingered battle rez or the hunter who could always be relied on to handle an add or take Drak for a walk. Maybe you were the rogue tank who made snarky comments about how well you could hang with plate wearers and then wound up splattered down the side of Blackrock Mountain.

(I never said all these comparisons were going to be complimentary).

Regardless of the role you played, being good at WoW offered social interaction and validation in a way that a single-player game doesn’t. It was difficult enough to “feel” like work in some psychologically important ways, without being so hard as to represent a challenge on par with the pits and snares of everyday interaction.

Should Blizzard Have Made WoW Less Addictive?

It would be a mistake to pretend people weren’t asking this question 10-14 years ago. Everquest had already been nicknamed “Evercrack” before WoW hit the scene. Nor do I recall Blizzard taking enormous pains to help people disconnect from WoW, though there were a few hint messages that would pop up in-game from time to time reminding players to take a break and only enjoy the game in moderation. It’s worth asking, yes, if Blizzard could have done more than it did. But it’s also worth remembering that MMOs were a lot newer than they are now, with smaller playerbases. World of Warcraft was the game that popularized the MMO genre like no other title ever had.

It’s easy to forget now, but in 2004, Blizzard was the new kid trying to break into the market. I still remember reading head-to-head comparisons of WoW versus Everquest 2, some of which predicted that WoW would be the game to fall by the wayside as Everquest 2 took over EQ1’s built-in player base. The other facet of the conversation that’s easy to lose is that World of Warcraft was hailed at launch for requiring less grinding and being more accessible to individuals than any MMO had been before. Blizzard set out to make a game that was easier for people to play, with fewer frustrating roadblocks and readily accessible fun.

Vice brings up the WoW PvP grind as an example of a place where the game took a catastrophic wrong turn, and I’ll tell you, they aren’t wrong about that. The PvP of original WoW was an extreme grind. Unhealthily so. But it’s also a system that Blizzard modified even before The Burning Crusade launched, before dumping it altogether. This aspect of the game was gone, never to return, by 2007. Classic WoW might recreate that system, but it does so at the explicit request of the player base.

The problem with declaring that Blizzard should have made WoW less addictive in the run-up to launch in 2004 is that it assumes game designers had a perfect understanding of where that line is, or that it’s easy to parse such information from the mountains of feedback that millions of first-time MMO players were churning out. Blizzard was attempting to balance the expectations of players who wanted a “hardcore” experience against those of “casual” players who wanted the game to be more accessible. The other term for casual players was “filthy casuals,” which gives you some idea how well these two groups of people got along with each other. From its launch, however, WoW moved consistently in one direction — towards making it easier for people to play for smaller amounts of time.

I think it’s much fairer to criticize WoW and Blizzard for the degree to which they acknowledged that some WoW players did have problems with addiction as the game moved forward, or even to argue that the game could have included features intended to check on players who were playing too much as time went on. At the same time, virtually every change Blizzard has made over the past 15 years has been aimed at making it faster to play WoW. The PvP grind? Gone for well over a decade. Reputation grinds? Much faster. Unlocking major abilities like flying and mounts? Happens much quicker in the game. You don’t even need to visit trainers to learn skills any longer. The game has been categorically overhauled to make it faster and easier to play, and these changes haven’t all been dumped in at one point — the game began evolving in this direction in 2004 and it hasn’t really stopped at any point.

Ultimately, I found the Vice piece somewhat frustrating — not because I doubt that WoW had a negative impact on people’s lives, but because it doesn’t really engage with the fact that World of Warcraft was, ultimately, a product of its time and designed the way it was in part to meet the demands of its own player base. It doesn’t engage with the fact that much was learned in the gaming industry as a result of World of Warcraft or that the game as it exists today is, in a very meaningful sense, not the title that it was in 2004. It doesn’t really engage with the difficult question of how to help people who get hooked into video games (or any other form of entertainment). It doesn’t touch on the fact that compared with the modern era’s loot crates and microtransactions, the idea that people would spend insane amounts of time grinding Dark Iron rep or killing mobs in PvP purely for in-game rewards comes off as rather quaint and self-assured.

That’s one thing about WoW Classic that I intended to put into its own article but that I’ll pull out for this one. WoW Classic is confident in its willingness to ask you to spend time doing things. It doesn’t ask you to buy microtransactions to speed things along. It doesn’t advertise the ability to purchase six Onyxia Death Tokens to get more chances at her loot drops on your next kill. It just… takes a while. Far from trying to climb in your pockets and rifle them for spare change, WoW Classic says, “This is going to take a while. Let’s have some fun along the way.” World of Warcraft Classic is striking to play partly because it exists in an era before game developers treated player time investment as a monetizable commodity.

There are more nuanced ways to explore the question of why people are drawn into MMO worlds than Vice has engaged in here — questions that range beyond the mechanics of the game and explore the social aspects that draw people together. Some of the people Vice spoke to clearly touched on these issues with discussions of identity and finding like-minded communities of players. The topic is more complicated than this treatment ultimately addressed.

Now Read: 

November 18th 2019, 8:50 am

At a Glance: Microsoft Xbox Elite Wireless Controller Series 2 Review

ExtremeTech

Microsoft’s Xbox Elite Controller quickly became a popular option among gamers when it first launched in 2015. These controllers have reportedly not held up well over the years, though. Microsoft is looking to improve on its original design with the new Xbox Elite Controller Series 2.

Design

The new Xbox Elite Controller Series 2 is remarkably similar to the original Xbox Elite Controller. The button layout is essentially the same as on any Xbox One controller, except on the underside of the controller where there’s some additional hardware.

The Xbox Elite Controller Series 2 features four curved programmable paddles. If you don’t want to use these, or you feel they are in the way, you can pull them off and toss them aside, but that would remove one of the controller’s major selling points.

Just like the original, the new Xbox Elite Controller Series 2 has detachable and customizable analog sticks and a removable directional pad.

The main feature enhancements inherent in the new Series 2 controller include improved rubberized grips and a built-in rechargeable battery that reportedly can last for up to 40 hours, according to Microsoft. Our sister site PCMag tested one of these controllers and did indicate the new grips felt like an improvement over the original.

Conclusion

Microsoft set the Series 2 controller’s MSRP at $179.99, which makes it $30 more expensive than its predecessor. As the new controller features better grips and a long-lasting built-in battery, the new Xbox Elite Controller Series 2 appears to be a superior option compared with the original. If I were in the market for a high-end controller, however, I’d probably wait a while longer to see how the new Series 2 controllers hold up before sinking $179.99 into one.

Now read:

November 18th 2019, 8:50 am

ExtremeTech Presents: Ask the Grumpy PC Technician

ExtremeTech

PCs are complicated. Sit down to troubleshoot one, and you’re signing up for an experience that runs the gamut between “Two minutes of Googling and I’m off to play Fortnite” and “Over the past 36 hours I’ve drunk 17 energy drinks, slept six minutes, mapped every single unlabeled UEFI option, and developed a niacin allergy.” Machines of the modern era may be more stable than their counterparts of yesteryear, but that doesn’t mean they don’t break.

It’s useful to have someone to talk to at a time like that. Someone who hears you. Someone who understands.

Unfortunately, you people are stuck with me.

Once upon a time, many pounds ago and when I still had hair, I owned my own PC service and repair business. My work as a reviewer and journalist for the past 18 years has always kept me around the latest and greatest from the likes of Intel and AMD, and I’ve done general troubleshooting for readers, co-workers, and friends on an ad hoc basis for my entire life.

This has convinced some of my friends and family that I like fixing computers. As a general rule, I do not like fixing computers half as much as I think I should like setting them on fire or blasting them out of a cannon, but the Powers That Be didn’t think AMD or Intel would be happy if I started benchmarking CPUs based on factors like stopping power, air resistance, or  armor penetration. So, fine. Since I do like helping people troubleshoot problems, I guess we’ll fix things instead. Plus, as an added bonus, I get to talk about things a little more conversationally than I typically would during a story.

We’ll try to avoid this sort of thing. ;)

Got a hardware problem you can’t solve, a question about what to buy, or a topic of interest to discuss? Ask me or put a comment below.

Right now, the goal is to gather responses up during the week and run a weekly Q&A, though we’ll see how things evolve. Troubleshooting hardware can take a few steps and some back-and-forth discussion, so we may have to play with the format a bit to see how we can best show that process. The goal is to create a useful resource for exploring issues and problems other folks may be having as well. Questions about hardware-adjacent topics are also welcome, though my attempts to troubleshoot your furnace may end in mutual tears and recrimination.

Why am I the grumpy PC technician? Because having to deal with broken PCs is intrinsically annoying. I’m much more a Windows user than Mac or Linux, but if you have a question about these platforms I will try to help you find an answer.

Questions? Comments? Drop me an email at GrumpyPCTechnician@Gmail.com or leave a comment here.

November 16th 2019, 10:04 am

ET Weekend Deals: New Echo Dot + Free TP-Link Smart Plug For $45, Bose SoundSport Just $34, Save on

ExtremeTech

Amazon’s pushing hard to grow its client base by offering discounts on several of its online services. You can also pick up one of Amazon’s new Echo Dot devices with a built-in clock bundled with a TP-Link smart plug and a large discount.

Amazon Echo Dot w/ Clock + TP-Link Simple Setup Smart Plug ($44.99)

This new Echo Dot from Amazon is essentially the same as the company’s 3rd gen Echo Dot, but it comes with a built-in LED clock that can also display the current temperature and work as a timer. This bundle also comes with a free smart plug from TP-Link, which can be used with the Echo Dot to turn a connected device on and off using Alexa voice commands. These items would typically cost $82.97, but as part of the bundle on sale, you can get them for just $44.99.

Featured Deals

Check out more deals from TechBargains.

Amazon Devices

More Amazon Device Deals here.

Apple Devices

More Apple Deals here.

Laptops

More Laptop Deals here.

Desktop Computers

More Desktop PC Deals here.

Monitors

More Monitor Deals here.

Networking, Storage and Components

More Networking, Storage and Component Deals here.

HDTVs & Home Entertainment

More TV Deals here.

Electronics

More Electronics & Tech Deals here.

Headphones, Speakers & Audio

More Headphone and Audio Deals here.

Tools & Home Improvement, Kitchen Gadgets, and more

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 15th 2019, 3:48 pm

Huawei’s Mate X Foldable Phone Launches at $2,400

ExtremeTech

Huawei continues to work on devices like foldable phones, even as its products are all but shut out of the US market.

The age of foldable phones is upon us, whether you like it or not. Just a few days after Motorola unveiled its revived foldable Razr, Huawei has finally launched its first foldable phone, the Mate X. However, it’s only on sale in China without Google apps. Any plans to launch the phone internationally would no doubt be hampered by the ongoing US export ban, which prevents Huawei from using Google apps. 

Huawei unveiled the Mate X at Mobile World Congress in February, just weeks after Samsung first showed off the Galaxy Fold. The two devices have similar use cases, but Huawei’s design uses a single outward folding screen. The Galaxy Fold has a large inward folding screen and a small external screen. The Mate X is a more attractive and elegant piece of hardware, but we have no idea if it’s durable. 

Samsung famously delayed its Galaxy Fold launch earlier this year following a spate of review unit failures. That’s around the same time Huawei paused its own foldable launch plans. Samsung spent months redesigning its device, and the final product still seems rather fragile. The Mate X has a 6.6-inch display on the front and a 6.38-inch display on the back. When open, they become a single 8-inch OLED. However, the screen is still plastic and is exposed to damage when folded, unlike the Galaxy Fold. 

Unlike the Galaxy Fold, the Mate X has one large screen that folds outward. It also lacks Google apps.

Huawei uses no US technology in the Mate X. It runs on a Kirin ARM chip designed by Huawei subsidiary HiSilicon, which is common for Huawei phones. The Mate X adds a brand new Balong 5000 5G modem, making it the first 5G foldable phone. The other specs include 8GB of RAM, 512GB of storage, and a 4,500mAh battery with 55W charging. 

Huawei is still on the US Commerce Department’s “Entity List,” which prevents it from licensing technology from US firms. Despite claims that the government would issue exemptions to some companies, Huawei is still cut off from the US. That means no Google apps on the Mate X, but it wouldn’t have those in China anyway, and that’s the only place you can buy the device so far. Huawei wants a whopping 16,999 yuan for the Mate X, which works out to around $2,400; some sites are reporting it has already sold out.
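
That dollar figure is just a conversion of the Chinese price. Here is the arithmetic as a quick sketch; the roughly 7-yuan-to-the-dollar exchange rate is an approximation of late-2019 levels, not an official figure.

```python
# Converting the Mate X's Chinese launch price to US dollars.
# The exchange rate is an assumed approximation of late-2019 levels.
PRICE_CNY = 16999
CNY_PER_USD = 7.0

print(f"~${PRICE_CNY / CNY_PER_USD:,.0f}")   # roughly $2,400
```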

Now read:

November 15th 2019, 3:48 pm

Save 88 Percent On A Lifetime Subscription To This Book Summary App

ExtremeTech

Everyone wants to be smarter, there’s no doubt about that, and reading is one of the best ways to broaden your knowledge. However, not everyone has the time to dedicate to reading stacks of books when they already have plenty going on in their lives. Thanks to Readitfor.me, you can access hundreds of summaries and save yourself a ton of time.

This plan lets you stay ahead of the curve with its massive library of over 300 book summaries. Roughly 100 summaries are added every year, so you’ll constantly have access to the latest and greatest. All of the summaries included in the membership were carefully chosen as some of the most valuable problem-solving reads out there. You’ll learn about management and professionalism, having tough conversations, problem-solving, and productivity, along with plenty of other topics. Your access lasts a lifetime, too, so you can come back to these summaries whenever you want a refresher.

The Readitfor.me Standard Plan Membership is on sale for 88 percent off, costing you only $99.99 for lifetime access to the best book summaries out there. Start building your wealth of knowledge with a membership and save time in the process.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 15th 2019, 3:05 pm

Newegg Tips Black Friday Deals on Gaming Laptops, Desktops, More

ExtremeTech

Online electronics retailer Newegg just released its Black Friday 2019 ad, which tips steep discounts on laptops, desktops, monitors, smart home devices, TVs, and more.

Newegg will be offering deals the whole week of Thanksgiving and Cyber Monday. Its Black Friday sale officially kicks off Monday, Nov. 25 at 12 a.m. PST. The company promised a fresh batch of deals on Wednesday, Nov. 27 and Black Friday, Nov. 29. Then, its Cyber Monday sale will begin Sunday, Dec. 1 and run for one week.

If you’re in the market for gaming gear, Newegg is already selling a 15.6-inch MSI gaming laptop (ninth-gen Intel Core i5, Nvidia GTX 1650, 512GB SSD) at the Black Friday price of $599 ($200 off after $100 instant savings and $100 mail-in rebate). The company also plans to offer the Skytech AMD Ryzen 5 2600 (AMD RX 580, 8GB of RAM, 500GB SSD) gaming desktop for $639.99 ($110 off) and a 24-inch Acer 144Hz 1ms FHD gaming monitor with built-in speakers for $149.99 ($50 off).

Newegg has plenty of other PC deals lined up, including the 15.6-inch Lenovo IdeaPad L340 laptop (AMD Ryzen 5 2nd Gen, 8GB of RAM, 256GB SSD) for $399.99 ($200 off) and the 11.6-inch Lenovo 100e Chromebook (2nd Gen, 4 GB of memory, 16 GB eMMC SSD) for $99.99 ($130 off).

If you’ve been eyeing the popular Samsung Frame QLED TV, you’ll be able to grab the 43-inch model for just $797.99 ($502 off) or the 65-inch for $1,597.99. Newegg will also have deals on PC components, accessories, networking gear, robot vacuums, headphones, speakers, unlocked phones, game consoles, printers, and more.

“This holiday shopping season we’re making sure to give our customers great opportunities to save on the gifts people want most,” Newegg’s President of Global Sales Anthony Chow said in a statement. “Extending this year’s sales event gives people the flexibility to shop when it’s most convenient, ensuring plenty of time to spend with family and friends over the Thanksgiving weekend.”

Here are some more of the best Black Friday deals you’ll find at Newegg this year:

For more, check out Newegg’s full Black Friday eFlyer.

This post originally appeared on PCMag.com.

November 15th 2019, 3:05 pm

Google Rolls Out Silent Chrome Experiment That Breaks Enterprise Setups

ExtremeTech

Google’s Chrome browser has always had a very quick update cycle, but there are multiple release channels if you don’t want to live on the cutting edge and deal with bugs. Google also lets IT administrators control updates on their networks. However, the browser maker ruffled some feathers this week when it rolled out an experimental feature that broke Chrome on many remote enterprise environments like Citrix. 

IT staff began noticing issues early this week when thousands of machines started showing blank white tabs in Chrome. Users were unable to access the browser at all, causing work in some offices to grind to a halt. It was unclear what caused the widespread issues at first as most IT administrators control when Chrome updates roll out inside their network. Google, however, took the liberty of enabling a new feature on a subset of machines, even those on managed networks. 

The culprit, it seems, is a feature called WebContents Occlusion. Google designed this feature to reduce resource usage when Chrome tabs aren’t visible. For example, if you move another window over top of Chrome, the browser can suspend your tabs and revive them as soon as they’re visible again. At least, that’s the way it’s supposed to work. 

According to Google Chrome engineer David Bienvenu, the WebContents Occlusion flag has been in beta for about five months. Google turned the feature on for about one percent of stable users for a month, and it didn’t get any complaints. It began pushing WebContents Occlusion to more devices on Tuesday morning, which kicked off the trouble. 

It seems that the feature doesn’t understand the way remote access clients like Citrix and Microsoft RDP render the browser. So, Chrome thinks there’s something on top of it and suspends tabs. People began calling this the “White Screen of Death” in a nod to the infamous blue screen of death that indicates a Windows crash. That’s a big problem for companies that have thousands of workers on Citrix who rely on the Chrome browser. 
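
To make that failure mode concrete, here is a toy sketch in Python. It is not Chrome’s actual implementation, and the types here are invented for illustration; the point is only that a heuristic built around “is another window covering me locally?” behaves sensibly on a desktop but misfires when a remote-access client is doing the rendering on the far end.

```python
# Toy illustration of an occlusion-style heuristic; NOT Chrome's real code.
# A remote session (Citrix/RDP-style) can look "covered" to a naive local check
# even though the user is actively viewing the browser on the remote end.
from dataclasses import dataclass

@dataclass
class BrowserWindow:
    looks_occluded_locally: bool   # what a naive desktop check sees
    rendered_remotely: bool        # Citrix/RDP-style session

def should_suspend_tabs(win: BrowserWindow) -> bool:
    # Naive rule: suspend whenever the window appears occluded on the local desktop.
    return win.looks_occluded_locally

local_chrome = BrowserWindow(looks_occluded_locally=False, rendered_remotely=False)
remote_chrome = BrowserWindow(looks_occluded_locally=True, rendered_remotely=True)

print(should_suspend_tabs(local_chrome))    # False: tabs keep rendering
print(should_suspend_tabs(remote_chrome))   # True: tabs suspended -> blank white tabs
```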

Google rolled back the change on Thursday after lengthy discussion threads popped up on Google’s support forums. IT workers are understandably annoyed that Google pushed a silent feature update to stable devices that broke everything. In many cases, companies spent hours attempting to troubleshoot their own setups before realizing it was a widespread Chrome issue. Hopefully, the Chrome team is more careful with flipping flags in the stable version going forward.

Now read:

November 15th 2019, 2:35 pm

2020 Nissan Maxima Review: The Sporty Reason People Still Buy Sedans

ExtremeTech

The Nissan Maxima sedan has been around forever, since 1981, and you forget what a nice car it is, especially with its recent refresh that now includes the Nissan Safety Shield driver assists. For around $40K, you get a very quick, good-handling sedan capable of hitting 60 mph in six seconds. The car is stunning in profile and the cockpit is nicely trimmed.

Against that, ProPilot Assist, Nissan’s self-driving aid, is not available on a car meant for long highway drives. Also, the same low roofline that makes the Maxima sleek makes it snug in back for adults. The ride qualities the driver calls “sporty” the passengers may call “firm.” Still, the current Maxima compares well against premium luxury sedans: You pay less for a loaded Maxima than for a stripped BMW 5 Series or Lexus GS.

Maxima on the Road

The Maxima is a blast to drive, the 300-hp V6 is amazingly powerful, and long treks are pleasant thanks to good seats and excellent audio.

My cockpit on the Maxima Platinum had an orange-and-black interior that sounds like Halloween, maybe Thanksgiving. But it’s actually quite fetching, especially if you like sporty looks. The D-shaped steering wheel with the squared-off bottom is what many racecars have, and it’s also a godsend for drivers with ample waistlines. The center stack display, starting to feel a bit small at 8 inches diagonal, does have real buttons on either side, a big plus.

While it’s a front- not rear- or all-wheel-drive car, the Maxima felt competent on twisty back roads. Unless you’re doing a car club lapping day at a race track, front-drive is good enough for virtually all occasions.

Nissan Maxima Platinum with the orangish (but in a nice way) Rakuda Tan quality leather trim.

Maxima Stands Out on Safety

Unlike many competing cars, the Maxima has a solid set of safety features on all trim lines, not just the ones that cost more.

For 2020, Nissan makes Nissan Safety Shield 360 standard on all six trim lines: lane departure warning, blind-spot detection/rear cross-traffic alert, automatic emergency braking/pedestrian braking, automatic high beam control, and rear automatic braking.

Adaptive cruise control is not part of Safety Shield 360 but it’s on all but the entry trim, which typically accounts for less than 10 percent of model sales. The surround-view camera system is on three of the six trim lines.

Also standard across the line in 2020 is the Integrated Dynamics (control) Module (IDM): intelligent trace control (maintains the cornering line as steered), active ride control (modulates engine power and brakes under acceleration and braking to minimize pitching motions), and intelligent engine brake (small amounts of engine braking when using the brakes to smooth deceleration).

Long trips would be better still with ProPilot Assist, but when Nissan did the midlife refresh of the eighth-generation Maxima for 2019, it didn’t upgrade from mechanical to electric power steering, a requirement for cars that self-drive or lane-center. Too bad: ProPilot Assist is a must-have feature offered on the majority of the Nissan line. Press a couple of buttons and the car settles into the center of the lane, maintains its distance from cars in front, and drives itself so long as it senses your hands lightly on the wheel.
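
For readers curious what “settles into the center of the lane and maintains its distance” boils down to, here is a heavily simplified toy control loop in Python. It is a generic proportional-control sketch, not Nissan’s ProPilot Assist code; every gain and setpoint in it is invented for illustration.

```python
# Toy Level 2 assist math: lane centering plus following-distance control.
# Generic proportional control for illustration only; the gains and setpoints
# are made up and bear no relation to how any production system is tuned.
def lane_centering_steer(offset_m: float, gain: float = 0.5) -> float:
    """Steering correction proportional to lateral offset from lane center.
    Negative offset = car is left of center; positive output = steer right."""
    return -gain * offset_m

def following_distance_accel(gap_m: float, target_gap_m: float = 30.0,
                             gain: float = 0.1) -> float:
    """Acceleration command that nudges the gap to the lead car toward target."""
    return gain * (gap_m - target_gap_m)

# One example tick: 0.4m left of lane center, 22m behind the lead vehicle.
print(lane_centering_steer(-0.4))        # 0.2 -> gentle steer back to the right
print(following_distance_accel(22.0))    # -0.8 -> ease off / brake to reopen the gap
```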

The sloping roofline on the Maxima helps the looks at the expense of rear-seat headroom.

2020 Maxima Models

There are six model variants, or trim lines, for the 2020 Maxima. All have the 300-hp V6 engine and continuously variable automatic transmission (CVT). The 2019 Maxima, still available, is essentially the same vehicle.

Maxima S, $35,175 (including $925 freight). It has LED headlamps, Apple CarPlay, Android Auto, satellite radio, Bluetooth, and two USB jacks. The only thing missing from the teaser-price trim is adaptive cruise control. Wheels are 18 x 8 inches with 245/45R18 tires.

Maxima SV, $36,450. The step up from the S gets you adaptive cruise control, traffic sign recognition, navigation, leather-trimmed seats, and heated front seats.

Maxima SL, $39,565. Sonar is front as well as rear. Audio is Bose, 11 speakers. There’s a panoramic moonroof and a heated steering wheel.

Maxima SR, $42,375. This is the sporty version with paddle shifters, a sport-tuned suspension (read: stiffer than the already firm shocks/springs on the others), 19 x 8.5-inch wheels with low-profile tires, and vented front seats. The 2019 SR Premium Package, which bundled the panoramic moonroof and the surround camera system (Intelligent Around View Monitor), is now part of the base price.

Maxima Platinum, $42,565. It’s the next step up from SL, not the SR. It gets Nissan Connect telematics, 18 x 8-inch alloys, quilted-leather seats, and birdseye maple trim.

Maxima Platinum Reserve, $43,705. A new trim line for 2020, it adds heated rear seats, different seat leathers and trim, and a charcoal headliner.

Should You Buy?

Nissan has two midsize sedans, the Maxima and the Nissan Altima (out since 1993). The Altima is a mainstream, front-drive sedan meant to compete with the Accord, Camry, and Sonata. Some competitive analyses treat the Maxima as a full-size sedan, grouping it with the Toyota Avalon and Chevrolet Impala.

Look at the specs for the rear seat for Maxima, Altima, and Camry, the best-selling midsize sedan:

Maxima: 34.2 / 35.8 inches (headroom / legroom)

Altima: 35.1 / 36.9 inches

Camry: 38.0 / 38.0

That makes the Maxima okay, not great, for carrying two additional adults. And if four of you are going away for the weekend, pack light. It has 98.5 cubic feet for passengers plus 14.3 for cargo, for a total of 112.8. The sibling Altima is three inches shorter and has 117.3 plus 15.4 cubic feet of capacity. So Maxima is midsize and more of a competitor to Toyota Camry, Subaru Legacy, Kia Optima, Mazda6, Honda Accord, Hyundai Sonata, and VW Passat. None have the acceleration of the Maxima, although the Mazda6 (especially), Accord, and Passat compare favorably on handling and fun-to-drive.

For fun-to-drive, you want the Maxima SR sporty model. And if you want to stand out, consider the $395 Sunset Drift Chromaflair orange exterior paint. The SR also gives you 245/40R19 tires on wheels that don’t get along well with potholes. I test-drove a Maxima Platinum and the ride was firm; the SR will be firmer.

If you like the sporty looks without the stiffer ride, the Maxima SV has virtually all the safety you want. Moving up the trim lines gets you better audio with SL and then surround view with SR and the Platinums.

Nissan sells a bit less than 50,000 Maximas a year, so there’s a certain exclusivity that way. It is worth a test drive if you want sporty looks and handling and if your back seat passengers don’t mind a snug, but not cramped, space. The Maxima’s biggest drawback for long-distance cruisers is the unavailability of ProPilot Assist.

Nissan deserves credit for sticking with sedans at a time when many mid- and full-size competitors, most with decent offerings, are going away: the Ford Taurus, Chevrolet Impala, Buick LaCrosse, and Cadillac XTS, among others. Among the mid-price, midsize sedans that want to be seen as class-above, the Maxima may well be the best choice if you want sporty and reliable. Buyers who want a roomy back seat and trunk may want to look elsewhere, including Nissan’s own Altima in one of the higher trims, where it can be had with all-wheel-drive and Nissan’s intriguing variable compression engine.

Now read:

November 15th 2019, 11:17 am

Razr Sharp: Motorola’s Retro-Futuristic Foldable Phone Praised

ExtremeTech

When Motorola started teasing the idea of a retro foldable phone based on its iconic Razr, it seemed more like a tease than an actual product. Motorola hasn’t been a major competitor in the high-end phone market for years, so was the company really going to ship a foldable display — a technology Samsung was clearly struggling to perfect?

The answer, from pretty much everyone who saw the device today, is: Yes. Yes, they are. Sascha Segan of PCMag is extremely enthusiastic, writing: “The Razr puts the Samsung Galaxy Fold and Huawei Mate X a bit to shame. Those thick, clunky phones feel like technology demos for folding-screen technology—a material in search of a gadget in which to put itself. The Razr, on the other hand, feels like an idea that was waiting for, and finally found, its material.” Techradar writes: “The Motorola Razr 2019 does an admirable job of reviving an iconic design in a new foldable form factor that defines what it means to stand out – yes, the phone everyone had 15 years ago is now something to make you stand out.”

Not everyone is quite as enthusiastic as these two publications. DigitalTrends writes that it was left “wanting more,” while CNN only qualified the handset as a “kinda want.” Since these reports are coming from people who spent some time hands-on with the product but don’t represent final reviews, I’m going to do a bit more synthesis on the opinions on the product rather than straight quoting.

Image by Sascha Segan, PCMag

First off, the Razr has everybody talking — even folks who aren’t necessarily wowed by the device as it exists today. The long, narrow design folds into a diminutive palm-sized device. There’s a 2.7-inch screen on the outside that works like a good flip phone display — you can use it for notifications, photos, and responding to calls. Opening the phone gets you a 21:9 aspect ratio and a tall, narrow device. The display doesn’t offer more screen real estate than you’d typically get from a non-folding phone, and the 21:9 aspect ratio is unusual in this kind of handset. If you read a lot of mobile content, you might love the layout, especially for long-form text. Techradar notes that a fair bit of video content leaves bars on either side of the display, however.

The display is a plastic-covered OLED and doesn’t have a visible crease, unlike Samsung’s Galaxy Fold. The OLED panel is a 2142×876 display (Techradar claims it’s much less sharp than an equivalent 21:9 Sony panel, but no other review I’ve seen complained). The specs on the phone are modest, on the whole — it’s based on the Snapdragon 710, Qualcomm’s midrange mobile chipset from 2018. The 710 features a pair of Cortex-A75-class CPU cores and six Cortex-A55-class cores, and the Razr pairs it with 6GB of RAM, 128GB of storage, and a tiny 2,510mAh battery. The exterior camera is 16MP and f/1.7; the interior camera is 5MP and f/2.0.
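
For a rough sense of how much of that panel standard video actually uses, here is a quick bit of aspect-ratio arithmetic (an illustration based on the resolution above, not a measurement from any review):

# Toy calculation: how much of the Razr's 2142x876 panel a 16:9 video fills
# when the phone is held sideways (the long 2142-pixel edge becomes the width).
panel_long, panel_short = 2142, 876
video_width = panel_short * 16 / 9        # width a 16:9 clip needs at full panel height
used_fraction = video_width / panel_long
print(round(used_fraction, 2))            # ~0.73 -- roughly a quarter of the width
                                          # ends up as black bars, as Techradar notes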

The people who like the Razr like the way you can seamlessly switch between the smaller and larger displays, the 21:9 display ratio (in some cases), the fact that it fits in a pocket again, and in some cases, the visceral feel of opening and closing the handset. The folding screen is praised for not creasing, though the technology is new enough that everyone wants to see durability tests. The Verge declares that the hinge is the current high-water mark of design for foldable phones.

Image by Sascha Segan, PCMag

On the negative side, there’s the fact that it costs $1,500, packs midrange internals, has a weak camera, and fields a lot of new technology of uncertain durability. The battery is small. It’s qualified as a first-generation product in a lot of reviews, and it very much is.

Based on the excitement of some of the press, it’s clear that Motorola has built something genuinely interesting here. Whether that something is enough to get over the various barriers between foldable devices and the mainstream market is a different question. Motorola will need to demonstrate that it’s ready to field this product against the best of what Samsung and Apple can ship, and prove it can iterate and improve the design. Different publications weigh those first-generation trade-offs differently — and that, I think, is why you see differing opinions.

The phone will be a Verizon exclusive in the US, with pre-orders starting December 26 and deliveries sometime in January.

Feature image by Sascha Segan, PCMag

Now Read:

November 15th 2019, 9:47 am

Google Will Bypass Carriers, Deploy Own RCS Chat System

ExtremeTech

Google has decided to deploy the Rich Communication Services (RCS) protocol on its own after waiting for years for carriers in the US to do so instead. While SMS has been a phenomenally successful message protocol, it dates to the mid-1980s. RCS is intended to be a replacement, but its deployment has been minimal. While Google initially helped create the standard, it relied on the US wireless carriers to roll it out — and they haven’t been particularly interested.

Last month, the carriers announced that they would finally roll out RCS… only they weren’t going to use the actual version Google has worked on. Supposedly, both Google’s RCS and the upcoming carrier RCS will support a feature known as Universal Profile, which would allow the two services to interoperate — but we won’t know if this will actually happen until the carrier service is deployed.

Google, therefore, is going to deploy its own version of the standard anyway. A Google spokesperson refused to comment on rumors that blackjack and hookers might be involved. What does it mean for Google to roll out these capabilities itself through Messages? The company describes the rollout as upgraded chat features that work over mobile data and Wi-Fi. Google writes:

When you and your friends message each other with these chat features, you can chat over Wi-Fi or mobile data, send and receive high-resolution photos and videos, and see if people have received your latest messages. Plus, you’ll get better group chats, with the ability to name groups, add and remove people to and from groups, and see if people haven’t seen the latest messages.

If these seem basic for chat functionality, remember, we’re talking about a replacement for SMS, which is an incredibly basic standard to start with. Upgrading the most basic communication protocol used for text messages is a step forward, even if other applications offer more advanced capabilities. RCS includes typing status, location sharing, longer messages, read receipts, and media support. But it doesn’t include end-to-end encryption, and it’s tied to your phone number as opposed to any other identity.

Google Messages.

Google Messages isn’t the default messaging app on most Android devices, but if you download it and enable the Chat Features capability, you can experiment with the RCS protocol when messaging other Messages users.

It’s not clear exactly what the play is, here. RCS is a carrier-centric messaging service that Google is moving to deploy without carrier buy-in, while T-Mobile, AT&T, Sprint, and Verizon go off and build their own RCS service that’s like the one Google built, but distinct from it. And while all of this is taking place, Google is still planning to kill Hangouts — easily its most-used chat service with the largest user-base.

In short, Google’s entire messaging strategy remains confused, its technology deployments are poorly coordinated, and we’d really like to just keep using Google Hangouts please, separate from any other decisions the company might make around messaging.

Now Read:

November 15th 2019, 9:47 am

Researchers Use 3D Climate Modeling to Estimate Planet Habitability

ExtremeTech

This artist’s impression shows the planet orbiting the Sun-like star HD 85512 in the southern constellation of Vela (The Sail). This planet is one of sixteen super-Earths discovered by the HARPS instrument on the 3.6-metre telescope at ESO’s La Silla Observatory. This planet is about 3.6 times as massive as the Earth and lies at the edge of the habitable zone around the star, where liquid water, and perhaps even life, could potentially exist.

NASA missions like Kepler and the Transiting Exoplanet Survey Satellite (TESS) have revealed just how many planets there are in the universe. Simply knowing that planets exist doesn’t tell us if there’s anyone living there. When astronomers suggest an exoplanet could be habitable, that’s a very rough estimate. A new study is the first to use 3D climate modeling to help nail down which exoplanets could support life. 

Red dwarf stars (sometimes called M dwarfs) are the most common type in the galaxy — there’s one right next door called Proxima Centauri. You might remember hearing a lot about Proxima Centauri in the last few years because there’s strong evidence for an Earth-like exoplanet there. Indeed, scientists think red dwarf stars host a plethora of exoplanets. While they’re smaller and cooler than our sun, they’re very long-lived and exoplanets could host liquid water if they orbit close enough. 
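
To see why those first-pass habitability estimates are so rough, here is a minimal sketch of the classic zero-dimensional equilibrium-temperature calculation, the kind of back-of-the-envelope figure that ignores atmospheric chemistry entirely (which is exactly the gap 3D climate models aim to close). The red dwarf numbers below are illustrative assumptions, not values from the study.

import math

SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
L_SUN = 3.828e26    # solar luminosity, W
AU = 1.496e11       # astronomical unit, m

def equilibrium_temp(luminosity_w, distance_m, albedo=0.3):
    """Blackbody equilibrium temperature, assuming uniform heat redistribution
    and no greenhouse effect -- no photochemistry, no 3D circulation."""
    flux = luminosity_w / (4 * math.pi * distance_m ** 2)
    return (flux * (1 - albedo) / (4 * SIGMA)) ** 0.25

# Sanity check for Earth: ~255 K. The ~33 K gap to the real ~288 K average is
# greenhouse warming, which this kind of estimate simply cannot see.
print(round(equilibrium_temp(L_SUN, AU)))

# A planet hugging a dim red dwarf (hypothetical values, very roughly Proxima b-like:
# 0.0017 L_sun at 0.05 AU) comes out near ~230 K, cool but not absurd for a world
# whose real surface conditions depend entirely on the atmosphere this math ignores.
print(round(equilibrium_temp(0.0017 * L_SUN, 0.05 * AU)))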

The new study from Northwestern University researchers combined 3D climate models with photochemistry and atmospheric chemistry, a first for exoplanet research. This research didn’t involve observing exoplanets but rather focused on creating a model that more accurately calculates the effects of solar radiation. The team believes this simulation provides a fuller picture of conditions on the surface of exoplanets, allowing astronomers to prioritize observations on the most likely targets. 

This artist’s impression shows the planet Proxima b orbiting in the Goldilocks zone around the red dwarf star Proxima Centauri, the closest star to the Solar System.

According to the researchers, 3D photochemistry hasn’t been used in exoplanet research because it’s so computationally expensive. Taking the time to do it, however, shows that such techniques can be vital to estimating the heating and cooling effects of solar radiation. Knowing how solar radiation interacts with the gases in a planet’s atmosphere can help differentiate between Earth-like planets and acidic hellscapes like Venus. For example, a planet might be inside the habitable zone of a star, but its thin ozone layer could mean too much ultraviolet radiation on the surface. The simulations also show that planets orbiting active stars are susceptible to losing significant water through vaporization. Without liquid water, life as we know it is impossible. 

This research could help guide future studies of distant planets, particularly when the James Webb Space Telescope comes online in a few years. That instrument will have vastly increased sensitivity compared with Hubble, allowing it to detect water vapor and ozone in planetary atmospheres. Astronomers just need to know which planets are the most likely homes to alien life.

Now read:

November 15th 2019, 8:14 am

SpaceX Successfully Tests Crewed Dragon Launch Abort Engines

ExtremeTech

SpaceX has cleared a major hurdle on the way to launching manned missions with its Dragon spacecraft. The company had to push back its launch plans after the stunning explosion of a Crew Dragon capsule during testing earlier this year. Now, SpaceX has successfully tested the engines without incident, paving the way for a test flight next year. 

The SpaceX Dragon is one of two commercial spacecraft NASA hopes to use to launch manned missions to the International Space Station, the other being Boeing’s CST-100 Starliner. SpaceX was on track to beat Boeing to launch before its April testing failure, but picking through the pieces of the demolished capsule pushed back the timetable. 

After an investigation, SpaceX confirmed the craft’s SuperDraco engines themselves were not at fault. These innovative launch abort engines use hypergolic hydrazine and nitrogen tetroxide propellants, which ignite on contact; most launch abort systems use solid propellants instead. SpaceX went this way because it intends to do propulsive landings with the Dragon in the future, but NASA hasn’t authorized that for crewed flights. Unfortunately, a leaky fuel valve in the abort propulsion system allowed nitrogen tetroxide to leak into the helium pressurization system. It was then driven back into the titanium check valve, which caused the explosion. 

The new and improved Dragon has a burst disk in the fuel lines that keeps propellant from leaking into the high-pressure lines before ignition. This week’s test-firing demonstrates that the new system functions as intended, and SpaceX says it can now move forward with launch plans. 

The next step is to test the SuperDraco engines in-flight late this year. A Falcon 9 will launch the capsule upward for 88 seconds. At that point, the computer aboard the rocket will cut the engines, simulating an emergency loss of thrust. The SuperDraco engines will wrench the Dragon from its mounting and push it clear of the launch vehicle. It will parachute into the Atlantic where SpaceX will recover it for study. 

Once SpaceX has proven that its spacecraft can handle an in-flight abort, it’ll prepare for the first crewed flight in early 2020. The Dragon will carry astronauts Robert Behnken and Douglas Hurley to the ISS, where they will remain for two weeks before returning to Earth. Boeing is targeting the same timeframe for its first crewed flight.

Now read:

November 14th 2019, 4:20 pm

ET Deals: $400 Off 15-inch MacBook Pro, Asus VivoBook 15 Just $249, Dell PowerEdge T40 $349

ExtremeTech

Today you can save $400 on one of Apple’s 2019 MacBook Pro notebooks, which comes with a fast Core i7 processor and an AMD GPU.

Apple MacBook Pro Intel Core i7 15-Inch Laptop w/ 16GB RAM and 256GB SSD ($1,999.99)

Apple’s aluminum-clad MacBook Pro comes sporting a six-core Intel Core i7 processor that operates at speeds up to 4.5GHz. The system also has an AMD Radeon Pro 555x graphics processor with 4GB of VRAM for gaming. Currently, you can get it from Amazon marked down from $2,399.00 to $1,999.99.

Asus VivoBook 15 F512DA AMD Ryzen 3 3200U 15.6-Inch 1080p Laptop w/4GB DDR4 RAM and 128GB M.2 NVMe SSD ($249.00)

This budget-friendly computer comes with an AMD Ryzen 3 3200U dual-core processor that operates at speeds of up to 3.5GHz. The notebook is also relatively light at 3.5 pounds and quite thin at 19.9mm. Although this system is designed as a low-cost solution, it comes with a solid feature set including a 1080p display, a built-in fingerprint scanner, and a USB Type-C port. Right now you can get it from Walmart marked down from $349.00 to $249.00.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 14th 2019, 2:30 pm

Judge Blocks Trump Administration From Allowing 3D-Printed Guns

ExtremeTech

3D printing hasn’t lived up to the hype about a decade after the first consumer-ready printers hit the market. The number of people using 3D printers is still small, but some of them are using the devices to make guns. Last year, the Trump administration moved to settle with a gun printing non-profit and make those weapons legal. However, a federal court has blocked that deal, calling it “unlawful.”

The legal drama stretches back to 2012 when Defense Distributed began designing and posting 3D printing files for gun parts. The following year, it released files for “The Liberator,” a single-shot plastic gun that anyone could make with a sufficiently capable 3D printer. The files were downloaded more than 100,000 times within a day. The government came down on Defense Distributed citing the Arms Export Control Act and forced the removal of the files, but founder Cody Wilson pledged to fight on. 

Although Wilson has since left the company after attempting to solicit sex from a teenager, Defense Distributed has continued its fight against the government. In 2018, the Trump administration decided to settle the case, offering to rewrite federal law to make devices like The Liberator (and semi-automatic rifles like the AR-15) exempt from export laws. Following the deal, 19 states filed suit against the government and secured a halt to any changes in the law. 

The new Washington state ruling pulls no punches discussing the 2018 deal. Judge Robert Lasnik calls the settlement “arbitrary and capricious,” pointing out that the government has an obligation to uphold the law. It cannot simply announce a contrary position without justification, and then change federal law without consulting Congress. 

A disassembled Liberator pistol from the original 3D files that started the mess.

While Defense Distributed was the sole focus of the lawsuit, it wasted no time issuing a statement. Spokesperson Chad Flores decried the judgment, citing the company’s rights to free speech and calling the lawsuit an “indirect censorship effort.” The Justice Department, which was on the losing side of the case, said it is currently examining the decision. Defense Distributed seems determined to continue pushing the case, so we likely have not heard the last of 3D-printed guns. 

Regardless of what happens with the case, the cat is out of the bag. It’s not hard to get your hands on the files to 3D print guns, and the technology will only get more advanced over time.

Now read:

November 14th 2019, 2:00 pm

Crytek Launches ‘Neon Noir’ Ray-Tracing Benchmark With AMD, Nvidia GPU Support

ExtremeTech

Ray tracing currently occupies a liminal place in the gaming realm. On the one hand, the technology demonstrably exists — there are game engines with ray tracing components baked in, Nvidia’s entire RTX push has been built around it, and the next-generation consoles from Sony and Microsoft will both include an AMD-based ray tracing solution. On the other, the lack of support from AMD, the need for higher-end GPUs, and the relative newness of the capability mean that ray tracing isn’t driving the game industry as a whole, either. We also don’t know how central ray tracing will be to AMD’s push with future GPUs — it’s not yet clear if the company will bet the metaphorical farm on ray tracing.

One reason for uncertainty from fans is that it isn’t clear which of the three companies entering the market — because Intel will inevitably have something to say on the subject as well — will provide the best ray tracing solution for future games. This is one area where Nvidia’s heavy RTX branding push and discussions of its own dedicated ray-tracing hardware may have muddied consumer perceptions a bit: DirectX Raytracing, abbreviated DXR, is Microsoft’s standard for handling ray tracing in modern titles. It does not require any specialized hardware to operate, though hardware acceleration may be used to improve performance. Crytek’s new benchmark is based on the Neon Noir ray-tracing demo the company released earlier this year.

While all of the games to support ray tracing have done so via Nvidia’s RTX-specific optimizations, there’s no reason this must be true, and it’s possible to run ray-tracing workloads at present on both Nvidia and AMD hardware, provided the code is written to be AMD-compatible. That’s what Crytek has done with Neon Noir, and while I haven’t had time to run the test through an array of AMD or Nvidia hardware, having workloads that can run on GPUs from multiple companies is useful no matter what. It is not clear if games that currently support RTX will be retroactively patched to enable ray tracing when AMD GPUs launch that support the capability or if the handful of titles that support the feature currently will remain Nvidia-only shops. But either way, once Big Navi debuts at some point in 2020 there’ll be more interest in seeing what Crytek has built.

According to Crytek, its Neon Noir implementation is hardware-agnostic, and “will run on most mainstream contemporary AMD and Nvidia GPUs, whereas other ray-tracing methods are typically bound to GPU solutions with dedicated RT cores. The feature will ship in CRYENGINE in 2020, optimized to take advantage of performance enhancements delivered by the latest generation of graphics cards from all manufacturers and supported APIs like Vulkan and DX12.”

We haven’t heard much from Crytek in recent years. Games like Warface have been transferred to other studios, and titles like Hunt: Showdown seem to have landed with scarcely a ripple in the larger market. After layoffs and restructuring in the mid-2010s, the company has been mostly dormant, with a few VR titles and continued development of projects like CryEngine V. Neon Noir may represent the company’s effort to put itself back in the spotlight, but a full-fledged, vendor-agnostic ray tracing demo not explicitly sponsored by either GPU maker could be a useful thing, provided the test holds up.

Normally, I’d include some test results and comparative data in a write-up like this, but I’m currently tunneling out of the CPU Reviewer’s Pit, using a Ryzen 9 3950X for a shovel. Spoiler: It’s a terrible shovel. Look for new ShovelMark results coming soon (except not really, as both AMD and Intel get really cross if you test chips this way).

Crytek does expect to bake these improvements into CryEngine, so games with Neon Noir-style ray tracing should be possible in 2020. Details on how to configure the benchmark are available from Crytek directly. If you set LowSpecMode to 0, ray-traced reflections will render at the same resolution as the test itself. If you render with LowSpecMode=1, the ray-traced reflections will render at 1357×763, or roughly 50 percent of 1920×1080 resolution. LowSpecMode is set to 0 if you render at Ultra quality and 1 if you render at Very High quality. It is not clear if the Very High reflection quality remains fixed at 1357×763 as resolution scales or if the test scales the reflection resolution to ensure it remains 50 percent of the target display resolution. Minimum specs for the demo include a Ryzen 5 2500X or Core i7-8700, AMD Vega 56 or GTX 1070, 16GB of system RAM, Windows 10 64-bit, and DX11. Some of these restrictions are obviously specific to the demo, given Crytek’s comments about supporting Vulkan and DX12 in the future.
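
As a quick back-of-the-envelope check on those numbers (an illustration of the arithmetic, not anything shipped with the benchmark), 1357×763 is each axis of 1920×1080 scaled by roughly 1/√2, which is what halves the pixel count:

# Crytek's LowSpecMode=1 reflection buffer vs. a 1080p render target.
full_w, full_h = 1920, 1080
low_w, low_h = 1357, 763

print(round((low_w * low_h) / (full_w * full_h), 3))   # ~0.499 -- the "roughly 50 percent" figure

# If the test did scale reflections with the display resolution (unverified, as
# noted above), the same per-axis 1/sqrt(2) rule at 1440p would give roughly:
scale = 0.5 ** 0.5
print(round(2560 * scale), round(1440 * scale))        # ~1810 x 1018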

The demo can be downloaded here. Keep in mind that current AMD GPUs don’t support ray tracing in shipping games, so while it may be possible to compare performance in Neon Noir on these cards, it’s not clear if they’ll ever actually run ray-traced titles in that mode. AMD will make that determination.

Now Read:

November 14th 2019, 11:12 am

AMD Ryzen 9 3950X Review: This CPU Goes Way Past 11

ExtremeTech

When AMD announced the Ryzen 9 3950X in June 2019, I wasn’t immediately sure what to think. Taking the original Ryzen platform up to 16 cores from the eight that it launched with offered a very significant upgrade path to first-generation Ryzen buyers, but it also raised questions about whether two memory channels would be enough to keep the CPU fed.

AMD delayed the Ryzen 9 3950X by a few weeks, possibly to improve inventory levels, but the company’s 16-core CPU is finally here. The Ryzen 9 3950X is the “full fat” edition of the desktop Ryzen family, with two physical CPU chiplets plus an I/O die, and all eight cores enabled on both chiplets.

Here’s how AMD’s Ryzen stack looks these days, with per-core pricing.

At $750 for 16 cores, the 3950X is a fair step up from the $500 price point, and the increase in core count (1.33x) doesn’t quite match the increase in price (1.5x). This isn’t uncommon at the high end of the market, however, and the price to leap from Intel’s mainstream desktop segment to its HEDT platform has always been far higher than this.
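
A quick check of that math, assuming the $500 part in question is the 12-core Ryzen 9 3900X:

# Per-core pricing for AMD's top two AM4 parts (list prices from the text above;
# the 12-core count for the $500 chip is an assumption, not stated there).
price_3950x, cores_3950x = 750, 16
price_3900x, cores_3900x = 500, 12

print(cores_3950x / cores_3900x)       # ~1.33x the cores
print(price_3950x / price_3900x)       # 1.5x the price
print(price_3950x / cores_3950x)       # ~$46.88 per core
print(price_3900x / cores_3900x)       # ~$41.67 per core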

The Ryzen 9 3950X is a rare CPU. First, AMD and Intel CPUs with the highest core counts don’t typically field the highest single-threaded clocks at the same time. Second, while there have been points in the past where it was possible to boost CPU core counts over and above what was available when a hardware platform launched, it doesn’t happen all that often. Third, the CPU sits at a unique point in AMD and Intel’s respective product stacks.

In AMD’s case, the 16-core 3950X is the effective replacement for last year’s Ryzen Threadripper 2950X. The old Socket TR4 platform won’t be upgraded with a third-generation chip, and the new TRX40 third-generation Threadripper platform will start at 24 cores and stretch up to 32.

The Ryzen 9 3950X is also AMD’s attempt to retake a platform advantage it held over Intel in Ryzen’s early days. Until Intel launched the 8700K, AMD had an eight-core CPU sharing space in the mainstream desktop market with an Intel platform that topped out at four cores and eight threads. Coffee Lake reset things a bit closer to parity later that same year, but AMD undoubtedly liked being able to line up 2x the CPU cores in an apples-to-apples comparison. With the Ryzen 9 3950X, it can do that again.

Finally, the Ryzen 9 3950X is unique in the sense that Intel isn’t lining up another CPU exactly against it in terms of core count. At ~$500, you’ve got chips like the Core i9-9900K. When Cascade Lake-X launches, the 3950X will be bracketed by the Core i9-10920X (12-core, 3.5GHz / 4.6GHz, $700) and the Core i9-10940X (14-core, 3.3GHz – 4.6GHz, $800). Intel isn’t building a 16-core Cascade Lake X, however — the core count and price jump from 14 to 18 and $800 to $1K, respectively.

Watching for Memory Bandwidth Pressure

The big question going into the 3950X’s debut is whether two DDR4 channels can deliver enough memory bandwidth to feed 16 cores. Dual-channel DDR4-3600 provides 57.6GB/s of memory bandwidth, but that only works out to about 3.6GB/s of bandwidth per CPU core. That’s about 10 percent more RAM bandwidth than a single channel of DDR-400 provided back in the day — and that might sound like a recipe for disaster in a CPU like this.
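
Spelled out, the napkin math looks like this (a sketch of the arithmetic above, not a measured figure; DDR transfers 8 bytes per channel per transfer):

def channel_bandwidth_gbs(megatransfers_per_sec, bytes_per_transfer=8):
    """Peak theoretical bandwidth of one DDR channel, in GB/s."""
    return megatransfers_per_sec * bytes_per_transfer / 1000

dual_ddr4_3600 = 2 * channel_bandwidth_gbs(3600)   # 57.6 GB/s total
per_core = dual_ddr4_3600 / 16                     # 3.6 GB/s per core on a 3950X
single_ddr_400 = channel_bandwidth_gbs(400)        # 3.2 GB/s, single-channel DDR-400

print(round(dual_ddr4_3600, 1), round(per_core, 1))
print(round(per_core / single_ddr_400, 3))   # ~1.125 -- the "about 10 percent more"
                                             # comparison (strictly closer to 12.5 percent)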

But a lot has changed since the days when single-channel DDR configurations roamed the Earth. Caches are much larger, prefetching algorithms have improved, and the widespread adoption of SMT offers opportunities to hide memory access latencies. It’s not clear if the Ryzen 9 3950X is particularly memory-bandwidth limited, but we’ve spun the Threadripper 2950X up to test the theory. While we can’t do an exact apples-to-apples test without a third-generation Threadripper to evaluate, any benchmark where the 2950X takes an unusual lead could be a sign it’s leveraging additional memory bandwidth.

Test Setup

The AMD Ryzen 9 3950X was tested in the same OS image and Windows 10 SSD configuration that we used for our 7/7 initial review. Chipset drivers were updated to the latest for download on AMD’s site (1.9.27.1033). Windows 10 1903 with the latest updates and patches was used. Our MSI X570 Godlike motherboard was updated to the latest beta UEFI (7C34v17, dated 11-07-2019). This UEFI is based on AMD AGESA 1.0.0.4. Nvidia GeForce driver 430.86 was used to maintain identical configurations with our previous round of testing.

Each of our testbeds had 16GB of RAM except for our Threadripper 2950X and Intel Core i9-9980XE platforms. These two systems used 32GB of RAM to load each DIMM channel. The third-generation Ryzen CPUs were outfitted with DDR4-3600, while the Intel platforms used DDR4-3200. Intel systems generally care less about RAM clock than AMD systems in any case.

The Threadripper 2950X was tested in the MSI X399 Creation motherboard with Precision Boost Overdrive (PBO) enabled. This represents something of a “best foot forward” scenario for Threadripper, but we wanted to give the CPU its best opportunity to shine in comparison with the Ryzen 9 3950X. Using PBO on a Threadripper voids its warranty.

The Core i9-9980XE’s inclusion here warrants some discussion. The Core i9-9980XE is vastly outside the Ryzen 9 3950X’s price range — but this was also true for the Threadripper 2950X versus 9980XE comparison. At the top of the market, we have to compare the parts that exist. Furthermore, the Cascade Lake-X family drops in the not-too-distant future, bringing significant price cuts. We don’t know how much faster the 10980XE is compared with the 9980XE, but we can treat this chip as a rough lower bound for what the 10980XE’s performance should look like.

The point of including the Threadripper 2950X and the Core i9-9980XE is to see how the newer part changes the shape of what has come before, and maybe to get a sneak preview of what the Cascade Lake versus Ryzen performance comparison is likely to look like.

Finally, the graphs here are going to be a bit… lorge.

Background image by Catstradamus. Graph overlay by ExtremeTech

The sheer number of data points we’re comparing makes this unavoidable. Nine CPUs — four Intel, five AMD — is a lot of data to juggle. The Ryzen 7 2700X and Core i7-8086K are retained to give some performance data on last-gen chips. The 9700K gets included because it’s kind of an oddity — slower than the 8086K in some things, but Intel’s overall fastest gaming chip if what you care about is pure frame rate. The 9900K is Intel’s top-end mainstream desktop chip. The Threadripper 2950X is included to check for any signs of memory bandwidth shortfalls now that AMD is putting 16 cores on just two memory buses, and for the most direct 16-core to 16-core comparison. Threadripper’s vastly higher TDP (180W versus 105W) also gives it potentially more headroom to work with.

Before we hit the slideshow, here’s our Blender 1.0Beta2 benchmark results.

The Ryzen 9 3900X and the Threadripper 2950X are near-evenly matched, trading wins with each other across the test. The Ryzen 9 3950X consistently renders in 76-77 percent of the time of the 3900X. The Core i9-9980XE is overmatched in most tests, though it does win the Barbershop_Interior test by a full minute.

Our other test results are included in the slideshow below. Due to some last-minute confusion of data sets, some of our 7zip and Handbrake data has to be re-checked. I didn’t have time to complete my power consumption analysis, but I’ll be adding all of these later today.

I’ll have more to say about application performance once I can plug in 7zip and Handbrake, but the 3D rendering performance from the 3950X is truly best-in-class. There’s no evidence of bandwidth-related limits as far as 3D rendering, compiling, or Handbrake (while I need to re-check some results, I’m confident that what I saw showed no weaknesses in any area). The Core i9-9980XE still logs some wins, and we see some narrow evidence of Threadripper’s bandwidth superiority in DigiCortex, but the 3950X notches up some single-threaded wins for itself that AMD hasn’t held for a very long time. That’s not everything — but it’s not nothing, either.

Gaming performance is up next. This is often the place where I note that GPU performance varies only a little at the top-end of the market and that the differences between these CPUs are narrow. That’s not quite the case today, however — and while 1080p isn’t necessarily a likely gaming resolution for these CPUs, it’s still worth taking a look through these results to see where these CPUs land.

Gaming performance shows a few surprising wins for the Ryzen Threadripper 2950X, but the Ryzen 9 3950X hits higher frame rates overall and offers more consistent performance besides. While it still doesn’t quite catch Intel in lower-resolution tests, the gap between the two CPU families is the smallest it’s ever been thanks to the 3950X. The fact that AMD is delivering performance gains like this in a high-core-count part is particularly impressive. It doesn’t make the high-core CPU a good choice for people who principally game — but it does mean that you have to give up much less performance in gaming with the 3950X than with previous solutions — including, in a few places, the 9980XE.

Conclusion

For $750, the Ryzen 9 3950X is an unbeatable CPU. While it’s true that Intel’s 18-core Core i9-9980XE retains a reasonable number of test wins, compare the 2950X with the 3950X to see just how much AMD has improved performance year-on-year. Twelve months ago, the 2950X wasn’t taking any tests off the Core i9-9980XE — the combination of two additional CPU cores and Intel’s higher overall IPC was a barrier second-generation Threadripper wasn’t capable of scaling very often. And up until now, the single-threaded Cinebench results have been Intel’s sole turf. Cinebench is not the be-all-end-all of performance measurement, but these are still tests that Intel has been reliably winning for at least a decade.

The Ryzen 9 3950X is a further uplift over the 3900X and the 3700X in both single-threaded and (obviously) multi-threaded performance. There’s no evidence of any bandwidth constraints in ordinary desktop applications or games — the 3950X is often the fastest Ryzen CPU on the market where gaming is concerned. This should not be read to indicate that I’m recommending a 16-core CPU for gaming — that’s ridiculous overkill, and I very much am not. But the 3950X takes a number of significant tests in this area as well, and in a way that’s unusual for high-core-count chips.

The fact is, Intel doesn’t have a particularly great competitive answer to the Ryzen 9 3950X in this price bracket, and the launch of Cascade Lake in a few weeks isn’t necessarily going to help them in every space. We can assume that Cascade Lake X will be a touch faster than the current Core i9-9980XE, and obviously that’ll improve Intel’s position at the $1000 price point. What it won’t do, however, is provide an equivalent boost for Intel’s mainstream desktop platform, which still tops out at eight CPU cores.

Now, Intel deserves (and will get) credit for slashing the price on its Cascade Lake CPUs, but $1,000 is still more than $750 and an HEDT motherboard is still a bit more expensive than an X570. Figure a $50 motherboard price difference and $250 for the CPU, and the Core i9-10980XE will still carry a $300 premium. Buyers who are willing to step up to HEDT may still be willing to choose Intel over desktop Ryzen, but the Ryzen 9 3950X offers horsepower the 9900K can’t even touch in the mainstream socket space.

Do you need a Ryzen 9 3950X? Probably not. If you’re a gamer, CPUs like the Core i7-9700K or the Ryzen 7 3700X are going to be far more cost-effective and practical. But the Ryzen 9 3950X represents the pinnacle of AMD’s single-threaded and multi-threaded performance. It offers both of those things in the same CPU rather than asking users to choose between one or the other. It demonstrates sustained, multi-year, double-digit performance improvements, even considering the constraints of bringing a 16-core part into a dual-channel memory configuration — constraints which do not seem to bind third-generation Ryzen as much as some, including myself, were concerned they might. Further research is warranted, and there will be workloads where the limited memory bandwidth makes itself felt, but the trends are positive. Most Ryzen upgraders will be better served by an eight-core chip, but AMD users who need the firepower 16 cores can offer will not find them lacking.

As impressive as the Ryzen 9 3950X is, it’s just the opening salvo in AMD’s November 2019 72-Core Launch Salute. Threadripper 3 is coming.

Now Read:

November 14th 2019, 9:56 am

Stadia Developers Reveal Missing Launch Features in Reddit AMA

ExtremeTech

Google’s Stadia game streaming service launches in just a few days, and some of the developers have taken to Reddit to host a surprisingly honest AMA (Ask Me Anything). The devs fielded questions ranging from how achievements will work to when iOS support will roll out. Unfortunately, the AMA revealed that device support will be very limited at launch, and some of Google’s core features won’t work right away. 

Google has moved with incredible speed to get Stadia ready for launch. It was just a year ago that Google tested its cloud gaming technology with Project Stream in the Chrome browser. However, it apparently has not had time to get all the necessary apps and streaming technologies ready to go. 

At launch, the only people playing Stadia will be those with Founder’s Edition bundles. Those kits come with a Chromecast Ultra, and it’ll be a special Chromecast Ultra: it runs newer software that supports Stadia, which other Chromecast Ultras won’t get until sometime later. So, if you were hoping to play Stadia games on more than one TV, that’s not going to be possible at launch. While Google plans to support iOS, that app won’t be ready at launch, either. In fact, the Google dev couldn’t even speculate on whether it would arrive early or late in 2020. 

Several signature features of Stadia won’t be ready at launch even if you do have compatible devices. Games will support achievements, and the developers say Stadia will track them. However, there won’t be any way to see your achievements because the UI for that isn’t done yet. Family sharing won’t work out of the gate, either. Eventually, that should allow multiple family members to share a game like they would with a “real” console. However, you’ll need to buy multiple copies of games if you want more than one person to have access at launch. If you want to play with non-Founder friends, you’ll have to wait at least two weeks. That’s when the promised “Buddy Pass” codes will work. The custom Stream Connect multiplayer feature is not supported by any of the 12 launch titles, but Google hopes to have at least one game featuring that by the end of 2020. 

The Chromecast that comes with the Stadia kit is running special software.

There is some good news, though. The Googlers confirm that Stadia will support moving a game session between devices. Your game instances will remain active for about 10 minutes, allowing you to fire up the stream on the screen of your choice. Google will also offer a smartphone mount called “The Claw” for the Stadia controller. Of course, it won’t be available at launch. 

The first Founder’s Edition bundles should arrive on the 18th, but many buyers report delivery dates into December. One of the Googlers in the AMA offered to hand-deliver a kit to someone in the Bay Area. That’s a nice gesture, but it’s also emblematic of the messy launch.

Now read:

November 14th 2019, 9:12 am

NASA’s Curiosity Rover Spots Unexplained Oxygen Spike on Mars

ExtremeTech

The Curiosity rover has already had a tremendously successful mission on Mars, but it’s not done making discoveries yet. NASA reports that the rover has detected a perplexing increase in oxygen concentrations on the red planet. While the cause is unclear, this comes just months after the rover detected a spike in methane, another biomarker gas. 

Curiosity landed on Mars in Gale Crater more than seven years ago, and since then it has beamed back more data on the geology of the planet than scientists dared hope. The rover has discovered mineral deposits that point to water in the planet’s past, possible organic compounds, and numerous interesting geological features. 

Earlier this year, NASA reported that Curiosity had spotted unusually high levels of methane in the planet’s atmosphere. This is notable because methane is considered a biomarker, something that life produces. Scientists are still probing what that means for Mars, but now oxygen levels are moving upward to unexpectedly high levels as well. As with methane, oxygen can indicate the presence of biological organisms, so it’s something Curiosity monitors. 

The new oxygen data comes from the rover’s Sample Analysis at Mars (SAM) instrument, which the team uses to analyze the atmosphere as well as rock samples. Data from the last seven years shows that the air in Gale Crater is about 95 percent carbon dioxide with a smattering of nitrogen, argon, and just a touch of oxygen — 0.16 percent of it on average. There are seasonal patterns in the gas concentrations, but the new analysis shows that oxygen levels change much more than other gases. During spring and summer, oxygen levels rise by as much as 30 percent and then fall back in winter. 
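
For scale (simple arithmetic on the figures above, not additional mission data), a 30 percent seasonal swing on a 0.16 percent baseline is still a tiny absolute change:

baseline_o2 = 0.16              # average oxygen share of Gale Crater's air, percent
summer_peak = baseline_o2 * 1.30
print(round(summer_peak, 3))    # ~0.21 percent -- large relative to oxygen itself,
                                # but still a trace gas in a ~95 percent CO2 atmosphere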

The team has investigated several possible explanations for the variation, but nothing quite covers all the observations. For example, more oxygen could appear from solar radiation splitting carbon dioxide or water molecules in the atmosphere, but there’s not enough water and carbon dioxide doesn’t split quickly enough. 

As with the methane spikes, NASA is still investigating the oxygen swings. However, the team cautions against jumping to conclusions about the presence of alien microbes. It is more likely that some geological process is responsible for the changes in Mars’ atmosphere, and we might know more soon. The Mars 2020 rover will head to the red planet next year, and it will have a suite of sensors more suited to searching for evidence of life.

Now read:

November 14th 2019, 9:12 am

Udemy Class Review: How to Create Android Apps Without Coding Advance Course

ExtremeTech

Android is everywhere nowadays. In addition to powering the user experience on essentially all non-Apple smartphones, Android is also used on numerous tablets, TV boxes, low-end computers, and a variety of other devices. This has led to the creation of an enormous, burgeoning app market, and people with the technical know-how to create these apps are in high demand. If you’ve never dealt with coding a program before, creating an Android app can be exceedingly difficult, but as Udemy’s “How To Create Android Apps Without Coding Advance Course” will show you, there’s more than one way to create an app, and some of them are relatively easy.

Course Overview

This course focuses on the development of Android apps using online app builder software. Using an app builder to create an app can be done without any coding knowledge, which makes the development process exceedingly easy compared with creating the app entirely on your own. Ultimately, though, you will have less control over the software you create using an app builder. You will be limited by the app builder in what sort of features you can add to your app, and you won’t be able to fine-tune the code to achieve optimal performance. If you knew how to do that, however, then you would probably be handcrafting your own app instead of taking this course anyway.

The lecturer starts the course off by reviewing several online app builders in terms of price and the features they offer. Ultimately, the lecturer recommends an app builder called Mobincube, which allows you to create an unlimited number of apps, with no size limit, for free. Mobincube supports this service by placing ads on any app you create, but this also benefits you, as the company shares some of the ad revenue with you. Mobincube also lets you create apps for iOS as well as Android.

In the next lecture, the course goes into detail about setting up the basic foundation of your app. This involves some minor image manipulation to create button images for your prototype app. The lecturer recommends Canva for this purpose, but essentially any image editing utility such as Adobe Photoshop, GIMP, or even Microsoft Paint will work. The images created using one of these utilities are then uploaded to Mobincube for use in your app.

The app builder itself has a user-friendly graphical interface that essentially lets you build an app by clicking and dragging objects around on an on-screen phone. The lecturer demonstrates this by creating a simple navigation menu. Adding images and content to the app is similarly easy and straightforward, after which you can connect your content and navigation menu together.

Conclusion

The remainder of the course goes over topics such as modifying the advertising content on your app and publishing it to the app store. Overall, I feel the lecturer did an excellent job of introducing this software to students in the class, but at the same time, I can’t help feeling that this class is realistically unnecessary. Mobincube has clearly worked to make the app production process quick and easy using their software, and overall it feels like something you could easily learn to use on your own without taking a class. I wouldn’t go so far as to recommend against taking this course, but I would highly recommend anyone interested in creating their own app try Mobincube first and then take the course if they feel they need some additional help.

Now read:

November 13th 2019, 5:32 pm

ET Deals: Apple AirPods Pro $15 Off Ends Soon, 4 Months Amazon Music Unlimited $0.99, $500 Off 65-In

ExtremeTech

Today you can get Apple’s recently released AirPods Pro headphones with a $15 discount. Pick up a set alongside Amazon’s current Amazon Music Unlimited promotion to enjoy four months of tunes for just $0.99.

Apple AirPods Pro ($234.98)

Apple’s new AirPods Pro utilize a design that’s different from the company’s older AirPods. The key new feature is active noise cancellation. The headphones also use a custom driver and a high dynamic range amplifier to improve sound quality. These headphones launched just two weeks ago, but you can snag them with a $15 discount from Amazon that drops the price from $249.99 to $234.98.

Amazon Music Unlimited 4-Month Subscription ($0.99)

For a limited time, you can get four months of Amazon’s Music Unlimited service for just $0.99. This service will allow you to listen to countless songs, and you can download and save them to your phone for easy listening when you don’t have access to the net. This deal is available exclusively to Prime members, however, and after this four-month period, the price rises to $7.99 per month for Prime members and $9.99 for everyone else.

Vizio P659-G1 2019 Quantum 65-Inch 4k HDR Smart TV ($890.90)

Vizio’s P659-G1 features quantum dot technology to provide 115 percent more color than competing LCD TVs. This model also comes equipped with a built-in Chromecast and hands-free voice controls, and it supports Dolby Vision HDR. This TV is typically priced at $1,399.99, but you can currently get it from Amazon marked down to $899.99. It drops a bit further to $890.90 with a clickable coupon.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 13th 2019, 3:41 pm

Apple Redesigns the MacBook Pro, Boosts SSD Storage, Adds AMD RDNA, and Dumps Butterfly Keyboard

ExtremeTech

Apple’s rumored 16-inch MacBook Pro is out today. It’s a touch thicker, a touch wider, and it gets rid of one of Apple’s most-disliked and broken products over the past few years — the butterfly switch keyboard. After multiple attempts to fix the design, Apple is getting rid of it. The keyboard has been one of Apple’s most controversial changes, with keys that can jam on a single grain of dirt and an incredibly expensive repair process (until Apple started fixing them for free).

“Apple refers to its new board as the Magic Keyboard, whose name and physics are derived from the standalone keyboard that comes with the iMac and iMac Pro,” according to our sister site PCMag. “Travel distance is improved, to 1mm, and there’s a new rubber dome that complements the butterfly switch. Apple says the overhaul maintains the extraordinary stability of the previous board while preserving more potential energy, resulting in a more satisfying key press.”

Key responsiveness has been generally described as a cross between the butterfly keyboard that Apple used to use and a more traditional design. PCMag described the experience as “ordinary,” but thought that might be an improvement over the “jarringly shallow” design on the MacBook. Speaking personally, I always found the reliability issues far more concerning than the key response. Key response can make or break a keyboard as far as how much you like it, but I don’t need to like a keyboard to use it. I do, however, require it to continue responding to keypresses.

The new keyboard layout, with the Escape key and modified arrow keys. Image by Apple.

While the number beside the name is going up, don’t expect the 16-inch MacBook Pro to feel terribly different than the 15.4-inch MacBook Pro. According to Mark Gurman, the screen is only about 4 percent larger (and measured 15.4 inches previously). Some of the improvement is coming from thinner bezels, allowing the display to grow without using a physically larger panel. The new resolution of the display is 3072×1920, up very slightly from the previous 2880×1800. This change modestly improves the display’s PPI, from 220 PPI to 226 PPI. The display offers up to 500 nits of brightness and support for the DCI-P3 color gamut.
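
Those PPI figures check out with simple diagonal-pixel arithmetic (included here as an illustration, not something pulled from Apple's spec sheet):

import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3072, 1920, 16.0), 1))   # ~226 PPI for the new 16-inch panel
print(round(ppi(2880, 1800, 15.4), 1))   # ~220 PPI for the outgoing 15.4-inch panel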

Other changes include a range of spec bumps, including storage options up to 8TB, with 512GB standard on the $2,399 model. AMD’s upgraded 5000M GPUs, based on its RDNA architecture, are also now available. The 5500M is a 24CU part with up to 1,536 stream processors, while the 5300M is a 20CU part with 1,280 stream processors. Both chips use a 128-bit memory bus, with the 5500M fielding up to 8GB of RAM and the 5300M using up to 4GB. These GPUs should have solid horsepower for professional and gaming tasks. Our evaluation of RDNA performance in professional applications suggested the GPU competed favorably against rival architectures from both AMD and Nvidia.

Audio quality is rarely a strong point of any laptop, but Apple’s 16-inch MacBook Pro supposedly shines here as well, with what PCMag calls “the most robust bass I’ve ever heard from any laptop speakers.” Finally, the thermal solution has supposedly been improved to support both the Radeon and Core i9 options. Hopefully, we won’t see a repeat of the thermal issues that plagued the Core i9 when that platform was introduced.

PCMag has additional details on a few items we didn’t touch on, like the tweaked keyboard layout. Overall, the 16-inch MacBook seems like a coat of polish on an existing product rather than an all-new design. Here’s hoping it avoids the keyboard issues that bedeviled multiple generations of its predecessor.

Now Read:

November 13th 2019, 3:41 pm

Save Over 90 Percent On This Complete R Programming Certification Bundle

ExtremeTech

There’s a myriad of coding languages out there, meaning you have plenty of options to choose from whether you need to design a website or create a video game. But, when it comes to working with data, there’s really just one you need to know, and its name is R.

Looking to bolster your coding ability with some in-depth data and visualization concepts? Or maybe you want to start from scratch and learn everything there is to know? Thanks to the Complete R Programming Certification Bundle, you can take a deep dive into over 35 hours of training to perfect your data wrangling and visualization skills.

This bundle comes complete with six courses that contain hundreds of lessons. You’ll learn things like statistics and machine learning for regression models, social media mining and data analysis, how to classify and cluster data using R, data visualization with tidy techniques in R, and many other skills. You’ll have 24/7, lifetime access to all of the included courses, so you can always refer back to something if you feel like you need a refresher on a specific topic.

The Complete R Programming Certification Bundle is on sale for $29 right now, saving you 97 percent. Each course is individually valued at over $200, so you’re saving a ton with the bundle while still reaping all the benefits. Start perfecting those R programming skills today.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 13th 2019, 2:45 pm

New Horizons Flyby Target ‘2014 MU69’ Now Officially Named ‘Arrokoth’

ExtremeTech

NASA’s New Horizons mission started making history the moment it launched. The probe set a record as the fastest (at the time) launch in history, and eight years later it became the first spacecraft to visit Pluto. More recently, New Horizons took a look at a small Kuiper Belt Object (KBO) known as 2014 MU69. Now, NASA has announced that object has a new name. It’s “Arrokoth,” a Native American term that means “sky” in the Powhatan and Algonquian languages.

New Horizons was conceived and built to get a closer look at Pluto. At the time of its launch, Pluto was the ninth planet in the solar system, though it was demoted to dwarf planet while New Horizons was en route. That didn’t make the eventual flyby any less momentous. New Horizons revealed a complex landscape with frozen fields of nitrogen ice and mountains of water ice. It even has clouds.

Because New Horizons was moving so fast, it couldn’t slow down and enter a stable orbit around Pluto. NASA took that as an opportunity to study other objects in the outer solar system. While New Horizons was on its way to Pluto, NASA identified several possible secondary targets, including 2014 MU69. The NASA team nicknamed it Ultima Thule: Thule being a mythical realm shown on medieval maps, and “ultima” meaning farthest or beyond, casting 2014 MU69 as a place beyond the known world.

NASA submitted the new name to the International Astronomical Union and Minor Planets Center with the consent of Powhatan Tribal elders. “The name ‘Arrokoth’ reflects the inspiration of looking to the skies and wondering about the stars and worlds beyond our own,” said New Horizons principal investigator Alan Stern. Representatives from the Powhatan tribe were present at the ceremony to commemorate the new name with a traditional Algonquian song. 

You’ll probably be hearing a lot about Arrokoth in the future. The first images showed a “snowman-like” object, but it’s actually composed of two flattened disks. NASA calls this a “primordial contact binary” because it likely formed when two objects came into contact in the distant past. Studying Arrokoth could help us better understand planetary formation at the birth of the solar system. 

New Horizons is still going strong with enough power to operate into the 2030s. The team is currently considering where to point New Horizons next. They hope to complete at least one more KBO flyby in the 2020s.

Now read:

November 13th 2019, 1:45 pm

Valve Might Be Working on a Steam Cloud Gaming Service

ExtremeTech

Cloud gaming services are all the rage these days. Microsoft and Google are both making major pushes into the area, joining services like Nvidia’s GeForce Now and PlayStation Now. The exact features of each service differ, however, and they aren’t fungible — the game libraries you can access and the requirements as far as local hardware are different in each case. Now there’s some evidence that Valve might be looking to get in on the action as well, with rumors of a Steam Cloud gaming service popping up in several places.

Steam already supports remote streaming if your hardware allows it, and there’s the mobile Steam Link app you can use for streaming to an iOS or Android device, but the company hasn’t taken the plunge into full-on cloud gaming just yet. In theory, the tremendous back-catalog of titles on Steam could allow for unique streaming experiences you can’t get anywhere else, though the requirements of some of those games (from eras when mouse and keyboard were the assumed standard for PC gaming) might be troublesome. Then again, Steam has improved its own controller support significantly and would surely provide a compatibility layer for older games as well.

The idea that the service might run as a companion across other devices is also interesting. Nvidia’s GeForce Now might be seen as competing with a hypothetical Steam cloud service, and much may depend on how Valve structures its offering. The one major advantage Steam has, of course, is that its users have been plugged into the service for over a decade. The backlash at the launch of the Epic Games Store was also a demonstration of how much market power Steam wields in practice.

Of course, for now, we don’t have any details on how the project might shape up or what kind of service this would be. Anything from a premium add-on to existing Steam accounts to a free service that gives you more freedom to play your existing library remotely seems possible, depending on the kind of arrangements Valve wants to strike.

Whether any of these services can attract customers and build stable long-term businesses is still something of an open question. Stadia launches next week and will give us a look at Google’s idea of the future, but so far, most of the success in this field has come from services directly adjacent to the console business. If US customers start signing up for these services en masse, bandwidth caps in the US may suddenly start seeming awfully small. The bandwidth requirements for game streaming, particularly game streaming in 4K, can be formidable.

Now Read:

November 13th 2019, 10:54 am

Researchers Find More Than 1 Million Alternatives to DNA

ExtremeTech

Life on Earth uses DNA and RNA to store and utilize genetic information, but what if there’s another way? A new analysis from researchers at Emory University and the Tokyo Institute of Technology suggests a plethora of molecules could serve the same basic task of organizing and storing genetic information. They estimate more than a million possible stand-ins for DNA, some of which could help us fight disease or tell us what to expect as we search for alien life.

DNA and RNA consist of several components; in DNA, they assemble into the familiar double helix. There are nitrogenous bases like adenine and guanine, a sugar (deoxyribose for DNA and ribose for RNA), and a phosphate group. The sugar and phosphate give nucleic acid an alternating sugar-phosphate backbone. We already know there are many alternatives to the five bases at work in DNA and RNA on Earth, but the new study looks at how the scaffolding of nucleic acid could vary.

The team used a computer simulation to explore a so-called “chemical space” within certain constraints. To choose the constraints, the team had to distill what makes nucleic acid molecules distinct. They settled on organic molecules that can assemble into a linear polymer with at least two attachment points, plus a place for nitrogen bases to connect. The substructure of the molecule also needs to be stable in a polymer configuration. Since these molecules don’t contain the traditional sugars and phosphorus, you can’t call them DNA — they’re some other kind of nucleic acid with potentially similar properties. 
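To make the idea of searching a constrained “chemical space” concrete, here’s a toy sketch in Python. The study’s actual pipeline isn’t public as code, and every field name below is invented purely to mirror the constraints described above (organic, able to chain into a linear polymer with at least two attachment points, a site for a nitrogen base, stable as a polymer); it only illustrates the filtering step, not real chemistry.

```python
# Toy illustration only: the field names are invented to mirror the constraints
# described in the study, and no real molecular modeling happens here.
from dataclasses import dataclass

@dataclass
class CandidateScaffold:
    is_organic: bool
    attachment_points: int
    has_base_attachment_site: bool
    stable_as_polymer: bool

def could_carry_genetic_information(mol: CandidateScaffold) -> bool:
    return (
        mol.is_organic
        and mol.attachment_points >= 2          # can chain into a linear polymer
        and mol.has_base_attachment_site        # somewhere for a nitrogen base to hang
        and mol.stable_as_polymer               # backbone holds together as a polymer
    )

candidates = [
    CandidateScaffold(True, 2, True, True),     # passes every constraint
    CandidateScaffold(True, 1, True, True),     # only one attachment point: rejected
]
print(sum(could_carry_genetic_information(m) for m in candidates), "of", len(candidates), "pass")
```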

A few alternate nucleic acids.

The analysis points to more than 1,160,000 potential nucleic acid molecules. That number exceeded even the most extreme estimates beforehand, but researchers can now start looking at these molecules in a laboratory setting to see if they can work as a DNA alternative. The team says this shows evolution on Earth may have experimented with several different molecular designs for storing genetic information before DNA ultimately won out. 

Researchers around the world are working on therapeutic drugs that resemble nucleic acid, some of which could help combat viruses and cancer. A better understanding of these DNA alternatives could make those treatments more effective. And then there’s the importance to exobiology research. If we’re looking for evidence of extraterrestrial life, it might help to remember they could have genetic material using one of the other million possible molecules.

Now read:

November 13th 2019, 9:06 am

One of Intel’s Recent Bug Fixes Carries a Performance Penalty

ExtremeTech

When Intel released its report on the 77 CPU bugs and security issues it recently patched, it didn’t mention anything about any of the security fixes causing performance issues. As far as we know, there aren’t any performance implications for the patches we discussed in our previous story. There is, however, a new Intel update that carries a small performance penalty.

Intel has discovered a separate erratum in its Skylake-based CPUs that has nothing to do with the Spectre or Meltdown issues we’ve previously discussed. We can’t discuss the issue in full because one of the two whitepapers isn’t currently being served from Intel’s website. The other paper, which concerns mitigation efforts, is online and available here. The Jump Conditional Code (JCC) erratum is related to “complex microarchitectural conditions involving jump instructions that span 64-byte boundaries (cross cache lines).” According to Intel, “the erratum may result in unpredictable behavior when certain multiple dynamic microarchitectural conditions are met.”

Performance Impacts

According to Intel:

The JCC erratum MCU workaround will cause a greater number of misses out of the Decoded ICache and subsequent switches to the legacy decode pipeline. This occurs since branches that overlay or end on a 32-byte boundary are unable to fill into the Decoded ICache. Intel has observed performance effects associated with the workaround ranging from 0-4% on many industry-standard benchmarks.

In subcomponents of these benchmarks, Intel has observed outliers higher than the 0-4% range. Other workloads not observed by Intel may behave differently. Intel has in turn developed software-based tools to minimize the impact on potentially affected applications and workloads. The potential performance impact of the JCC erratum mitigation arises from two different sources:

1. A switch penalty that occurs when executing in the Decoded ICache and switching over to the legacy decode pipeline.

2. Inefficiencies that occur when executing from the legacy decode pipeline that are potentially hidden by the Decoded ICache.
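The key condition in Intel’s description is the 32-byte boundary: a jump whose bytes cross, or whose last byte lands on, such a boundary can no longer fill into the Decoded ICache after the microcode update. Here’s a minimal sketch of that condition; the addresses and instruction lengths are illustrative only, not real code layout, and the actual toolchain mitigations work by padding code so hot branches avoid these boundaries.

```python
# Minimal sketch of the condition described in Intel's JCC erratum write-up:
# a jump whose bytes cross, or whose last byte lands on, a 32-byte boundary
# cannot be cached in the Decoded ICache after the microcode update.
def jcc_penalized(start_addr: int, length: int, boundary: int = 32) -> bool:
    """True if an instruction at start_addr of `length` bytes crosses or ends on a boundary."""
    end_addr = start_addr + length - 1          # address of the instruction's last byte
    crosses = (start_addr // boundary) != (end_addr // boundary)
    ends_on = (end_addr + 1) % boundary == 0    # last byte sits right at the boundary
    return crosses or ends_on

print(jcc_penalized(0x101E, 6))  # True: bytes 0x101E-0x1023 cross the 0x1020 boundary
print(jcc_penalized(0x1000, 6))  # False: fits entirely within one 32-byte window
```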

Intel is working to fix this problem with toolchain and software updates, and it worked with Phoronix to get those changes in place so they could be evaluated. The remainder of the document is taken up with discussions of how to mitigate the issue and with details on which CPU families are affected. Affected chips include Amber Lake, Cascade Lake, Coffee Lake, Kaby Lake, Kaby Lake X, Skylake, and Whiskey Lake: basically, everything back to Skylake. CPUs prior to Skylake are not affected, even though the cache changes that give rise to this error were introduced in Sandy Bridge.

According to Phoronix’s extensive tests, the average impact hits performance “by a couple of percent,” some of which can be recovered by compiler patches and Linux updates that will take some time to be merged and to trickle back down to users. It’s not clear what sort of timeline Windows users should expect or what performance losses look like in that operating system.

Data and graph by Phoronix

This single result from Phoronix shows the broad overall pattern: a performance loss with the initial microcode update, followed by partial recovery with the newly patched code. There are other application tests that exceed the 4 percent threshold Intel identified, but these appear to be outliers. The new microcode is sometimes simply faster than the old microcode, and the software patches clearly aren’t finalized yet; there are still some places where the new code actually hits performance harder rather than helping. The point of publishing this data now, according to Phoronix, was to illustrate that the drops may be temporary.

Ultimately, this kind of move is likely the result of Intel cleaning house and conducting security reviews of its own products, then moving to patch errata, even those that might impact performance. That’s going to frustrate users who see performance dips, and the impact of those dips can exceed the 4 percent threshold, but it’s also the right move for the company to make long-term. Hopefully, updates to software toolchains and OS support will minimize the performance impact of these changes, which, again, appear to be unrelated to any of the issues we’ve discussed with Spectre and Meltdown.

It’s not yet clear if this fix is part of the bundle that Intel announced earlier on Tuesday, or if it will be delivered separately. Thus far, most of the performance impact of Spectre, Meltdown, and related fixes has hit server software more than client. Assuming that holds true, end-users should see few declines. If it doesn’t, you’ll hear about it here.

Now Read: 

November 13th 2019, 9:06 am

IBM: Our Mac-Using Employees Outperform Windows Users in Every Way

ExtremeTech

Today, IBM published the results of an internal experiment that undoubtedly sent Apple’s sales team leaping for joy. Since launching its Mac@IBM program, which gives employees a choice of computing platform, the company has tracked the results. Those results have been quite favorable for the Mac-using IBM staffers across multiple metrics.

First, keep in mind that these results are being revealed at the JAMF Nation User Conference. JAMF is a software platform for administering Apple devices. The company isn’t just announcing this out of the blue; it’s making a statement of support from the stage of a major software partner. That doesn’t mean these statements are wrong, but it’s important context for evaluating their accuracy.

According to IBM, one staff member can support 5,400 Mac users, while the company needed one staff member per 242 PC users. Only 5 percent of Mac users called the help desk for assistance, compared with 40 percent of PC users. This Mac-IBM love affair has been ongoing for a few years, and the same IBM PR points out that in 2016, IBM CIO Fletcher Previn declared that IBM saves anywhere from $273 to $543 when its end users choose Mac over PC.

This year, the company gave even stronger evidence in favor of Macs over PCs. Supposedly 22 percent more macOS users exceed expectations in performance reviews compared with Windows users, while high-value sales deals tend to be 16 percent larger for macOS users. Mac users also have a higher “net promoter score” of 47.5 versus 15 and are 17 percent less likely to leave IBM. Mac users are also happier with third-party software availability at IBM, according to IBM’s own press release. In addition, Mac users are more likely to report that migration is simpler than moving from Windows 7 to Windows 10, and Windows users are nearly 5x more likely to need on-site help support.

All of this adds up to a towering mountain of praise for macOS and a mountain of scorn for Windows. But given that IBM hasn’t released the full data set from which it made these calculations or detailed the manner in which it surveyed its employees, I feel obligated to point out that some of the effects being claimed here are astonishingly large, with no clear indication of what might be producing them.

IBM may have already addressed the issues I’m going to raise, but it doesn’t say that it’s done so — and these issues are important.

Confounding Variables

To properly measure the impact of using macOS versus Windows, we need to know more about the specific users who have chosen Macs. What branches of the company do they work in? Are they in the upper echelons of employment or are they entry-level workers? Which groups of people, specifically, have chosen to take advantage of IBM’s Macintosh offer?

Image by 9to5Mac.

Because — if you think about it — selection bias could explain every single one of IBM’s claims. We can assume that some of the people who chose to use a Mac at IBM did so because they were already familiar with Apple’s ecosystem and preferred the product family. Because Apple has a minority market share, many Apple users are also fluent in Windows. Apple products tend to cost more than Windows systems, so people who choose these platforms may have more disposable cash in the first place and work in higher echelons of the company.

I haven’t been able to find any details on the Mac@IBM program, but it’s possible that the salespeople and employees who chose to convert were also more likely to be the higher-performing employees in the first place. We know IBM has had the Mac@IBM program since 2015, but it’s not clear if the program is a simple “any employee can use a Mac if they want,” or if it’s only offered to some employees. Even if low-level office staff have the same chance to step up to a Mac as a high-level engineer, individuals in those positions may simply be less likely to do so. The reason high-value sales deals may correlate to macOS usage is that IBM’s best salespeople may be more likely to use a Mac. It may be that the Mac@IBM program was first offered in high-value sales teams, or that those teams took advantage of it first. There are a lot of effects to tease apart here.

I’m not saying IBM’s data is wrong — only that we haven’t been given enough information to know if some attribute of macOS explains these differences, or if the fact that employees are less likely to leave and more likely to exceed expectations reflects other personality characteristics of the employees in question.

But there is a way for IBM to make a stronger version of this argument. It needs to show longitudinal data indicating that the employees who switched from Windows to macOS saw their sales figures jump, their retention rates rise, and their number of helpdesk calls fall. It would still be important, however, to separate the group of Mac users who were already fully versed in macOS when they adopted it at IBM from the group of Windows users who were new to macOS and adopting it for the first time. After all, increased performance in the Mac-friendly group may reflect the fact that they are using the OS they previously preferred. It would also be helpful to see the strength of the improvement across multiple separate areas of IBM’s business.

The gold standard for a “macOS is intrinsically better than Windows” argument is a group of Windows users who saw their overall performance increase, help desk calls drop, and other improvements in other employment-related metrics after switching to macOS for the first time, for reasons that cannot be explained by any other change in their job responsibilities. The difference in help desk calls and the number of employees per IT staff member may be reflected in the relative level of computer knowledge required to do the work. If secretaries need more computer help than high-level engineers, and the high-level engineering staff is more heavily represented in the Mac user pool, this would explain the gap in support staff requirements. In this case, the gap would have nothing to do with the relative merits of macOS versus Windows.

Feature image by Mashable

Now Read:

November 13th 2019, 8:18 am

ET Deals: Disney+ Streaming Starts Today w/ Free 7-Day Trial, $30 off Echo Show 8 Preorder, $800 off

ExtremeTech

Disney’s new Disney+ streaming service officially launched today. Featuring tons of Disney TV shows and movies, this service is targeted as the best streaming solution for kids, but it also has plenty of content for adults including Marvel and Star Wars movies and National Geographic documentaries. If you are interested in trying this service, you can get a free 7-day trial, after which it’s just $6.99 per month.

Featured Deals

For more great deals go to TechBargains.

 

November 12th 2019, 3:53 pm

How About a $7,500 Federal Subsidy to Bring Back the Stick Shift?

ExtremeTech

Electric vehicles now outsell cars with manual transmissions in the US. This piece of information from JD Power underscores what enthusiasts hate to acknowledge: The manual gearbox is slower to shift, doesn’t save gas, and sometimes burns more gas than a car with an automatic transmission or a double-clutch automated mechanical gearbox that has no clutch pedal.

This change was underscored most recently by the unveiling of the newest version of America’s most iconic sports car, the Chevrolet Corvette. It arrives next year with only an eight-speed dual-clutch automatic connecting the engine and the rear wheels. That’s a double blow to the Corvette traditionalist, who now also has to cope with an engine mounted behind the driver.

VW GTI: Heel-and-toe, clutch, rev the engine, downshift third to second, be a hero on a winding road. Or these days, flick the paddle shifter on the left side of the steering wheel to drop down a gear and automatically rev-match the engine and drive wheels.

The reasons for moving from stick shift to automatic are many. The main one is convenience. It’s fun to shift gears by pressing a clutch pedal and rowing the console lever on a twisty country road. It’s not so much fun in rush hour traffic, when you have to work the clutch 150 times, shifting in and out of neutral (I counted), while covering the last two miles to New York City’s Lincoln Tunnel. More Americans are moving to urban areas where there’s more stop-and-go traffic.

Unlike a generation ago, automatic transmissions are no longer significantly less reliable than manuals. They typically match a manual transmission on fuel economy, and often they do better. Consider the 2019 Honda Fit, where the automatic (CVT) version carries the higher EPA rating.

If the manual gets worse fuel economy, that hurts the automaker’s overall CAFE rating, or corporate average fuel economy.

Everyone knows a diehard manual-transmission fanatic with time on his hands and a willingness to tell you how much better mileage he gets with the manual. It’s all anecdotal.

Automatics shift faster than the average driver and often faster than a skilled race driver. That typically, though not always, means faster acceleration.

If a car drops the manual transmission and the automaker moves the gear selector to the steering wheel, the dashboard, or even a small knob on the console, that leaves room for the important things in life (to the average driver and passenger): cupholders, USB jacks, and slots and cubbyholes for phones, sunglasses, keys, and the other crap we carry with us.

By 2018, sales of electric vehicles in the US matched sales of cars with manual transmissions.

Research by JD Power shows the crossover point was in 2018. Manual-gearbox cars slipped from 1.8 percent of sales to 1.6 percent. Battery electric vehicles (BEVs), which have been on the market for about a decade, jumped from a 0.3 percent share in 2015 to 1.8 percent in 2018 and 1.9 percent for 2019 year-to-date. This is for retail sales only; Power says including fleet sales (which skew even more heavily toward automatics) distorts the picture of what individual buyers actually choose.

Power continually sifts data to find trends for its customers and for conferences it hosts, such as last month’s inaugural Auto Revolution confab in Las Vegas targeting analysts and demographers as well as car dealers and component suppliers.

If you want a stick shift car, now’s the time to buy. Car and Driver counted 40 cars with manual transmissions as of the beginning of the year, including the outgoing Corvette. There are also about a dozen crossovers and SUVs with manual transmissions, CD says. Most are smaller vehicles, from Fiat, Honda, Jeep, Mitsubishi, and Subaru.

If you want to be among people who value manual gearboxes, leave the country. Manuals elsewhere account for more than four in 10 sales.

And in the US, a manual gearbox keeps your car out of the hands of joyriders and younger thieves. A car with a clutch is its own millennial anti-theft device.

Now read:

November 12th 2019, 3:53 pm

Netflix Is Killing Support for Some TVs and Roku Boxes Because of DRM

ExtremeTech

Netflix became an online video streaming behemoth by adding support for as many screens as possible. However, some devices will lose access to Netflix streaming in the coming weeks, and we finally know why. It’s all thanks to everyone’s favorite aspect of digital media: digital rights management (DRM). It turns out some older streaming devices can’t run the newer Netflix DRM, so they’ll just get locked out. 

In order to convince the notoriously squeamish media companies to license their content for streaming, Netflix had to implement DRM right from the start. Yes, there are much more convenient ways to pirate content, but Netflix doesn’t have a lot of choice in implementing DRM. It does, however, get to choose the type of DRM it uses. Back in 2010, Netflix started using Microsoft’s PlayReady DRM. Before that, it used Microsoft Windows Media DRM. Yes, that’s a lot of Microsoft products, but the companies have a long and storied history together — Netflix was the only major site that used Silverlight as a streaming technology back in the day, and it released a Windows Phone app before Android.

The problem with Netflix’s DRM decision is that it still allowed manufacturers to add Netflix support with the older Windows Media DRM. Some TVs with Netflix support on the older DRM shipped as late as 2014. Since these devices can’t operate with the newer PlayReady DRM, Netflix will cut them off from the service on December 2nd. 

The original Roku XD is among the players getting the boot.

The affected devices include several models of Samsung and Vizio TVs, as well as eight Roku streaming box models including the Roku HD and XD. Samsung says it stopped selling the affected TVs around 2011, but Vizio sold them until 2014. While there are a lot of Roku streaming boxes on the list, Netflix has always worked closely with the company. After all, it helped launch Roku as an alternative to building its own streaming box. Roku stopped selling devices with the old DRM quickly, so all the affected boxes are from no later than 2010. Nine years of use from a streaming player that cost less than $100 isn’t bad. 

Netflix says it’s been communicating with users of affected devices for several weeks via both email and on-device alerts. The good news is that Netflix works on almost anything with a screen these days. A $30 streaming player like a Roku Express can restore your access to Netflix.

Now read:

November 12th 2019, 2:23 pm

New Spectre-Related CPU Flaw Tops Intel’s Latest Critical Security Fixes

ExtremeTech

Intel has announced a large number of patches and fixes for dozens of security problems in its products and processors. The company has provided a total of 77 patches to OEMs and partners as part of its Intel Platform Update program. We were briefed on this update prior to the formal announcement, but the documents Intel has provided are a bit vague, and the links that should lead to write-ups on the nature of these issues aren’t actually live yet. The flaw that Intel spent the most time discussing, meanwhile, isn’t the highest-ranked security problem on the list.

According to Intel, it is fixing 77 security flaws with this raft of patches. Of those, 67 were found internally at Intel, while 10 were discovered by outside researchers. At least one of the CVE vulnerabilities, CVE-2019-0169, has a CVSS rating of 9.6 (ratings of 9-10 are considered critical, the highest severity). As of this writing, the webpage for CVE-2019-0169 is a placeholder, but we’ll have more to say as soon as we can tell what this vulnerability does. It appears to be located in the Intel Management Engine or one of its subcomponents.

The first set of fixes covers various aspects of Intel’s command-and-control hardware, including the Intel Management Engine (IME), Converged Security and Management Engine (CSME), Intel Server Platform Services (SPS), Trusted Execution, and the like. It’s clear that Intel has been laying the groundwork for a major security update: there’s a CSME Detection Tool available online dated September 4, and various laptop manufacturers have been pushing UEFI updates for IME security issues since late September. The design and security of the IME have been strongly criticized by security researchers over the years, mostly for being a complete black box that’s impossible to evaluate. The security processors used by ARM, AMD, and Apple have all faced similar complaints.

Intel’s paraphrased description of CVE-2019-0169 (which has not been published as of this writing) is that it concerns a heap overflow in a subsystem of the Intel CSME and one in the Trusted Execution (TXE) subsystem. These flaws may allow an unauthenticated user to enable privilege escalation, disclose information, or launch a denial of service attack via “adjacent access.” “Adjacent access” is not defined, but it is positioned against terms like “local access” or “network access.”

We can’t describe most of these vulnerabilities in detail, but CVE ratings of 8+ are generally significant and should be acted upon. The fact that UEFI updates have already been pushed for laptops means it might not be a bad idea to grab one.

TAA: Transaction Asynchronous Abort

Intel did describe one of these new vulnerabilities in somewhat more detail. TAA, or Transaction Asynchronous Abort, affects the TSX capability of Intel microprocessors. TSX is a capability initially introduced with Haswell that improves the CPU’s performance in multi-threaded software if the feature is used. Like the earlier Intel MDS disclosure, TAA can be used to leak data out of microprocessors, because data from speculative execution steps that was never intended to be used can still be leaked and then retrieved. The attacker has no direct way to control which data leaks, though they can try to influence it.

Intel’s guidance on which CPUs are affected is extremely precise and maximally unhelpful. The company lists three types of products which are not impacted:

Chips without TSX support.
CPUs that enumerate IA32_ARCH_CAPABILITIES[TAA_NO] (bit 8)=1.
CPUs that support TSX but do not enumerate IA32_ARCH_CAPABILITIES[TAA_NO] (bit 8)=1 do not need fixes beyond those already baked into Intel’s MDS fixes.

CPUs based on Whiskey Lake, Coffee Lake R, and 2nd Gen Scalable Xeons all require fixes if the systems support TSX.

The practical impact of this problem is likely to be limited, but Intel couldn’t have made it more difficult to determine which CPUs aren’t impacted if it tried. Listing the enumerated values of specific CPU fields is only helpful if those values are readily available for each individual CPU a person might own. Intel has web pages devoted to detailing MDS fixes at the per-CPU level, but none of the information on those pages corresponds to the values given above. As such, it’s useless for determining whether or not you have a CPU with a vulnerability. It would be better to identify the specific CPU families or models, even if that leads to rather long lists. A switch has been added to the UEFI of affected products to allow TSX to be turned off, and Intel’s guidance is that consumers who have the feature but don’t use it should disable it.
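For Linux users who want to check for themselves, here’s a hedged sketch of two approaches. The sysfs file below only exists on kernels new enough to carry the TAA patches, and reading the IA32_ARCH_CAPABILITIES register requires root plus the msr kernel module; the MSR index (0x10A) is our assumption rather than something Intel’s consumer guidance spells out. Bit 8 is the TAA_NO flag referenced in the list above.

```python
# Hedged sketch: two ways to check TAA exposure on Linux. Neither is official
# Intel tooling; adjust paths and the MSR index if your platform differs.
import os
import struct

def taa_status_sysfs() -> str:
    path = "/sys/devices/system/cpu/vulnerabilities/tsx_async_abort"
    try:
        with open(path) as f:
            return f.read().strip()
    except OSError:
        return "sysfs entry not present (kernel may predate the TAA patches)"

def taa_no_from_msr(cpu: int = 0) -> bool:
    """Return True if IA32_ARCH_CAPABILITIES[TAA_NO] (bit 8) is set."""
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        f.seek(0x10A)                       # IA32_ARCH_CAPABILITIES (assumed index)
        value, = struct.unpack("<Q", f.read(8))
    return bool(value & (1 << 8))

if __name__ == "__main__":
    print("Kernel says:", taa_status_sysfs())
    if os.geteuid() == 0:                   # MSR access needs root
        print("TAA_NO bit set:", taa_no_from_msr())
```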

As always, ExtremeTech recommends keeping your system up-to-date. Don’t deliberately leave security holes open for hackers to walk into. At the same time, keep in mind that no one has detected any real-world attack based on Spectre or Meltdown. We may have more to say about the other items on this list depending on what they turn out to be.

Update, 1:40 PM: Added Intel’s description of CVE-2019-0169.

Now Read:

November 12th 2019, 1:54 pm

At a Glance: Zotac GeForce GTX 1660 Super Twin Fan Review

ExtremeTech

Nvidia’s new GeForce GTX 1660 Super graphics cards are priced just slightly higher than regular Nvidia GeForce GTX 1660s. They have been upgraded with GDDR6 memory to boost performance. In this review, we will look at Zotac’s take on the GTX 1660 Super to see how well it performs against the current competition.

Product Overview

GDDR6 considerably boosts the memory bandwidth available to the GPU. Standard GeForce GTX 1660 graphics cards have just 192GB/s of bandwidth, but the GeForce GTX 1660 Super has 336GB/s. This keeps the graphics processor better fed with data, which in turn lets it perform better.
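If you want to sanity-check those figures, peak bandwidth is just the memory bus width times the per-pin data rate. The sketch below assumes a 192-bit bus with 8Gbps GDDR5 on the GTX 1660 and 14Gbps GDDR6 on the Super; those per-pin rates are our assumptions, since the review only quotes the totals.

```python
# Rough sketch: peak GPU memory bandwidth from bus width and per-pin data rate.
# Bus width and data rates below are assumptions used for illustration.
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps_per_pin / 8  # 8 bits per byte

print(memory_bandwidth_gbps(192, 8))   # ~192 GB/s (GTX 1660, GDDR5)
print(memory_bandwidth_gbps(192, 14))  # ~336 GB/s (GTX 1660 Super, GDDR6)
```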

Zotac’s GeForce GTX 1660 Super Twin Fan graphics card comes with a dual-fan cooler that sits on an aluminum heatsink with liquid-filled heatpipes that make direct contact with the GPU. This card also provides active cooling for the GDDR6 VRAM.

Zotac opted to not factory overclock this card, which means it operates at Nvidia’s suggested clock speed of 1,530MHz. The GeForce GTX 1660 Super Twin Fan’s boost clock is also unchanged at 1,785MHz.

Benchmarks

Our sister site PCMag received one of these graphics cards from Zotac and tested it against several other GPUs.

Testing against several other graphics cards showed that Zotac’s GeForce GTX 1660 Super Twin Fan graphics card came directly between the GeForce GTX 1660 and the GeForce GTX 1660 Ti in 3DMark’s Fire Strike Ultra test. This is also the case with 3DMark’s Time Spy test.

The GTX 1660 Super maintained a steady lead above the GTX 1660 in Unigine’s Superposition 1.0 test. It also handily outperformed the XFX Radeon RX 590 Fatboy, and overall it performed closer to the GeForce GTX 1660 Ti than the vanilla GTX 1660.

In real-world game tests, the GTX 1660 Super continued to perform well, at least for the most part. Whereas the GTX 1660 Super showed sizable performance gains over the RX 590 and GTX 1660 in Far Cry 5, its performance in Shadow of the Tomb Raider was significantly lower. It actually returned the worst performance in that test, but this appears to be due to poor driver optimization and not a fault of the card itself.

Conclusion

Overall, the GeForce GTX 1660 Super is a rather strange graphics card for Nvidia to introduce. Nvidia set the MSRP on the GTX 1660 Super just $10 above the vanilla GTX 1660, and Zotac’s model, featured in this review, is just $10 higher than that at $239.99. This removes almost any reason to buy the regular GTX 1660, as the GTX 1660 Super has been shown to perform so much better.

At the same time, it’s harder to justify springing for the more expensive GTX 1660 Ti, as the performance gap between it and the new GTX 1660 Super is relatively small. In general, if you are buying a graphics card for between $200 and $300, this is the one I’d recommend.

Now read:

November 12th 2019, 12:51 pm

Google Plans to Shame Slow Websites in Chrome

ExtremeTech

One of Google’s stated goals when it launched the Chrome browser was to make the web faster. It has followed that up with initiatives like search ranking that factors in page loading speed and the AMP mobile web format. Google’s next idea is to shame websites that have a tendency to load slowly in Chrome.

We’ve all landed on a web page that takes its time loading. You have to wonder in situations like that if there’s something wrong on your end or if it’s the website’s fault. Google’s proposed alert system would clarify matters in a typically passive-aggressive Google fashion. 

While the exact design and nature of the slow loading notification are up in the air, Google is leaning toward a specially themed “splash screen.” It’s currently experimenting with pages that say things like “Usually loads slow.” Meanwhile, sites that load quickly could get their own UI tweaks. For example, pages that show consistently snappy page loads could have a green progress bar instead of the regular blue one. Google is also toying with the idea of adding page loading information to the context menu, allowing insight into a page’s experience even before you start loading. 

So, if you find yourself waiting on a sluggish website, the blank page might be replaced with a notice that the page usually loads slowly. This isn’t something Google would decide in the moment, though. The slow badge would only hit sites that have a historical tendency to load slowly. Later, Google might expand the warnings to include identifying when page loads are likely to be slow based on a user’s device and network conditions. 

This effort is in the extremely early stages, so only a small number of users will see the warnings, and it sounds like Google will focus mostly on mobile. The Chrome team also doesn’t want to smack developers over the head with unachievable page-loading standards. Part of the development process will be determining what criteria are important when labeling sites as chronically slow. 

For now, Google suggests web developers who are worried their sites might be too sluggish should use the PageSpeed Insights and Lighthouse tools. It will provide more details of the badging program as it takes shape in the coming months.
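If you’d rather not wait for a badge to tell you, the public PageSpeed Insights v5 API returns the same Lighthouse data the tools above surface. Here’s a minimal sketch; the exact JSON fields pulled out below reflect our understanding of the response layout and may need adjusting, and Google recommends an API key (omitted here) for any automated use.

```python
# Hedged sketch: query the PageSpeed Insights v5 API for a URL's Lighthouse
# performance score. Response field names are our assumption about the layout.
import json
import urllib.parse
import urllib.request

def pagespeed_score(url: str, strategy: str = "mobile") -> float:
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    endpoint = f"https://www.googleapis.com/pagespeedonline/v5/runPagespeed?{query}"
    with urllib.request.urlopen(endpoint, timeout=60) as resp:
        data = json.load(resp)
    # Lighthouse reports performance as 0.0-1.0; scale to the familiar 0-100.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

if __name__ == "__main__":
    print(pagespeed_score("https://www.extremetech.com/"))
```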

Now read:

November 12th 2019, 11:18 am

Google Struck a Deal to Secretly Access Health Data on Millions of Americans

ExtremeTech

America’s data privacy laws aren’t bad so much as they’re nonexistent. There’s no general federal data privacy law at all, and only a few states have attempted to pass meaningful legislation on the topic. While laws like HIPAA (Health Insurance Portability and Accountability Act) do have something to say about who is allowed to access patient medical records without the patient’s consent, it’s clear now that even this law is woefully inadequate to the privacy challenges of the 21st century.

Google has a deal with the second-largest health-care system in the United States, Ascension, to gather and crunch data on millions of Americans across 21 states, according to the Wall Street Journal. The initiative is codenamed “Project Nightingale,” and is described as “the largest in a series of efforts by Silicon Valley giants to gain access to personal health data and establish a toehold in the massive health-care industry.” Amazon and Microsoft are also described as muscling into the medical industry, though apparently they have yet to strike deals quite this large.

The Data Isn’t Anonymized

The WSJ claims that the data “encompasses lab results, doctor diagnoses, and hospitalization records, among other categories, and amounts to a complete health history, including patient names and dates of birth.” Neither patients nor doctors have been notified of their inclusion in these data sets. All of this is legal under HIPAA, which allows hospitals to share data with business partners so long as the information is being used “to help the covered entity carry out its health care functions.” Some employees of Ascension reportedly attempted to raise concerns about how this data was being used, but their complaints were dismissed, according to the report.

In the hypothetical universe in which Google intended to carry out this research in good faith, it would announce its efforts, accept only data from patients who opted in, conduct the difficult work of contacting all of those patients or their next of kin, pay their families for the value of the data it intended to mine from their lives, thoroughly anonymize the data, and take other various steps to establish trust when handling something as sensitive as a person’s medical data. In this fantasy, Google would also recognize that a great deal of data misuse and abuse happens because data is passed off to an endless succession of third parties and that it had a moral obligation to ensure that the valuable information it gathered would not be misused. Comfortable with its own ability to take on this responsibility, Google would publicly discuss how it protected our data.

But all of that is difficult. It’s much easier to secretly negotiate access to the information and build databases on people’s medical histories without consent. It’s easier for Ascension to ignore its own employees when they raise ethical concerns about these arrangements. It’s easier to take advantage of a loophole in federal law than to admit that this loophole is bad and needs to be closed.

Google undoubtedly has a lot of arguments about how it’s doing this for the best of reasons. That’s unsurprising. It was Google’s Larry Page, after all, who said that medical data should be public knowledge in the first place. Page, billionaire and CEO of Alphabet, apparently cannot conceive of the idea that someone might be discriminated against if their private medical information became public knowledge. In his comments on this topic, Page has argued that there is no reason for anyone to hide this information and that he believes people do so because they are afraid of not qualifying for insurance. The idea that people might struggle to find employment or face other sorts of discrimination as a result of chronic illness or injury did not seem to have occurred to him in 2013. If it’s occurred to him since, he’s kept quiet about it.

The Lack of Disclosure Is a Problem. So Are Some of the Goals.

The WSJ takes pains to note that Google wants to build AI engines to better diagnose patients, while Ascension is looking for ways to improve outcomes and save lives. This is probably true. A lot of people are aware of how deeply broken the US healthcare model is, and how great the need for solutions is. The problems are complex because the system is incredibly complex. An AI system for effectively and quickly diagnosing patients that can deal with far-flung locations and treat or analyze patient data remotely and cheaply is an intrinsically attractive idea. People who want to be helpful go into these fields hoping to do something about their problems.

But if there’s one thing we’ve hopefully collectively learned from privacy disaster after privacy disaster, it’s that we can’t just emphasize the positive. The WSJ writes that Google is working with Ascension at no cost because it wants to build a healthcare database it can sell to other providers. Ascension, for its part, openly acknowledges that one of the goals of its program is to increase revenue from patients.

Ascension, a Catholic chain of 2,600 hospitals, doctors’ offices and other facilities, aims in part to improve patient care. It also hopes to mine data to identify additional tests that could be necessary or other ways in which the system could generate more revenue from patients, documents show. Ascension is also eager for a faster system than its existing decentralized electronic record-keeping network. (Emphasis added).

Given that the cost of interacting with the US medical system has been rising for decades, it’s fair to ask why it’s appropriate to adopt a new medical system on the basis of extracting higher revenue from patients. Ascension is supposedly a non-profit, religiously affiliated healthcare organization. US healthcare cost growth is out of control, and employees shoulder an ever-larger share of that burden.

It could be very reasonably argued that focusing on increased revenue per patient over the past 46 years has produced charts like the above. On top, us. Below us, everyone else.

Expensive AI-driven systems are not going to be adopted because they identify revenue opportunities with marginal value — say, targeting rich people who might like to have a little more elective plastic surgery. The push for such systems is going to happen in part because they’re good at finding new revenue sources. And over-testing is already a huge problem in the American healthcare system.

Waste, in total, is estimated to account for roughly 25 percent of all American healthcare spending. Of the estimated $760B to $935B in wasted American healthcare costs as of 2019, between $77B and $102B is estimated to be caused by either over-treatment or poor-quality treatment. It’s one of the largest single categories. This is not to say that people who need tests shouldn’t get them (of course they should), but when evaluating patients to see if they are receiving proper tests, the focus should also be on making certain tests are not performed unnecessarily. If Ascension considered the moral obligation it had not to charge people for tests they didn’t need, the WSJ does not mention it.

Now that they’ve been discovered, Google and Ascension are claiming to have had users’ best interests at heart… just not enough to tell them in advance. Google has been found to be violating the privacy of its own users in so many ways over the years, from Google Plus to Android, that it strains credulity to see these issues as individual innocent mistakes. Now we discover the company is doing it again, this time with personal medical data it ought to have no legal right to receive in the first place.

I bet they even promised it would only be shared with trusted partners. You know — like the 150+ Google employees that already have access to the personal healthcare records of tens of millions of Americans, according to the WSJ.

Now Read:

November 12th 2019, 10:35 am

Russia May Require PC, Phone Vendors to Preload Apps

ExtremeTech

The Russian government is already working on its own version of the internet. Now, it may be preparing to deploy its own customized spyware applications as a mandatory component of the service. The Russian Parliament is debating a bill that would require all electronic equipment sold in Russia to carry specific Russian apps and services. The bill would apply to all electronic equipment in specific device categories, like smart TVs, computers, and smartphones.

The goal of the bill is to “protect the interests of Russian Internet companies and will reduce the abuse by large foreign companies, working in the field of information technology,” ZDNet reports. If the bill passes, the Russian government will publish a list of electronic devices expected to be covered, along with a list of software that must be preloaded on them. Vendors that refuse to comply will be subject to repeated fines and eventually banned.

The official explanation for this is that Russia wants to protect its local tech scene and foster Russian technical competitiveness. There may well be truth to this. Russia does not view itself as particularly aligned with the West’s vision of governance or internet governance, and it may indeed wish to cultivate the distinctly Russian character of its own tech industry. Forcing Western companies to ship Russian software will undoubtedly boost Russian software revenues.

Of course, the Russian government also might be mandating the installation of certain software so it can backdoor these applications and be guaranteed full access to the underlying device. Then again, it’s not clear the Russian government would even need to take this step. Why bother to backdoor an application when you have a legal agreement with the company requiring it to hand over the full account data of any person upon demand? Why go to the trouble of creating a detectable backdoor when you can use deep packet inspection to watch what people do online? Why bother performing all of the data monitoring yourself if you can require corporations to monitor it for you and send regular reports on problematic users?

The NSA was perfectly willing to bug Google data cables to spy on the information the company exchanged between its own private server farms, alongside a number of other dubious behaviors. It seems unlikely that Russian intelligence services are more principled in their respect for user privacy or individual rights. If approved, the bill will enter into force on July 1, 2020. It is expected to pass, having already gathered the support of all the major Russian political parties.

Now Read:

November 12th 2019, 9:33 am

SpaceX Deploys 60 More Starlink Satellites in Record-Breaking Launch

ExtremeTech

What a SpaceX Starlink satellite looks like in orbit.

It has been a quiet fall for SpaceX, which launched a Falcon 9 rocket in early August before taking a break to prepare for future missions. Now, SpaceX has successfully deployed a new batch of Starlink internet satellites, and the Falcon 9 that delivered them made history in the process. At this rate, SpaceX could begin offering internet access by the middle of 2020. Not everyone is happy about the growth of the Starlink constellation, though; astronomers worry Starlink could interfere with ground-based observations and other satellites.

SpaceX sent the first batch of Starlink satellites into space earlier this year as a test. Some of the devices failed to operate, but SpaceX has since refined the software. CEO Elon Musk was able to send a tweet via the Starlink network recently, but the system needs at least a few hundred satellites for what SpaceX calls “moderate” coverage. The latest launch of 60 satellites gets SpaceX on the way there. 

Unlike other satellite internet systems, Starlink aims to provide very low latency connections. Many of the satellites will remain in very-low Earth orbit (VLEO) to improve connectivity. SpaceX has promised latency as low as 15ms, which would be very impressive if true. 
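The 15ms figure is at least physically plausible. A rough back-of-the-envelope calculation of the light travel time up to a satellite and back down, for two assumed shell altitudes (the 340km and 550km figures are our assumptions, not from SpaceX), leaves room for processing, routing, and the longer slant path to a satellite that isn’t directly overhead:

```python
# Rough sketch: minimum bent-pipe round trip (user -> satellite -> ground station
# and back) at assumed shell altitudes. Real latency adds routing and slant range.
C_KM_PER_MS = 299_792.458 / 1000  # speed of light, km per millisecond

def bent_pipe_rtt_ms(altitude_km: float) -> float:
    """Round trip straight up and down through one satellite, in milliseconds."""
    return 4 * altitude_km / C_KM_PER_MS

for altitude in (340, 550):
    print(f"{altitude} km shell: ~{bent_pipe_rtt_ms(altitude):.1f} ms minimum round trip")
# 340 km shell: ~4.5 ms minimum round trip
# 550 km shell: ~7.3 ms minimum round trip
```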

Initially, SpaceX only planned to launch a few thousand satellites to power Starlink, but that number has since ballooned to at least 30,000. The company hopes to have 2,000 in orbit by the end of the year. Astronomers worry that the high reflectivity of thousands of satellites will throw off observations of distant objects. Musk says the team is looking at ways to reduce the albedo (or reflectivity) of the satellites in future launches. With thousands of satellites in the constellation, there is also concern they could collide with other satellites.

The Falcon 9 that delivered the second batch of Starlink satellites to space also set a record for reusability. This was the booster’s fourth launch, having landed after three previous missions. While SpaceX managed to land the rocket a fourth time, it’s unclear if it will get another mission. This was also the first mission to use refurbished fairings — those are the aerodynamic shells that cover the payload on the second stage. SpaceX just started collecting those for reuse recently. 

SpaceX isn’t the only company planning a large satellite constellation, but its unparalleled rocket reusability should help it realize the strategy on the cheap.

Now read:

November 12th 2019, 8:31 am

How to Choose a Cloud Service for Your Mobile Photography

ExtremeTech

Now that most people — even those who are serious about photography — are using smartphones for many or most of their photos, it’s a good time to look past the quality of those images to a robust workflow that includes not just editing, but organizing and storing them reliably. Stories of laptops getting stolen or destroyed and taking all of a person’s photos with them are quickly being replaced by sad tales of lost or crushed phones. Organizing and storage solutions span the gamut from simple to complex, with a variety of pros and cons. We’ll take you through some of the options and the tradeoffs.

Google Photos: The Roach Motel of Cloud Photo Storage

Perhaps the most popular solution to both processing and storing smartphone images is Google Photos. It can automatically sync images from your phone to Google’s servers, where you can get at them from anywhere. This useful capability has some issues, though. First, unless you pay for storage or have a newish Pixel, your images aren’t stored at their full resolution. If you think of Google Photos as your lifetime archive of images, that is a high price to pay.

Second, it is hard to control where your images actually live. Google is forever offering to remove them from your phone (only downloading them as needed for viewing), leaving you completely dependent on the Googleplex. Third, it has become a roach motel. It is difficult to download images, or even to guarantee an album is available offline. And the really useful feature that used to allow you to view your images under Google Drive — which you could then sync to a local server or NAS using WebDAV — was unceremoniously dumped. I’ve been unable to find a replacement approach to keep an archive of my Google Photos images, and Google hasn’t been able to suggest any.

Google desperately wants to be your one-stop-shop for your mobile photography. It offers a lot for free, but with strings.

So, if convenience is your primary value, then Google Photos provides it. But it shouldn’t be the only place anyone serious about their photo library keeps it stored. So let’s look at some other options.

Adobe Creative Cloud Is Very Cool, If You Can Afford It

Adobe offers a storage solution that operates a lot like I wish Google Photos would. Once you install Lightroom Mobile on your phone(s), it can automatically sync those photos to your Adobe Cloud, and then, in turn, down to your desktop copy of Lightroom. And you can even do the reverse, so you can work on your images (possibly using smart previews) on your mobile device, even if they were originally on your desktop. By default, only photos you take with Lightroom’s camera or add to Lightroom Mobile are synced, but you can enable an option to add all your phone’s photos.

Adobe provides a powerful web interface for your mobile images and those you have chosen to sync from your desktop

Other than the overhead of setting it up, the biggest drawback to using Adobe for all your photos is cost. First, you need an Adobe Photography plan for about $10 per month. Then you’re on the hook for about $100 per terabyte per year in storage fees. If you’re just using this system for your mobile photos, the 20GB you get “for free” with your plan may be enough, at least for now. But if you want a unified system for all your photos, and their processed versions, you’re going to have to pay up.

Microsoft OneDrive Can Pick Up Some of the Slack

Until a colleague suggested it, I never thought about using OneDrive as a replacement for Google Photos as a place to store my images after Google canceled Drive integration. But since Office 365 users get a terabyte of storage with their plan, and OneDrive offers automatic photo uploading from your mobile device, it seemed like a good fit. I quickly enabled it and have been very pleased that images sync automatically to OneDrive.

From there, I can run Cloud Sync on a Synology NAS over WebDAV to bring a copy of all the images down. (I’m using a 5-bay Synology 1019+ for this article, but if you want to process images directly from your NAS, a 10Gbps model can provide better performance.) Having a local copy isn’t important to me just for archival purposes. It also makes it so I can catalog all my images and work on them in local photo editing tools, without having to look one place for my DSLR and Drone photos and a different place for my phone photos.

The only downside I’ve found with using OneDrive this way — aside from needing an Office subscription — is that the images aren’t as nicely organized as with Adobe or Google. They are just thrown into one (large and growing) folder. As long as you have a folder-agnostic cataloging system like Lightroom, that isn’t too painful, but if you’re looking to browse images by folder, it’s quickly hopeless. Whichever mobile app you use for syncing your images, remember to ensure it will find all your photos. If you only use your phone’s default camera app, it should work automatically. But if you also capture Raw files or use a third-party camera application, you might need to manually add the folder or folders where those images are stored.
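If that single giant folder bothers you, a small local script can impose some order on a synced copy. The sketch below is a workaround of my own devising, not a feature of OneDrive or Synology: it sorts files into Year/Month subfolders by modification time, and both paths are placeholders you’d swap for your own. (Reading true capture dates from EXIF would be more accurate but needs a third-party library.)

```python
# Hedged sketch: sort a locally synced copy of the OneDrive camera-roll folder
# into Year/Month subfolders by file modification time. Paths are placeholders.
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path("/volume1/photos/onedrive_camera_roll")   # hypothetical synced folder
DEST = Path("/volume1/photos/by_date")

def organize(source: Path, dest: Path) -> None:
    for item in source.iterdir():
        if not item.is_file():
            continue
        taken = datetime.fromtimestamp(item.stat().st_mtime)
        target_dir = dest / f"{taken:%Y}" / f"{taken:%m}"
        target_dir.mkdir(parents=True, exist_ok=True)
        if not (target_dir / item.name).exists():         # don't clobber existing copies
            shutil.copy2(item, target_dir / item.name)     # copy2 preserves timestamps

if __name__ == "__main__":
    organize(SOURCE, DEST)
```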

Commercial Photo Sharing Services Like SmugMug Are Also an Option

Adobe and Microsoft aren’t alone in offering a way to automatically upload your mobile images, of course. It’s become a popular feature for other photo-sharing services, but not all of them offer good tools for downloading or syncing your library once it is on their service. One that I’ve used with success is SmugMug. The company offers a free app that allows its users ($5.99 per month and up) to automatically upload their mobile images to a gallery of their choice. You can download galleries or even sync them with Lightroom via SmugMug’s plugin.

Creating Your Own Photo Cloud Using a Synology NAS and Moments

There are a number of companies that offer you the capability to create your own photo storage cloud. For this article, I chose to use a Synology 1019+ 5-bay NAS as a reasonably priced hardware option with solid features and a 5th bay for additional expansion room. Plus, it comes with Synology’s Moments application, a modern solution for storing, organizing, and if you want, sharing your mobile photos.

You can use Moments for your existing photos as well — by moving them into the Moments folder tree — or sync the contents of your Moments folders over to the location on your NAS where the rest of your images are stored. For writing this article, I’ve chosen the latter approach for simplicity. Synology also offers a Photo Station package, which is a more traditional photo organizing system but doesn’t have the same support for mobile devices.

Moments has an Auto-enhance feature similar to Google’s Assistant that picks out what it considers some of your best images and proposes an automated color enhancement.

Organizing Your Images With Synology Moments

Like many photo organizers, Synology has jumped on the AI-powered scene, object, and people recognition bandwagon. It doesn’t have the R&D horsepower of a Google or Adobe to deliver the very best results, but Moments does a competent job of automatically tagging people, subjects, and places. In addition, you can enable a capability that scans your images, picks out what it thinks are your best, and suggests an automatic color enhancement.

Synology’s Moments automatically tags people, subjects, and places in your images.

I don’t find the results as impressive as Google Photos’ Assistant, but they can definitely bring some life to a scene without requiring any work. Moments can also find groups of similar images, so you can choose to delete some to save space. For me, disk drives are inexpensive enough that I’d rather add more storage than spend my time deleting similar images, but the option is there if you want it.

Tips for Setting Up Your Server

Whether you use a dedicated NAS like a Synology, QNAP, or Netgear, or add a drive array to a Windows or Mac server, there are some common considerations. First, I’ve grown to like the flexibility of 5-bay NAS units, compared with the more traditional 4-bay units. I can run a 3-drive RAID for my main data store, and still have two drives for a mirrored pair for storing our surveillance camera footage, for example. Second, make sure that your array can support large drives. This is especially true if you also want to use it to back up your desktop and laptop drives.

We tested the 1019+ with Seagate IronWolf and IronWolf Pro drives of various sizes, including 14TB and the new 16TB models. All worked well, offering the potential for massive storage. For example, three 16TB hard drives in a RAID-5 or Hybrid RAID configuration would provide almost 32TB of usable storage. Speaking of drive sizes, if you use one of the newer RAID options, like Synology’s Hybrid RAID (SHR), you can enlarge your array easily by adding more or larger drives. That’s in contrast to traditional RAID formats, which often require serious work to perform an expansion.
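
For anyone doing the back-of-the-envelope math on array sizes, here is a simplified sketch of the usual capacity formulas. It ignores filesystem overhead and the terabyte-vs-tebibyte gap, which is why real usable space comes in a bit under these figures.

    def raid5_usable(drive_tb, count):
        # RAID-5 reserves one drive's worth of capacity for parity.
        return drive_tb * (count - 1)

    def mirror_usable(drive_tb):
        # A mirrored pair stores one full copy of the data.
        return drive_tb

    print(raid5_usable(16, 3))   # 32 TB usable from three 16TB drives, before formatting overhead
    print(raid5_usable(14, 5))   # 56 TB from a full 5-bay loadout of 14TB drives
    print(mirror_usable(16))     # 16 TB from a mirrored pair for surveillance footage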

Picking a Strategy That’s Right for You

If you’ve gotten used to the simplicity that comes with using your smartphone and relying on a cloud connection to Google Photos to manage your images, then all of these options might seem like a lot of extra work. But for many of us, our photographs are a lifelong asset, and in some cases may be enjoyed by generations to come. So it is worth thinking about what your photos and videos are worth to you before something happens to them and it’s too late.

Now Read:

November 12th 2019, 7:46 am

Google Reveals Stadia Launch Lineup of 12 Games

ExtremeTech

Rumors began circulating about Google’s game streaming service more than a year ago, and the company began actively testing its technology publicly with Project Stream in late 2018. Project Stream became Stadia earlier this year, and it’s finally set to launch. Google has been talking about games that will come to Stadia at some point, but now we have the full launch lineup. It’ll be just 12 games.

Stadia is similar to GeForce Now and Microsoft’s upcoming xCloud service. Instead of downloading a game or buying a physical copy, Stadia renders the games on a Google server and streams the video down to your devices. Companies have been trying to figure this out for almost a decade, ever since OnLive began offering cloud gaming services in 2010. 

Internet access has gotten faster, but many of the fundamental issues with game streaming remain. Google might have a shot at addressing them, though. Google has massive server infrastructure, allowing it to provide low-latency streams to more locations. It also designed a custom Stadia controller that connects directly to the internet instead of bouncing through a local wireless connection like Bluetooth. Stadia can also stream games to a TV through a Chromecast, and many households already have one of those plugged in.

Even if Stadia works perfectly, it won’t matter if it lacks content. The initial launch lineup has a little of everything, but the emphasis is on little. Here’s the list of games you’ll be able to buy on November 19th. 

Google has, of course, announced other games for Stadia. Anything previously announced like Darksiders Genesis and Borderlands 3 will come later. Google promises the latter will launch on Stadia in 2019 along with more titles like Rage 2, Grid, and Metro Exodus. 

Stadia launches on November 19th exclusively for players who ordered the Founder’s Edition starter kit. That comes with three months of Stadia Pro ($10 per month after), a limited edition controller, a Chromecast Ultra, and a copy of Destiny 2. The base version of Stadia, which lacks 4K support, will be available early next year. That one doesn’t include a monthly fee, but you still have to pay for the games.

Now read:

November 11th 2019, 5:05 pm

ET Veterans Day Deals: Netgear Nighthawk R6700 Just $68, Apple AirPods with Wireless Charging Case $

ExtremeTech

Netgear’s Nighthawk R6700 router is one of the world’s most popular wireless solutions thanks to its solid performance and affordable price. Today you can get this already well-priced Wi-Fi router with a discount that makes it an even better deal.

Netgear Nighthawk R6700 802.11ac AC1750 Smart WiFi Router ($68.00)

The Nighthawk R6700 is one of the most popular Wi-Fi routers on the market. It offers reliable performance with speeds of up to 1,750Mbps across two bands. It also has built-in USB ports for adding network resources. Right now with a clickable coupon, you can get it from Amazon marked down from $84.35 to $68.00.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 11th 2019, 3:33 pm

GlassWire Lets You Secure Up to 10 PCs for Less Than $10/year

ExtremeTech

No VPN, antivirus, or anti-malware is infallible, as NordVPN recently confirmed. When even one of the most trusted, secure cybersecurity solutions on the market can’t be fully depended upon, it’s important to keep a more active view of your PC’s health.

GlassWire is a comprehensive antivirus that actively monitors up to 10 PCs with a single account, giving you a real-time analysis of what’s going on with all of your machines. This all-encompassing solution monitors your network, tracks host changes, erects a secure firewall, and displays all relevant metrics and information in a single, central hub so you’ll always have ultimate control over your PC’s security. It visualizes current and past networking activity, reveals hosts that are known threats, surfaces network activity while you were away or logged out of your computer, monitors remote servers where you host websites, apps, or games, and much more. Better yet, it helps keep you under your bandwidth cap by alerting you to possible data overages.

Find out why PCMag wrote glowingly about GlassWire. A 3-year Elite subscription typically costs $297 but you can sign up today for just $29.99.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 11th 2019, 2:50 pm

NASA Unveils Its First Experimental Electric Airplane

ExtremeTech

The American X-plane series has a long and storied history stretching all the way back to the Bell X-1 that made supersonic flight a reality. NASA, the Air Force, and other parts of the government have used X-planes to explore the flight mechanics of vertical takeoff and landing (VTOL), movable wings, and much more. Now, NASA is working on the first manned X-plane in decades, the all-electric X-57 Maxwell. 

NASA started working on the X-57 in 2015, but it’s not building its electric plane from the ground up. The team started with a Tecnam P2006T twin-engine propeller plane, which it is modifying in stages. NASA hasn’t flown the aircraft yet, but it has deemed the X-57 ready for its public debut. The press was allowed to view the X-57 last week at NASA’s Armstrong Flight Research Center at Edwards Air Force Base.

The X-57 is currently in its “Mod II” configuration, which is the first to feature entirely electric flight hardware. The plane has electric cruise motors in place of the two combustion engines of the original aircraft. Mod III and IV will complete the X-57’s transformation from a noisy combustion plane to a quieter, more efficient electric one.

Like an electric car, the X-57 will carry a bank of lithium-ion batteries to power its motors. NASA chose to use a small propeller plane for the project because the aerodynamics are more favorable for an electric aircraft that also uses propellers. NASA doesn’t expect an electric plane like the X-57 to be as fast or have as much range as a jet, but it could be ideal for short-haul flights. 

The new X-57 wing in testing at Armstrong Flight Research Center.

NASA is currently testing the Mod III wing, which features a high-aspect-ratio design and space for additional electric propellers. With the Mod III upgrade, the current pair of electric motors will move to the wingtips. In its final configuration, each wing will have six small propellers along the leading edge along with a larger propeller at the tip. The plane will use all the motors during takeoff, but it only needs the larger ones during cruise. 

NASA hopes the X-57 project can contribute important advances to the aviation industry, but it’ll have to take to the skies before that can happen. The agency currently hopes to fly the X-57 in its final configuration later in 2020. 

Now read:

November 11th 2019, 1:32 pm

Biometric Big Brother: Streaming Services Want Thumbprint Verification for Access

ExtremeTech

The streaming industry believes it has a serious problem: Password sharing. The solution? Various draconian measures meant to ensure that the only people watching a stream are people who have paid for the privilege, up to and including mandatory biometric authentication.

Imagine Microsoft announcing that in order to prevent Windows piracy, Windows devices would now keep a log of which users had paid part of a yearly fee to access data on that particular PC or to use the operating system. Imagine if Adobe, Apple, Blizzard, Epic, Valve, or any other software ecosystem operator declared that in order to prove you were legally allowed to access data you had paid to access, you had to provide them with biometric authentication data.

The idea is to start by making it annoying for people to share passwords by requiring the use of secondary authentication methods, according to Bloomberg — periodic password changes, or using 2FA to send authentication codes to phones. The goal is to create rules that govern which devices can see streams, allowing a smartphone or tablet to view content, but, say, blocking a Roku device in-use at a second location. Of course, users might just bypass that by streaming from a phone or tablet to a TV, or finding some other way to bypass the restrictions our content overlords wish to deploy. In order to prevent that from happening, the streaming industry is willing to go nuclear.

“If none of those tactics work,” Bloomberg writes, “pay-TV subscribers could someday be required to sign into their accounts using their thumbprints.”

In the past, this kind of biometric authentication system wouldn’t even have been possible. Biometric systems have existed for years, but there was no practical way for a service provider like Microsoft, Netflix, or Hulu to demand you use one. Now that fingerprint readers have been distributed to billions of people, companies are looking for ways to capitalize on them.

This isn’t just an annoying trend. Allowing corporations to require biometrics in this fashion is exceedingly dangerous. It’s true that, in theory at least, biometric security systems could be more secure than password-based authentication could ever be. The problem is twofold: 1) biometric systems usually aren’t nearly as good as the companies deploying them like to claim, and 2) once anyone steals your biometric data, they’ve got it forever. We base biometrics on unchanging elements of ourselves. You can change your password. You can’t change your thumbprint. Having biometric data isn’t the same thing as being able to penetrate a biometrically locked account — that depends on the attack vector — but it could certainly simplify the process.

History further suggests that hackers will find ways to exploit weaknesses in biometric authentication practices in order to attack accounts and steal data. There was a time when turning on 2FA and using SMS to lock down an account was widely recommended. Now, articles counsel against the practice in favor of a device like a YubiKey. If biometric data is widely stored and used by companies for authentication, some of those companies will not secure the data properly. Some of that data will be leaked or stolen. It may be used by nation-states who are targeting specific individuals and attempting to hack hardened systems or it may be used by hackers carrying out broad attacks against large groups, but it will be used.

What justification is given for throwing open the floodgates and demanding mandatory authentication of a sort once reserved for criminals and those seeking security clearances, regardless of the further long-term damage to citizen privacy and the ability of individuals to establish their own personal identities? About $15 per month.

“I feel like I’m beating my head against the wall,” Tom Rutledge, the chief executive officer of Charter Communications Inc., said during an earnings call last month. “It’s just too easy to get the product without paying for it.”

Dealing with losses caused by theft of services or products is a fact of being in business. If you manufacture something anyone wants to buy, someone is going to steal it. If you produce valuable intellectual property, someone is going to steal it. Millions of people stole Game of Thrones and watched it without paying HBO a dime. Is it moral or ethical to take things you haven’t paid for? No. Is it moral or ethical to deploy this kind of biometric authentication requirement across devices, running a significant long-term risk of data theft? Also no.

I would personally argue that the latter represents a vastly larger practical harm than the former, particularly given how we know large corporations treat user data and how well they secure it. At a certain point, we have to recognize that the only way to safeguard information is to limit the ability of companies to collect it in the first place.

According to Bloomberg, the various companies contemplating these schemes are aware they might prove extraordinarily unpopular with customers. Nonetheless, Netflix, Amazon, Disney, Viacom, AT&T, HBO, Comcast, and Charter are all members of the Alliance for Creativity and Entertainment, which has announced it is focusing on password sharing as a means of reducing piracy. Companies like Netflix, which has generally been tolerant of password sharing, have announced they intend to start looking for consumer-friendly ways to “push on the edges of that.”

Now Read:

Galaxy S10 Fingerprint Sensor Reportedly Thwarted By Cheap Screen Protector
Judge: Police Can’t Force You to Unlock Phone With Fingerprint or Face ID
Researchers Create ‘Master Fingerprints’ to Unlock Phones

November 11th 2019, 11:32 am

Skylum Doubles Down on AI Image Enhancement in Luminar 4 Photo Editor

ExtremeTech

We’ve written a lot about how much Adobe is now relying on the power of its Sensei AI platform, but in most cases Adobe is using it for tagging, selecting, and other tasks that help accelerate creative workflows, and not for pure image enhancement. Skylum, makers of the Luminar image editor and Aurora HDR processing tool, have in contrast gone all-in on AI-powered image enhancement. This is particularly clear in Luminar 4 ($89, available for pre-order via a special offer). I’ve been using a pre-release version for several weeks now, side by side with tools from Adobe and others, and can report that it provides an intriguing option for those looking to get results quickly without giving up the power of a full image editor.

Luminar Isn’t Photoshop or Lightroom, It’s Some of Each

Luminar fits in an interesting space somewhere between Photoshop and Lightroom. It has a non-destructive, slider-based set of tools that work on a variety of image formats, like Lightroom. But it also has support for Layers, like Photoshop. However, you can’t go wild adding graphics and text to your image, or creating content from scratch as you can in Photoshop. And while it does have a Library module, it is not much more than a file browser with an option to create collections of images called Albums. So you can’t do all the powerful tagging and searching that you can with a Lightroom catalog (then again, you also don’t need to worry about maintaining one).

Once you’re used to adding folders to your Library, the folder system works pretty well. However, one thing that drove me nuts about the Library module is that there doesn’t seem to be any way to put basic information about each photo on its thumbnail in the Library. I get why a pretty view of your images is a lot of fun, but if you need to do serious work you often want to see the filename, date, or other key data while you are browsing. You can put information in a sidebar, but as far as I can tell it is only displayed once you click on an image.

AI Image Enhancement You Can Control

For those familiar with the nagging prompts from Google Photos suggesting semi-magical automatic enhancement of your images, the concept of AI-driven image enhancement isn’t new. But features like Google’s are black boxes and very hit-or-miss about whether they will work for a specific image, or indeed whether Google’s algorithmic vision for the image matches yours. Luminar 4 uses AI to provide the underlying framework that lets you apply and customize a wide variety of enhancements, and even use contributed presets that it calls Looks.

The flagship enhancement is called, simply enough, “AI Image Enhancer.” Using it on a variety of images, I found that it does an excellent job of making images more pleasing. Until now, I’ve found that DxO’s PhotoLab had the best automated processing for one-click image enhancement, but Luminar 4 definitely provides a competitive alternative. In addition to some hard-to-argue-with standard improvements, the AI Image Enhancer also tends to make colors richer and scenes warmer. That is a great starting point, but not for everyone or every image. It is easy to dial the effect back or click through some of the dozens of other Looks that are provided with Luminar 4.

Luminar 4’s AI Image Enhancer did in a few seconds what would have taken me a few minutes in Photoshop.

Looks are organized into groups, including Essentials, Landscapes, Street, Portrait, Lifestyle, Dramatic, Aerial, User-defined, and downloaded. Flipping through them reminds me a bit of using an HDR program on a set of bracketed images. There is usually one that looks pretty good. But if it isn’t quite what you want, you can use the editing power of Luminar to tweak it to your heart’s content. You can change the slider settings on typical image adjustments, or even add additional layers, with many of the same capabilities as you’d find in Photoshop.

I found that the Autumn Colors preset did a nice job of warming up images taken under harsh light, like this one of elephants at a watering hole in southern Botswana.

In addition to a wide variety of typical image editing tools, there are also specific tools for AI Accent, AI Sky Enhancer, and AI Structure. Now, the buzzword AI is being applied to everything, so it’s not always clear in what way each of these tools uses carefully trained neural networks or other technologies that fall under the AI rubric. But, of course, it doesn’t really matter as long as the results are what you want. In my testing, I found the AI-powered filters did a surprisingly good job of creating more pleasing versions of the images I fed them. Like with many image enhancement tools, it’s easy to overuse them and create images that are gorgeous but give themselves away as being better-than-real, so moderation is called for.

AI Sky Replacement

One gripe common to anyone who photographs outdoors is that gorgeous skies often don’t show up when you want them to. Compositing an image taken from a specific place with another of the sky from the same place on a different day is something of a time-honored tradition (although of course, the result is no longer a true photograph.) In any case, the idea of automating the process is intriguing.

A screenshot of my original image from the North Rim of the Grand Canyon with a plain blue sky

I was able to replace the solid blue sky of the image with one of the preset versions provided by Skylum.

Unfortunately, the first release of Luminar’s Sky Replacement only used Skylum’s preset skies. In my mind, that crosses the line from photography into graphic art. I was pleased to find that the company has now enabled the ability to use your own sky images. There is a bit of a trick to it, though. It isn’t as simple as taking a second image of the same scene and letting Luminar do the heavy lifting. You need to deliberately shoot images composed of just the sky for the replacement to work (or crop an existing image to just the sky). That’s not the end of the world, but aiming up at the sky isn’t always automatic, and doesn’t always give you the perspective you want. So creating custom skies takes a little getting used to.

To use your own sky image you need to provide images that are entirely sky, not just images similar to your original that have a different sky. As an experiment, I used a sky from a sunset over Lake Michigan.

Is Luminar 4 the Image Editor for You?

My biggest gripe with Luminar 4 is that the company seems to have paused development of its cataloging system in favor of concentrating its efforts on image enhancement tools. So if you’re looking for something to replace Lightroom for cataloging your images, you’ll probably find the Library module of Luminar too limited. If you wind up keeping Lightroom as your cataloging system, but still want to take advantage of Luminar’s features, the company provides a plug-in for both Photoshop and Lightroom Classic, which is installed automatically when you install the main product.

Luminar 4 is available for pre-order prior to when it ships on Nov. 18 for $89 in a special offer that includes a variety of Look presets and a 1-year SmugMug membership.

[Image credit: David Cardinal]

Now Read:

November 11th 2019, 9:59 am

Enthusiast Claims His Power Plan Boosts AMD Ryzen Performance. We Investigate.

ExtremeTech

An enthusiast is claiming that his own homemade power efficiency algorithm can dramatically improve performance on AMD Ryzen CPUs, over and above the default algorithms AMD has itself distributed.

The idea of unlocking hidden performance with a quick software mod is intrinsically attractive. Moreover, the enthusiast side of the PC industry has a long history of uncovering useful tricks and tips like this. From third-party utilities that offer access to various GPU features like Nvidia Inspector, to the old third-party utility Rain that enthusiasts could use to lower CPU temperatures and reduce power consumption, ordinary PC users have often created projects that extended or improved the capabilities of various products.

The 1usmus Power Plan

Ryzen enthusiast 1usmus has written an article at TechPowerUp claiming to improve the performance of AMD’s CPUs through a little manual adjustment to the company’s power plans.

When I reached out to AMD, however, the company representatives we spoke to were uncertain this type of manipulation could pay dividends. AMD’s perspective is similar to my own. I’ve taken a few cracks at trying to boost Ryzen performance through UEFI adjustments and have generally found the effort not to be worth it. Overclocking headroom on Ryzen is very limited and the CPU has generally seemed to perform best if left to its own devices.

But the entire point of a site like this is to explore what kinds of performance might be available at the cutting edge. I fired up the Ryzen 9 3900X on our MSI X570 Godlike with the 7C34v16 UEFI, a fresh, fully patched installation of Windows 10 1903, the latest AMD drivers (1.9.27.1033), and a Thermaltake Floe Riing RGB 360 cooler. While this cooler is from 2017, it’s a large radiator design with space for three 120mm fans. We’ve relied on it for multiple generations of Threadripper and Intel HEDT testing, and it can stand up to the 12-core Ryzen 9 3900X with no problem. Maximum reported CPU temperature from the 12-core was 67C in Ryzen Master under sustained load during the Blender 1.0 Beta 2 benchmark.

The Windows 10 scheduler has some known shortcomings with regard to how it treats CPUs with multiple NUMA nodes (we’ve discussed some of these in relation to the 2990WX, which is severely affected in some workloads). 1usmus spends some time cataloging various issues with the Windows 10 scheduler, some of which he believes his own power plan can solve.

1usmus claims:

My approach to solve this deficit in the Windows Scheduler is to use a customized power profile that provides better guidance to the scheduler on how to distribute loads among cores. This should put load on the best cores (which clock higher) and result in higher and smoother FPS because workloads will no longer bounce between cores.

This should be a fairly straightforward set of improvements to test. He claims gains of up to 200MHz in highest observed CPU frequency. He claims an improvement of ~1.03X in Cinebench R15 multi-threaded based on actual benchmark results and writes:

Whilst AMD tried to address deficiencies in Precision Boost behavior by dialing up maximum boost frequencies by roughly 50-100 MHz with AGESA 1.0.0.3 ABBA, this modification has managed to increase maximum observed core frequencies by 200 MHz on average, which should result in higher performance across the board, especially for less-parallelized workloads.

The Test

1usmus writes that the following settings must be changed in order for his power plan to work properly:

We performed these modifications and fired up the test platform. They definitely change the behavior of the CPU, but it’s not clear they do so in a helpful manner.

Under AMD’s default power plan, most CPU cores remain awake while the chip is running a workload like Cinebench, though they remain at low clocks.

Default core behavior, Ryzen 9 3900X.

Under 1usmus’ plan, these cores mostly sleep. There’s a definite trade-off here between system responsiveness and lowest-possible power state. Putting 10-11 of the other CPU cores to sleep during a workload does allow for full power to be diverted to a single CPU core. We do, in fact, see some evidence that this allows the CPU to hit very slightly higher boost clocks in single-threaded workloads.

The 1usmus power plan. Note the number of cores asleep.

In Cinebench R20, for example, Core 05 of our 3900X tends to run at ~4.46GHz when using AMD’s Ryzen Balanced Plan. Under 1usmus’ plan, that CPU core tends to run very slightly higher, at around 4.53GHz. This represents roughly 1.5 percent more clock speed than previously available.

The moment of handoff between C04 and C05. C05 is still the preferred core for ST workloads even under the 1usmus plan.

1usmus also states that his power plan results in workloads being matched more frequently to the highest-overclocking core as identified by Ryzen Master. We saw no evidence of this. Windows 10 clearly prefers running workloads on our Ryzen’s C05 core (as identified in Ryzen Master). If anything, it preferred this core more strongly when running 1usmus’ power plan than the AMD default.

Cinebench R15 multi-threaded performance did not improve, and single-threaded performance increased by 1.5 percent at most. The maximum recorded clock speed increased from 4.49GHz to 4.6GHz, but the actual performance improvement was a fraction of this. Single-threaded performance improved from a score of 209 to 212.

Cinebench R20 multi-threaded performance did not improve at all. Single-threaded performance improved by roughly 3 percent, from 512 to 526.

Finally, we ran the Barbershop_Interior benchmark from the Blender 1.0 Beta 2 benchmark. This is a sustained all-core test that gives the CPU more time to show the impact of any performance improvements or subtle clock changes. Barbershop_Interior render times improved from 13.35 minutes to 13.26 minutes, a gain of well under 1 percent, indistinguishable from random error and noise.
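
For transparency, here is the simple percent-change arithmetic behind the figures above, recomputed from the numbers reported in this article (Blender times are treated as decimal minutes):

    def pct_change(before, after):
        return (after - before) / before * 100

    results = {
        "Preferred-core clock, R20 (GHz)": (4.46, 4.53),
        "Cinebench R15 ST (score)":        (209, 212),
        "Cinebench R20 ST (score)":        (512, 526),
        "Blender Barbershop (minutes)":    (13.35, 13.26),
    }

    for name, (default_plan, usmus_plan) in results.items():
        print(f"{name}: {pct_change(default_plan, usmus_plan):+.1f}%")

    # Preferred-core clock: +1.6%
    # Cinebench R15 ST:     +1.4%
    # Cinebench R20 ST:     +2.7%
    # Blender render time:  -0.7% (lower is better, so under 1 percent faster)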

Results

1usmus’ power plan does not appear to increase performance in any way we can separate from a margin of error, though there may be a very small (1.5 percent – 3 percent) improvement in single-threaded tests. I typically assume a 2.5 – 3 percent margin of error for any benchmark, however, and the best ST gains we measured were in 3 percent territory. There’s no sign of any improved core parking behavior; C05 is the preferred core for single-threaded workloads over time, and the 1usmus plan does nothing to change this. It appears that whatever performance the power plan secured comes from shutting down the rest of the chip more aggressively. It takes longer to wake cores from deep sleep, so there’ll be trade-offs in any case.

The truth is, as power management and schedulers both continue to become more complex, the chances that an end user will stumble on a combination of settings that magically boosts performance are low. The 1usmus plan does hit superficially faster clocks, but it doesn’t translate them into sustained performance improvements.

ExtremeTech’s guidance is to stick with AMD defaults in all regards as it relates to power management and CPU clocking. I’ve yet to test any combination of motherboard settings that delivers consistently higher performance that didn’t amount to just overclocking the CPU — and even then, frankly, I haven’t seen OC results impressive enough to justify the effort on top-notch chips. AMD has gotten very good at squeezing out all available performance from its silicon.

Now Read:

November 11th 2019, 9:28 am

At a Glance: Netgear XRM570 Nighthawk Pro Gaming WiFi Router and Mesh WiFi System Review

ExtremeTech

Setting up a home Wi-Fi network can be a quick and easy experience, but unfortunately, sometimes it just isn’t that simple. You may encounter dead zones in your home that aren’t getting good reception due to interference or your signal simply not reaching far enough.

Netgear’s XRM570 Nighthawk Pro Gaming WiFi router is a new high-end networking system that tackles this problem. The router comes with an EX7700 range extender, which allows you to set up a mesh networking environment to extend your network coverage over a larger area.

Specs & Design

The main unit in this networking system is the Netgear XR500. Don’t get confused; this review is discussing Netgear’s XRM570 mesh networking system, but the XRM570 comprises two existing Netgear products, including the XR500. Technically, if you already own an XR500 router, you could buy the EX7700 range extender and set up the exact same network.

The XR500 router itself is fairly powerful with support for a dual-band 4×4 802.11ac Wi-Fi network. The router also has a dual-core 1.7GHz processor and 512MB of RAM to help it process Internet traffic and manage network resources.

The Netgear EX7700 is also a relatively high-end networking solution. Its peak bandwidth is somewhat lower than the XR500’s, but it can operate on three network bands simultaneously. Using both devices together, you can cover an area of up to 5,000 square feet.

Benchmarks

Our sister site PCMag tested one of these networking systems against competing solutions from Nokia, Mercku, and TP-Link. Starting things off with a 5GHz throughput test, the Netgear XRM570 returned strong performance results. At a distance of 30 feet, it beat out all of the other tested products by a wide margin.

The Netgear XRM570 also led the pack in the next series of tests. It was almost tied by the TP-Link Deco M9 Plus when tested at 30 feet, but at close proximity, none of the competition stood a chance.

Conclusion

Currently, you can get Netgear’s XRM570 system from Amazon for $374.19, which is down slightly from its MSRP of $399.99. This networking solution appears to offer excellent performance based on the test results, and it can also cover a wide area, which helps to eliminate dead zones around your home.

In general, I would recommend it as an excellent networking solution, but it should also be noted that routers that support the new 802.11ax Wi-Fi standard are already available on the market. You may not have a computer that supports 802.11ax yet, but in the near future you might, and it may be better to consider a router that supports that new technology instead.

Now read:

November 11th 2019, 8:58 am

Dell’s Alienware Laptop GPU Upgrades Are a Conceptually Great Idea

ExtremeTech

Dell is now shipping GPU upgrades for the Alienware Area-51m that it launched earlier this year, making good on its promise to provide upgrades for the laptop. The general inability to upgrade a laptop’s GPU is one of the most significant intrinsic weaknesses of laptops compared with desktops, at least where gaming is concerned.

While laptops are less suited to gaming than desktops for a number of reasons, most of these can be compensated for, particularly in 2019. Want a large display? Hook up a desktop monitor via DP or HDMI. Want a large built-in display and more room for cooling the CPU? Desktop replacements are your friend. But GPU upgrades, even for DTRs, have been vanishingly few and far between.

The Dell upgrade modules are built in the company’s proprietary DGFF form factor. In this case, a proprietary form factor isn’t really a problem. Laptop GPUs have never conformed to a single board design the way that desktop GPUs have, which is one of the reasons why there’s never been an upgrade market for these parts. There’s no guarantee that two Dell laptop GPUs built for two different laptops will fit in each other’s chassis, even if the edge connectors use the same electrical standard. The price of the service includes professional installation; Dell isn’t willing to leave this process to chance.

The RTX 2070 and 2080 GPUs that the company is offering are intended as upgrades for the RTX 2060 and RTX 2070 models that have previously been sold. Here’s where things go a little off the rails. The RTX 2070 GPU is currently priced at $839, on sale from $1,039, while the RTX 2080 is priced at $1,139, down from $1,639.

That’s rather a lot of money, to put it mildly. The pricing isn’t as nuts as it first looks, however — not if you consider the price of buying a new Alienware Area-51m in the first place. The least expensive Area-51m starts at $1,999, and the roughly $1,100 price of the RTX 2080 upgrade service is actually equivalent to the cost of stepping up to an RTX 2080 when you buy the laptop today. In other words, if you’re an Area-51m owner who bought a 1660 Ti earlier this year and want to step up to the RTX 2080 flavor now, Dell won’t charge you more to perform the upgrade than it would’ve charged you to install it at the factory — at least, not while this introductory pricing lasts.

No matter how crazy you think the price of the GPU is — and compared with the price of the RTX 2070S (RTX 2080-equivalent) it’s pretty crazy — it’s still significantly less than the top-end price of the laptop itself. The Alienware Area-51m’s baseline configurations range from $2,000 to $4,000, with a top-end price of around $4,600. It could theoretically be much less expensive to upgrade that laptop at, say, the three-year mark and keep using it another few years after that as opposed to buying an all-new system. Given that this is a capability being marketed to people buying high-end laptops, we have to evaluate it in those terms.
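
To make that trade-off concrete, here is the rough arithmetic using the prices quoted above. The three-year upgrade timeline is a made-up assumption for illustration, not anything Dell has committed to.

    # Prices quoted in this article
    base_area_51m = 1999        # least expensive Area-51m configuration
    top_area_51m = 4600         # approximate top-end configuration
    rtx_2080_upgrade = 1139     # DGFF RTX 2080 module, introductory price

    # Hypothetical scenario: buy the base laptop now and pay for the GPU upgrade
    # at the three-year mark, versus buying a second, top-end system instead.
    upgrade_path = base_area_51m + rtx_2080_upgrade
    replace_path = base_area_51m + top_area_51m

    print(f"Upgrade path: ${upgrade_path}")        # $3,138 over the laptop's life
    print(f"Replacement path: ${replace_path}")    # $6,599 if you buy a new flagship instead
    print(f"Difference: ${replace_path - upgrade_path}")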

This is Dell’s own image for the “Unprecedented Upgradability” section of its web page. I’m amused because the GPU is the one component not depicted accurately (obviously the chip needs a board mount of some sort). Image by Dell.

The fact that Alienware systems support the Alienware Graphics Amplifier also takes some of the sting out of this type of positioning. It’s possible to hook an external graphics chassis to an Alienware system to tap the power of a desktop GPU, and the adapter is significantly less expensive, at $177. In other words, there are much more affordable ways to add graphics capability for owners who don’t mind moving the horsepower to a location outside the system.

The flip side to all of this is that it’s rather nuts to pay $1,139 for an RTX 2080 if you already own an RTX 2060 or 2070. Frankly, it’d be pretty nuts to pay that much money to upgrade from a GTX 1660 Ti to an RTX 2080. But the first run of GPU upgrades for this hardware family was always going to be the weakest upgrade tier. What matters far more is whether Dell continues to put effort into the program in the first place.

I’m not sure whether that will happen. OEMs aren’t particularly known for investing in their product families long-term, particularly when it comes to creating a market for an entirely new type of product. It could take a few years for a program like this to show benefits — and Dell might not want the program to work particularly well. It’s a vastly better deal for the end customer to buy a new GPU for $1,700 as opposed to a new top-end laptop for $5,000. Is it a better deal for Dell? That’s a different question.

In an ideal world, this would be the first step in rolling out upgrade options for all Alienware laptops, creating a major market differentiation for Dell gaming that other boutiques don’t offer and a meaningful value-add for Alienware boutique gamers. Whether it plays out that way, we’ll have to wait and see.

Now Read:

November 11th 2019, 8:14 am

Vitamin E Found In All Samples of Lung Tissue Taken from Injured Vapers

ExtremeTech

Mysterious lung illnesses have appeared in 49 different states over recent weeks, and authorities quickly linked them to e-cigarette use or “vaping.” The CDC has been urging consumers to steer clear of vaping while it investigated, and there’s been a break in the case. The CDC reports that all samples of damaged lung tissue contained traces of vitamin E acetate, and that’s something that should not be in your lungs. 

Since the CDC began investigating the outbreak of “e-cigarette, or vaping, product use associated lung injury” (EVALI) in August, doctors have reported over 2,000 cases. Authorities now believe EVALI has contributed to several dozen deaths. The agency received bronchoalveolar lavage (BAL) fluid samples from affected patients in 10 different states. It says all those samples contained vitamin E acetate, which is an additive in some e-cigarette products. 

Vitamin E is a group of eight similar compounds that are a necessary part of the human diet. You’re supposed to get between 7 and 15 mg of vitamin E per day, and most people far exceed that. In the body, vitamin E acts as a powerful antioxidant that protects your cells from free radicals. Taking a pill or using a lotion with vitamin E is fine, and no one should stop doing so on account of the recent lung disease outbreak.

As the CDC’s Dr. James Pirkle explained at the latest news conference, vitamin E doesn’t belong in your lungs. It’s “enormously sticky,” and could contribute to the type of damage associated with EVALI. The exact mechanism remains unclear at this time. The researchers are also careful to point out that there could be other compounds at work here. The CDC did look for other theorized contaminants, but detected none. 

Vaping-related lung damage has appeared in every state except Alaska.

The CDC previously reported that most patient samples showed the presence of THC, the active chemical in cannabis. According to updated guidance, 82 percent of the vaping products tested contained THC, while 62 percent contained nicotine. In addition to the general e-cigarette guidance, the CDC says people should not use THC vaping products, particularly those purchased from online sources or obtained from friends. Vitamin E oil may be added to illicitly manufactured THC vape products as a thickening agent. There are legally produced THC vaping products in some states, but better to be safe than sorry. For now, the investigation continues, but the discovery of vitamin E in all lung samples is a significant breakthrough.

Now Read:

November 8th 2019, 7:08 pm

NASA Installs Final RS-25 Space Shuttle Engine on SLS Core

ExtremeTech

NASA is heading back to the moon, and it’s planning to use the long-delayed Space Launch System (SLS) to get there. The agency is working to assemble the first SLS rocket, which will be the most powerful in the world upon completion. Some of that power will come from four RS-25 engines on the core stage. If they look familiar, that’s because the RS-25 has a storied history in NASA’s Space Shuttle program, having first debuted in the 1970s. Now, NASA has just finished installing them on the SLS. 

The SLS will be NASA’s new super heavy-lift launch vehicle, the first it has had since the Saturn V program ended. Unlike SpaceX’s Falcon 9 and Starship platforms, the SLS is an expendable spacecraft. That means NASA needs a new rocket for each launch — only the Orion crew module will return to Earth for refurbishment and reuse. There are reports that each SLS launch could cost $2B, if not more. NASA has not formally addressed these reports, but it hasn’t refuted them, either.  

To get the SLS to distant locales like the Moon, Mars, and Jupiter, it will use a pair of massive solid rocket boosters alongside the four RS-25 liquid fuel engines. In the interests of time and cost savings, NASA is using engines that previously flew as part of Space Shuttle missions. Each of the first four SLS engines has its own backstory. 

The first one, which NASA attached last month, is Engine 2056. That part rode into space with the shuttle Discovery for the first launch following the Columbia disaster (STS-114). It was also on Atlantis for the final mission of the Shuttle program (STS-135). Engine number two is E2045, which NASA used on Atlantis from STS-110 until the spacecraft’s retirement. It was also used on Discovery for STS-121, so the Artemis I launch will be a reunion of sorts. 

The SLS got E2058 installed just a few days ago. This RS-25 engine was a Discovery engine for many flights, starting with STS-116 and ending with Discovery’s last flight (STS-133). The last of the four engines, which NASA just attached to the SLS, is E2060. This is the newest of the engines, having flown only three times on three different Shuttles: Discovery, Endeavour, and Atlantis.

The RS-25 was designed by Aerojet Rocketdyne to be a reusable system. The SLS, however, is not reusable. These engines will end up in the ocean after the Artemis I flight, currently on the books for 2021. NASA expects to have a cheaper non-reusable version of the RS-25 ready in the mid to late-2020s once its supply of old Shuttle engines runs dry.

Now Read:

November 8th 2019, 5:08 pm

Become an Android App Developer with This $29 Course Bundle

ExtremeTech

The mobile environment is always changing and is growing rapidly. Android phones make up a majority of the market share of smartphones around the world, so why not brush up on your Android development skills to make sure you’re ahead of the curve? With the Android Jetpack and App Development Certification Bundle, you’ll be able to enrich your existing skills as well as learn some new ones.

This bundle comes packed with five different courses to go through to develop your skills. Each of those courses has well over 35 lessons, too, so if you think you know it all, think again. The bundle consists of 41 hours’ worth of content in total, covering everything you’ll need to know about Kotlin and Java and how to make a fully functional app that is usable in the real world. In fact, once you’re done going through these lessons, you’ll be able to build multiple working apps similar to some of the larger apps people know and love, like Twitter, Tinder, and more.

The Android Jetpack and App Development Certification Bundle is on sale for $29 right now, saving you 97 percent off the cost to take each course separately. Be sure to purchase while you can and take advantage of a great deal to become an awesome Android developer.

November 8th 2019, 5:08 pm

ET Early Black Friday Deals: Save on Top Noise Cancelling Headphones from Bose, Sony, and Beats

ExtremeTech

As we move through November and Black Friday draws ever closer, we are seeing an increase in the number of deals available. Several items like high-quality noise cancelling headphones from Bose, Sony and Beats are available with large discounts, and you can also save on several computers and computer components.

Featured Deals

Check out more deals from TechBargains.

Amazon Devices

More Amazon Device Deals here.

Apple Devices

More Apple Deals here.

Laptops

More Laptop Deals here.

Desktop Computers

More Desktop PC Deals here.

Monitors

More Monitor Deals here.

Networking, Storage and Components

More Networking, Storage and Component Deals here.

HDTVs & Home Entertainment

More TV Deals here.

Electronics

More Electronics & Tech Deals here.

Headphones, Speakers & Audio

More Headphone and Audio Deals here.

Tools & Home Improvement, Kitchen Gadgets, and more

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 8th 2019, 5:08 pm

OpenAI Releases Fake News Bot It Previously Deemed Too Dangerous

ExtremeTech


In February of this year, the nonprofit artificial intelligence research lab OpenAI announced that its new algorithm, called GPT-2, could write believable fake news in mere seconds. Rather than release the bot to the world, OpenAI deemed it too dangerous for public consumption. The firm spent months opening up pieces of the underlying technology so it could evaluate how they were used. Citing no “strong evidence of misuse,” OpenAI has now made the full GPT-2 model available to all.

OpenAI designed GPT-2 to consume text and produce summaries and translations. However, the researchers became concerned when they fed the algorithm plainly fraudulent statements. GPT-2 could take a kernel of nonsense and build a believable narrative around it, going so far as to invent studies, expert quotes, and even statistics to back up the false information. You can see an example of GPT-2’s text generation abilities below.

You can play around with GPT-2 online on the Talk to Transformer page. The site has already been updated with the full version of GPT-2. Just add some text, and the AI will continue the story. 
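
If you’d rather experiment locally than through the website, the released model can be driven with a few lines of Python. This is a minimal sketch assuming the Hugging Face transformers package (not an OpenAI-provided script), and the prompt is just an example.

    # pip install transformers torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    # "gpt2" is the smallest release; swap in "gpt2-large" or "gpt2-xl" for the bigger models.
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    prompt = "Scientists announced today that"        # example prompt
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    # Sampling keeps the output varied; greedy decoding tends to repeat itself.
    output = model.generate(
        input_ids,
        max_length=100,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )

    print(tokenizer.decode(output[0], skip_special_tokens=True))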

The deluge of fake news was first called out in the wake of the 2016 election, when shady websites run by foreign interests spread misinformation, much of which gained a foothold on Facebook. OpenAI worried that releasing a bot that could pump out fake news in large quantities would be dangerous for society. However, some AI researchers felt the firm was just looking for attention. This technology or something like it would be available eventually, they argued, so why not release the bot so other teams could develop ways to detect its output?

An example of GPT-2 making up facts to support the initial input.

Now here we are nine months later, and you can download the full model. OpenAI says it hopes that researchers can better understand how to spot fake news written by the AI. However, it cautions that its research shows GPT-2 can be tweaked to take extreme ideological positions that could make it even more dangerous. 

OpenAI also says that its testing shows detecting GPT-2 material can be challenging. Its best in-house methods can identify 95 percent of GPT-2 text, which it believes is not high enough for a completely automated process. The worrying thing here is not that GPT-2 can produce fake news, but that it can potentially do it extremely fast and with a particular bias. It takes people time to write things, even if it’s all made up. If GPT-2 is going to be a problem, we’ll probably find out in the upcoming US election cycle.

Now read:

November 8th 2019, 10:00 am

AT&T Switches Customers to More Expensive Plans Without Permission

ExtremeTech

AT&T is never shy about finding new ways to screw with its customers. While this trend applies to all of the carriers in the United States, it’s particularly noteworthy that AT&T is rolling out these new plans in the very same week it paid a $60M fine to the FTC for lying to customers about the nature of supposedly unlimited data plans.

According to a new AT&T document, mobile users with old Mobile Share plans now enjoy 15GB more data than they previously had. Mobile Share Value 30GB plans are now Mobile Share Value 45GB plans. Mobile Share Value 60GB is now Mobile Share Value 75GB, etc, etc. Sounds like a nice bonus, right?

Not so much. According to AT&T: “Enjoy more data. Starting with your October 2019 bill, you’ll get an additional 15GB of data on your Mobile Share plan. This bonus data comes with a $10 price increase.”

At first glance, this might seem to fly in the face of what “bonus” means. Typically, a bonus is understood to be a reward, be it tangible or otherwise, for a job well done. Most of us are familiar with the idea of a bonus as something one earns — something that represents a positive value.

But look at the first definition listed in Dictionary.com, and it all makes sense.

bonus (noun): “something given or paid over and above what is due.”

What did AT&T customers previously pay for their service plans? Between $100 and $225 per month. What do they pay now? $110 – $235 per month. Is this not the very definition of the word bonus? AT&T customers are now paying additional funds, over and above what was previously due.

There is no way to opt out of the increase, and it’s not even the first time AT&T has raised prices by $10 this year; the company targeted another set of plans for increases earlier in the year. Giving people who already have tens of gigabytes of mobile data an additional 15GB costs AT&T essentially nothing, because those customers likely aren’t using the data they already have. These changes hit plans that have been in place since 2013 and aren’t offered to new customers any longer. If you’re still using a mobile data plan from 2013, it’s probably because it suits your needs just fine — which means you didn’t need the extra data in the first place.

I can’t say I’m surprised. AT&T is perfectly happy to lie to its customers about the network they’re actually using. The company has rebranded its LTE service as “5G E” in select cities, despite offering nothing of the sort. Then again, considering the state of 5G, that might actually have been a mistake.

Why is AT&T doing this? Because it can. Because the FTC is willing to sign off on pitiful fines that allow AT&T to keep hundreds of millions of dollars in revenue that it earned by lying to customers about the service it provided them. Because the FCC and DOJ are willing to sign off on mergers that continue to shrink the number of carriers providing service in the US, even though US cellular subscription costs are already far more expensive than any comparable nation on Earth.

Enjoy your bonus.

Now Read:

November 8th 2019, 9:46 am

High Display Brightness Mode Discovered Hidden in Pixel 4

ExtremeTech

Google launched its fourth-generation Pixel phones a few weeks ago, and opinions have been mixed so far. While Google’s photography prowess cannot be ignored, it also chose to put smaller batteries in these phones than you’d find in many competing devices. Some Android aficionados have wondered if Google lowered the display brightness to compensate for the battery, and lo and behold, there’s a high-brightness mode hidden in the settings.

Before release, the Pixel 4 XL found its way to DisplayMate, which tests the OLED screens on many high-end phones. Its report was overall very positive, giving the Pixel 4 XL an A+ rating. However, Android fans are tough to please. The middling battery life of the smaller Pixel 4 opened the door to criticizing the merely average display brightness. 

The Pixel 4 and 4 XL use OLED panels at 1080 x 2280 and 1440 x 3040, respectively. Both support 90Hz refresh rates, and the color accuracy is as good as any Samsung phone on the market. These phones reach about 450 nits of full-screen brightness, which isn’t impressive compared with the current market leader, the Galaxy Note 10, which can manage 793 nits of peak full-screen brightness. You’ve probably seen higher numbers, but that’s because many OLED brightness measurements are taken with just a small part of the display illuminated. Despite this, DisplayMate says the Pixel 4 has above-average outdoor readability because of its low panel reflectance.

Regardless, the quest for a brighter Pixel display led XDA to discover the hidden High Brightness Mode. You need root access on the phone to access this feature, but that’s not terribly difficult on Pixel phones. With root, you can enter a shell command to enable the “hbm_mode” for the OLED. That pushes the brightness from 450 to over 600 nits. With the low-reflectance properties of the display, this makes the Pixel 4 ultra-readable in direct sunlight. It will, no doubt, affect your battery life. 

This root method accesses a feature Google never intended to make available — every phone manufacturer uses firmware to manage brightness levels, and this circumvents the default level. The phone will disable High Brightness Mode when the display turns off, so you’d need an automation app like Tasker to enable it again each time you use the device. That’s the sort of thing someone who cares deeply about display brightness would do.

Now Read:

November 8th 2019, 7:58 am

AMD’s 16-core Ryzen 9 3950X, 32-core Threadripper 3970X Available November 25

ExtremeTech

AMD has had a great 2019, but the year isn’t over yet. Team Red will deliver a pair of upgrades for the desktop CPU market in a few weeks with the launch of its 16-core Ryzen 9 3950X (first teased last summer) and the upcoming 32-core Threadripper 3970X and 24-core Threadripper 3960X. There’s also a brand-new platform for the new Threadripper CPUs, and a low-cost Athlon part for $50.

Let’s round it all up.

Ryzen Hits 16 Cores

First up, the Ryzen 9 3950X, arriving for review on November 14 with retail availability on November 25. This CPU was advertised in June for fall availability, only to be delayed as AMD struggled to keep Ryzen on store shelves. The new chip will be priced at $750, a significant increase over AMD’s 12-core family. It’s clear that the 12-core is meant to be the sweet spot at $500 and below. The jump from 12 to 16 cores is a 1.5x price increase for a 1.33x core count improvement. This isn’t bad by the standards of top-end parts, but it’s definitely a halo part. AMD may honestly have positioned the chip like this to keep demand tamped down. Both the Ryzen 9 3900X and the Ryzen 9 3950X use two 7nm dies, but the 3950X requires that both dies be fully functional. With the 3900X, AMD can use die recovery to fuse off bad CPU cores and still sell the processor.

AMD’s proposed price stack and product mix for the future. We should note that this chart is only valid until Cascade Lake hits the market, which is when Intel’s Core i9-10980XE will drop to a $999 price point and offer better price/performance ratios against the Ryzen 9 3950X than the current Core i9-9920X. AMD is claiming some decisive wins against the competition.

The Ryzen 9 3950X may only have a 105W TDP, but AMD is recommending a heavy-hitting cooler. The CPU will not ship with one, and customers are advised to use a 280mm AIO liquid cooler, if not something even larger. The company is also introducing a new “Eco Mode,” which will allow CPUs to operate as if they were one TDP bracket lower than otherwise.

Athlon 3000G: AMD’s New Low-Cost APU

AMD is also announcing a new APU today, at the low cost of $49. The new 2C/4T CPU is still based on 12nm Zen+ silicon (no luck for those hoping this was a stealth Zen 2 chip). The new CPU picks up +300MHz compared with the Athlon 200GE, with the GPU grabbing another 100MHz. Both gains should be significant — a 2C/4T chip can use the boost, and the modest Vega 3 GPU will still scale with clock speed.

AMD claims the 3000G compares very well against the $73 Pentium G5400, with substantial performance improvements in multiple applications. As a low-end APU, it should be a solid part for customers with very light computing needs. The chip is also unlocked for overclocking, for anyone who wants to try to pick up a bit more performance on the cheap.

Ryzen Threadripper 3970X: Killer Performance, New Platform

Finally, there’s Threadripper. The Threadripper 3960X is a 24C/48T part, with a 3.8GHz base clock, 4.5GHz boost clock, 140MB of onboard cache, and a $1400 price tag. The 3970X is a 32C/64T part with a 3.7GHz base clock, 4.5GHz boost, 144MB of cache, and a $2000 price tag. The claims AMD is making for performance are substantial, to say the least.

Last year, AMD launched the 32-core Threadripper 2990WX, but the chip’s Windows 10 performance was throttled by deficiencies baked into the Windows 10 scheduler and the way it assigned workloads to the cores. The architectural and structural changes made in the shift from TR2 to TR3 should obviate these problems and allow the CPU to hit its full potential, performance-wise. AMD’s $1400 price point on the 24-core chip is a 1.5x increase in cores for a roughly 2x increase in price, while the 32-core CPU is a 2x increase in cores for a 2.67x increase in price. Intel’s Cascade Lake being priced at $999 makes it a well-positioned upgrade relative to the 9900KS (roughly a 2x increase in cores for a 2x increase in price), but this is highly unusual for Intel. The Core i9-9980XE, for example, is still a $1900 – $2000 CPU — 2.25x more cores than the 9900K, for 4x more money.
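The same per-core math extends across AMD’s new stack. A short sketch of the ratios, using only the prices and core counts quoted in this article (the article rounds the 3960X’s 1.87x to “2x”):

```python
# Cores-vs-price scaling for the chips discussed in this article, with the
# 16-core Ryzen 9 3950X as the baseline.
chips = {
    "Ryzen 9 3950X":      (16, 750),
    "Threadripper 3960X": (24, 1400),
    "Threadripper 3970X": (32, 2000),
}

base_cores, base_price = chips["Ryzen 9 3950X"]
for name, (cores, price) in chips.items():
    print(f"{name:20s} {cores / base_cores:.2f}x cores, "
          f"{price / base_price:.2f}x price, ${price / cores:.0f}/core")
```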

The 7nm chips inside Threadripper use symmetrical configurations without dummy die, in either a 6+6+6+6 or 8+8+8+8 arrangement. The unified I/O die and memory controller design avoids the NUMA issues that plagued the 2990WX and should neutralize the core contention problems that limited performance on that CPU under Windows 10 (the 2990WX was a much better chip under Linux than in Windows, even with Core Prio).

AMD is moving to put Threadripper in a leadership position to maximize its CPU revenue and drive greater adoption of its chips in high-end and boutique workstations. At the same time, it has brought a 16-core CPU to the desktop to extend the core count of its mainstream lineup. If you bought a first-generation Ryzen in 2017 on an X370 motherboard, you likely now have the option to step up to as many as 16 CPU cores and a significant uplift in CPU IPC after less than three years.

All of that performance requires a new platform, and AMD’s TRX40 is here to fit the bill. TRX40 supports up to 256GB of RAM using 32GB DIMMs, and it significantly expands overall platform bandwidth. There are 64 lanes of PCIe 4.0 in total, divided up as follows: 48 lanes for general use (GPUs, accelerators, and other devices), 8 lanes for the chipset downlink (fixed), and then a pair of options for the eight lanes remaining.

Think of those last eight lanes as being divided into two quads of four lanes each. For each quad, motherboard vendors can wire up a PCIe 4.0 x4 slot, an NVMe PCIe 4.0 x4 slot (for SSDs), or attach four SATA ports. This “Pick One” option is available for both quads, meaning a board vendor could choose to hang four additional SATA ports off one set of four PCIe 4.0 lanes while using the other quad for an NVMe x4 slot. In that case, those extra SATA ports would connect to the CPU directly rather than through the chipset.
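A compact way to picture that lane budget, as a sketch of the allocation described above rather than an exhaustive platform diagram:

```python
# TRX40's 64-lane PCIe 4.0 budget, as described above.
trx40_lanes = {
    "general purpose (GPUs, accelerators)": 48,
    "chipset downlink (fixed)": 8,
    "flexible quad A": 4,   # PCIe 4.0 x4 slot, NVMe x4 slot, or 4x SATA
    "flexible quad B": 4,   # the same "pick one" choice, made independently
}
assert sum(trx40_lanes.values()) == 64

# One possible board configuration for the two flexible quads:
example_board = {"flexible quad A": "4x SATA (hung off the CPU)",
                 "flexible quad B": "NVMe PCIe 4.0 x4 slot"}
print(example_board)
```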

One of the reasons AMD created a new socket for TRX40 is that it wanted to increase bandwidth between the CPU and the chipset. Up until now, the two have been linked by a PCIe 3.0 x4 connection. With TRX40, AMD is adopting a PCIe 4.0 x8 link, effectively quadrupling bandwidth. That’s the equivalent of a full-sized PCIe 3.0 x16 GPU slot’s worth of bandwidth being used for communication between the CPU and its chipset.
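Using the commonly cited per-lane figures of roughly 1GB/s for PCIe 3.0 and 2GB/s for PCIe 4.0 (approximate, ignoring small encoding-overhead differences), the quadrupling works out like this:

```python
# Approximate usable throughput per lane, in GB/s (ignoring small protocol
# overhead differences between the two generations).
PCIE3_PER_LANE = 1.0
PCIE4_PER_LANE = 2.0

old_link = 4 * PCIE3_PER_LANE   # previous chipset link: PCIe 3.0 x4 -> ~4 GB/s
new_link = 8 * PCIE4_PER_LANE   # TRX40 chipset link:    PCIe 4.0 x8 -> ~16 GB/s

print(f"~{old_link:.0f} GB/s -> ~{new_link:.0f} GB/s "
      f"({new_link / old_link:.0f}x the bandwidth)")
```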

TRX40 is not forward or backward compatible with AMD’s previous TR4 socket or motherboards. While the CPUs look physically identical and the sockets share the same layout, a 1000-series or 2000-series Threadripper will not work in a TRX40 board, and the new chips won’t work in older TR4 motherboards. AMD’s TRX40 is designed to be flexible, with a great deal of potential customizability. There was no mention of a “TRX80,” as has been rumored.

Conclusion

Overall, this is a momentous set of launches for AMD. The lack of upgrade compatibility for first- and second-generation Threadripper owners is an unfortunate loss, but the 32-core Threadripper 3970X looks as though it could rewrite the rules of high-end desktop performance. The 16-core Ryzen 9 3950X at $750 is priced at a premium above AMD’s other mainstream CPUs, but AMD was always likely to establish a high-end market for its chips. Intel has done no different for decades, and AMD has played a critical role in making desktop computing cheaper over the past 2.5 years. At the same time, the company wants to establish itself in the kind of markets where OEMs don’t even talk to you unless you’ve got a multi-thousand-dollar CPU to field. That means establishing Threadripper as a player in higher-end markets.

There were consistent rumors of a 64-core Threadripper emerging this generation, but no such chip has turned up, and AMD hasn’t confirmed one. It’s not surprising that AMD would want to reserve its highest core counts for the server market. Intel has higher-core-count Xeons coming next year, with 38-core Ice Lake server parts and 48-core socketed Cooper Lake chips (built on 10nm and 14nm, respectively). With socketed 48-core chips on the way, AMD may be waiting to see how Intel introduces those parts and where it positions them before making any moves of its own. Alternately, it may simply plan to keep Threadripper as a 32-core platform for now, or to launch a 64-core chip at a later date.

Now Read:

November 7th 2019, 4:34 pm

ET Deals: Ring Video Doorbells on Sale with Free Echo Show 5, $200 off Apple MacBook Pro, Up to 50 P

ExtremeTech

Keeping your home safe from unwanted intruders is a complicated task, but by purchasing one of Ring’s Video Doorbell devices you can keep a video record of anyone coming to your front door. For a limited time, you can get a Ring Video Doorbell device with an Echo Show 5 and a $50+ discount.

Ring Video Doorbell Pro + Echo Show 5 ($189.00)

This helpful device is more than just a doorbell. It sends notifications to your Echo devices whenever the doorbell is rung, and it lets you see who is at your door and talk to them. This helps keep you safe by making it easy to check who’s there. As a special deal, Amazon has bundled its Echo Show 5, which acts as a display for the camera, with the Ring Video Doorbell Pro. Right now the bundle is marked down from $249.99 to $189.00 at Amazon.

Apple MacBook Pro Intel Core i5 13-Inch Laptop w/ 8GB RAM and 128GB SSD ($1,099.99)

Apple’s aluminum-clad MacBook Pro comes sporting a quad-core Intel Core i5 processor that operates at speeds up to 3.9GHz. The system also has an integrated Intel Iris Plus Graphics 645 iGPU, which is among the most powerful iGPUs on the market today, with sufficient power to run some modern games at low settings. Currently, you can get it from Amazon marked down from $1,499.00 to $1,199.99.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 7th 2019, 4:34 pm

Get 110+ Hours of Cisco Networking & Cloud Computing Training For A Price You Pick

ExtremeTech

If you’ve ever wanted to get started down an IT career path, or you just want to brush up on your networking and cloud computing skills, you might want to check out the Cisco Networking and Cloud Computing Certification Bundle. This bundle comes complete with instruction on everything you’ll need to get started as well as some of the more intricate processes involved—all for a price you get to pick.

The bundle itself includes roughly 117 hours worth of content that you’ll have access to for a whole year. You’ll be able to study to become Cisco-certified and ultimately further your career path. IT professionals with Cisco network administration skills tend to earn higher salaries than those without, so once you get through these courses you’ll be well on your way. In addition to the routing and switching courses, you’ll also learn about implementing Microsoft Azure infrastructure solutions, as well as going through some graphical network simulator training.

The best part about this deal is the “pay what you want” format. If you pay above the current average price, you take home the whole bundle. If you pay below, you still get something great. However, if you beat the leading price, you’re also entered into a giveaway. So, grab this Pay What You Want: Cisco Networking and Cloud Computing Certification Bundle while you can and start brushing up on your skills.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 7th 2019, 4:34 pm

T-Mobile’s Nationwide 5G Rollout Will Start on December 6th

ExtremeTech

T-Mobile has been talking a big game when it comes to 5G, but we haven’t seen what it can really do yet. That’ll change next month, though. The carrier will light up its nationwide 5G network on December 6th, promising coverage for 200 million Americans in over 5,000 cities and towns at launch. It’s also announcing several new initiatives, most likely aimed at increasing support for its upcoming acquisition of Sprint, which some states are still fighting.

T-Mobile’s 5G numbers sound impressive compared to the 5G rollouts we’ve seen so far, but that’s because it’s focusing on coverage rather than speed. Verizon has only rolled out 5G in a handful of cities, and coverage is middling even in areas with 5G antennas on every street corner. Meanwhile, AT&T’s network is smaller and it won’t even sell a 5G phone to consumers yet. 

T-Mobile is leapfrogging the larger carriers in coverage because it’s starting its 5G deployment on its existing 600MHz spectrum. Both AT&T and Verizon use super-high-frequency millimeter wave spectrum to deliver very fast data with very poor range — T-Mobile toyed with millimeter wave in a few cities, but that’s not the backbone of its “real” 5G network. T-Mobile’s 5G will cover much larger areas, but the peak speeds will be far below the 1-2Gbps possible with millimeter wave. 

The carrier also promises its recently approved acquisition of Sprint will allow it to provide free 5G service to first responders via its Connecting Heroes Initiative. It promises first responders will have access to this free 5G service for at least 10 years. T-Mobile also pledges to roll out reduced cost 5G service to 10 million low-income households (it calls this Project 10Million) over the next five years. 

A 5G millimeter wave cell site in Minneapolis on a light pole.

Finally, T-Mobile has a new entry-level plan called T-Mobile Connect. For $15 per month, you get unlimited talk, text, and 2GB of data. It includes 5G service, but 2GB of 5G data is, well, almost nothing. The included 5G is just a way to show off — this plan is only useful for people who are very light users. 

Of course, this all assumes there are no further issues with the Sprint deal. The FCC has given the green light for the merger, but several state Attorneys General still object to the deal. The carrier’s plan is to fully integrate Sprint’s 2.5GHz 5G spectrum into its network next year, giving it more 5G speed to go along with its expansive 600MHz coverage.

Now read:

November 7th 2019, 3:17 pm

Top Tech Trucks of the North American Commercial Vehicle Show

ExtremeTech

NACV 2019 North American Commercial Vehicle show October 2019 Atlanta

Big trucks are big (obviously), sometimes noisy, get about 6 mpg hauling 80,000 pounds, and occasionally cut you off — just not as much as we cut them off. And they’re changing, to be more fuel-efficient, pollute less, and get the goods to far-flung customers at lower cost. All that was on display this week at the North American Commercial Vehicle Show (NACV) in Atlanta.

Truckers are under the gun to pollute less, especially since most Class 6 (60,000 pounds of truck and cargo) to Class 8 (80,000 pounds max, truck, trailer and load) trucks run on diesel, which is seen as being tougher on the environment than gasoline engines. Most diesel trucks now use low-sulfur fuel and treat the exhaust with a urea formulation called diesel exhaust fluid, same as diesel pickup trucks and the few remaining diesel cars use. But changes are coming, as trucks shift from turbo-diesel big engines to hybrids to natural gas, battery electric, or hydrogen fuel cell electric.

Here’s our take on the best trucks of NACV.

Cummins X15, a traditional engine: turbo-diesel 6 cylinders, 400-605 hp, 1,850-2,050 lb-ft torque, 3,000 pounds.

The traditional powerplant for a big truck is a big engine such as the Cummins X15 turbo-diesel. The impetus for change in trucks comes from the West Coast, especially the Los Angeles basin, and especially truck-centric places such as the Port of Long Beach. Ships are required to use shore power, or electric lines from onshore, when in port. The port’s Clean Trucks Program in 2012 banned older, high-polluting drayage trucks, the ones that make port-to-warehouse runs. As of October 2018, trucks registering to use the port had to be model year 2014 or newer.

Outside the US, truckers are or will be restricted from entering large cities unless they meet stiff clean air and/or fuel economy standards. So it’s not just passenger cars under pressure to be cleaner; it’s also commercial trucks, and – in California – it’s also things like lawn mowers, leaf blowers, and charcoal barbecues that face restrictions. Small as one lawn mower or one can of charcoal-starter fluid seems, the collective impact is significant in densely populated states. The two-stroke engine designs used by many such products are ferociously bad for both the environment and your lungs. A single leaf blower emits more particulate pollution than a car.

Tesla Semi: Introduced 2017, ship date 2019, now pushed to 2020.

The big no-show of the NACV show was the Tesla Semi battery-electric truck, with a claimed range of 300-500 miles depending on how much battery you buy. The Semi got a Hollywood-style launch in November 2017 and a prototype was spotted on a California street in early 2018. At the launch, Tesla talked about deliveries in 2019, and here we are seven weeks from 2020, with – surprise – no shipping Tesla Semis. Back in April 2019, Tesla reset the date to 2020.

Perhaps tellingly, there was significant action at NACV around hydrogen fuel-cell electric trucks, highlighted by the futuristic Hyundai HDC-6 Neptune. Despite infrastructure issues, truckers may sense those are surmountable. What’s harder to resolve is what lithium-ion batteries weigh: possibly 20,000 pounds of the truck’s 80,000-pound capacity, plus the motors. A diesel engine, transmission, drivetrain, and 250 gallons of clean-diesel fuel good for 2,000 miles weighs less than half that.

This was the second biennial North American Commercial Vehicles show and it drew 470-plus exhibitors versus 439 at the 2017 show in Atlanta, organizers said.

Now Read:

November 7th 2019, 1:00 pm

Self-Driving Uber Car Involved in 2018 Crash Didn’t Understand Jaywalking

ExtremeTech

Volvo Cars and Uber join forces to develop autonomous driving cars

As we careen toward a future of self-driving cars, there are bound to be some bumps along the way. However, running down pedestrians is not what most would consider an acceptable growing pain for the technology. Uber faced serious questions in 2018 when one of its experimental autonomous vehicles killed a pedestrian during testing. A National Transportation Safety Board (NTSB) report claims that Uber’s self-driving technology simply wasn’t able to understand jaywalking pedestrians.

Uber’s self-driving cars use a combination of lidar, radar, and cameras to map the world around the car and recognize obstacles. One of Uber’s specially outfitted Volvo XC90 SUVs was testing this system in Tempe, AZ on the evening of March 18, 2018 when 49-year-old Elaine Herzberg crossed in front of it. There was a safety driver in the vehicle at the time, but reports indicate they were not watching the road. Herzberg survived the initial collision and was treated at a local hospital but died of her injuries later that night. 

According to the NTSB investigation, Uber’s algorithms didn’t recognize that the car might collide with Herzberg until 1.2 seconds prior to impact. Even if the car had braked immediately, that would have been too late to avoid hitting her, and Uber’s software additionally imposed a one-second braking delay so it could consider a possible steering correction. The NTSB also says Uber’s self-driving system had no provisions for understanding a jaywalking person. It actually spotted Herzberg almost six seconds before the impact, but it didn’t do anything to avoid the collision because she wasn’t in a crosswalk — it just didn’t understand jaywalking was a thing.
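To put those timings in perspective, here is a rough stopping-distance sketch. The vehicle’s speed isn’t given in this article, so the ~40mph figure below is an assumption purely for illustration, as is the braking deceleration.

```python
# Rough illustration of why a 1.2-second warning (minus a 1-second action
# delay) is too late. Speed and deceleration are assumed for the sketch,
# not figures from the NTSB report.
speed_mph = 40.0                  # assumed vehicle speed
speed_ms = speed_mph * 0.44704    # ~17.9 m/s
decel = 7.0                       # assumed hard-braking deceleration, m/s^2

detect_to_impact = 1.2            # seconds (from the findings above)
action_delay = 1.0                # Uber's built-in braking suppression

distance_to_impact = speed_ms * detect_to_impact      # ~21 m left at detection
distance_lost_to_delay = speed_ms * action_delay      # ~18 m eaten by the delay
full_stop_distance = speed_ms ** 2 / (2 * decel)      # ~23 m needed to stop

print(f"Distance remaining at detection: {distance_to_impact:.0f} m")
print(f"Distance needed for a full stop: {full_stop_distance:.0f} m")
print(f"Distance lost to the 1s delay:   {distance_lost_to_delay:.0f} m")
```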

The report cites two other instances where Uber’s vehicles may have failed to understand roadway hazards. In one case, the vehicle ran into a bicycle lane post that had been bent into the roadway by a previous impact. In the other, a safety driver had to take control to steer away from an oncoming vehicle, colliding with a parked car in the process.

Uber paused its autonomous car testing after the fatal accident, resuming it in December 2018 with redesigned software. The company claims that its new algorithms would have detected Herzberg 289 feet before impact and braked four seconds before impact. That would have been enough for the car to stop in time. The NTSB will meet on November 19th to determine an official cause of the accident. While Uber has been cleared of liability for the crash, police are still considering charges against the safety driver.

Now Read:

November 7th 2019, 10:10 am

MLPerf Releases First Results From AI Inferencing Benchmark

ExtremeTech

AI is everywhere these days. SoC vendors are falling over themselves to bake these capabilities into their products. From Intel and Nvidia at the top of the market to Qualcomm, Google, and Tesla, everyone is talking about building new chips to handle various workloads related to artificial intelligence and machine learning.

While these companies have shown their own products racking up impressive scores in various tests, there’s a need for tools that independent third parties can use to compare and evaluate chips. MLPerf is a joint effort between dozens of developers, academics, and interested individuals from a number of companies and organizations involved in various aspects of AI/DL. The goal of the project is to create a test framework that can be used to evaluate a huge range of potential products and use-cases.

In this case, we’re discussing inferencing, not training. Inferencing is the application of a model to a task. Training refers to creating the model in the first place. Models are typically trained on high-end PCs, servers, or the equivalent of an HPC cluster, while inferencing workloads can run on anything from a cell phone to a high-end server.

According to David Kanter, who co-chairs development of the inference benchmark, MLPerf’s design team has settled on four key scenarios that they evaluate. Edge devices, like cell phones, focus on reading data from one stream at a time. These devices emphasize low latency and are benchmarked against those criteria. The next class of products read multiple streams of data at once, like a Tesla with its eight separate cameras. In these cases, the ability of the system to handle all of the data streams in question within acceptable latencies becomes important.

On the backend, there’s the server scenario, which asks whether the system can maintain an adequate number of queries per second within a defined response-time envelope, while the “Offline” test is intended for tasks, like photo sorting, that don’t have a latency requirement attached to them. These four scenarios conceptually bracket the areas where inferencing is expected to be used.
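Conceptually, the four scenarios just apply different load patterns and pass/fail criteria to the same inference call. The sketch below illustrates two of them; it is illustrative pseudostructure, not MLPerf’s actual LoadGen API (the real harness handles query generation, accuracy checks, and logging).

```python
import time
from statistics import quantiles

def run_single_stream(infer, samples, latency_budget_ms):
    """Edge scenario: one query at a time, scored on tail latency."""
    latencies = []
    for sample in samples:
        start = time.perf_counter()
        infer(sample)
        latencies.append((time.perf_counter() - start) * 1000)
    p90 = quantiles(latencies, n=10)[-1]          # 90th-percentile latency
    return p90 <= latency_budget_ms, p90

def run_offline(infer, samples):
    """Offline scenario: no latency constraint, scored on raw throughput."""
    start = time.perf_counter()
    for sample in samples:
        infer(sample)
    return len(samples) / (time.perf_counter() - start)   # samples per second

# The multi-stream scenario (N camera feeds that must all finish each tick)
# and the server scenario (sustained queries per second within a latency
# bound) follow the same pattern and are omitted from this sketch.

if __name__ == "__main__":
    ok, p90 = run_single_stream(lambda x: sum(i * i for i in range(x)),
                                samples=[10_000] * 200, latency_budget_ms=5.0)
    print(ok, round(p90, 3), run_offline(lambda x: x * x, range(100_000)))
```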

MLPerf will allow users to submit benchmark results in both a closed and an open division. The closed division will have stricter rules around its submissions and tests, while the open division will allow for more experimentation and customization. One small feature I particularly like is how MLPerf results are categorized. There are three categories: Available, Preview, and RDO (Research, Development, Other). Available means you can buy hardware today, Preview means it’ll be available within 180 days or before the next release of MLPerf, and RDO is for prototypes and systems not intended for general production. This type of distinction will allow users to understand which hardware platforms should be compared.


So far, the program has seen a wide range of submissions from various CPUs, GPUs, ASICs, FPGAs, and even a DSP.

The performance range of various solutions in the Closed performance division spans up to 10,000x, which was quite a challenge. Kanter told us this made the inferencing benchmark particularly tricky — MLPerf needed to devise tests that could scale over four orders of magnitude. My colleague David Cardinal has covered some of the specific Nvidia comparisons, but I wanted to highlight the larger release of the result set.

To give you an idea of how difficult this is: When Maxon released a new version of Cinebench, it did so partly because the old Cinebench scene took so little time to render that it was no longer a meaningful CPU test. The new CB20 test takes more time to render than CB15 did, but that makes the single-core version of the test, which now takes significantly longer, irritating to run when sweeping from, say, 1 to 16 cores. Trying to build a benchmark that can complete in an acceptable amount of time on chips separated by a vastly larger performance gap is a difficult prospect.

I’m excited to see products like MLPerf under development and I’m looking forward to working with the project when it’s ready for public release. MLPerf has a large development team from around the globe and a blend of academic and commercial support. It’s not an effort controlled by any single company and it wasn’t designed to favor any single company’s products. As AI and ML-enabled hardware filters to market, we’re going to need tests that can evaluate its capabilities. So far, MLPerf looks like one of the best.

Now Read:

November 7th 2019, 7:51 am

Court Allows Police Full Access to Online Genealogy Database

ExtremeTech

Genetic testing has become sufficiently cheap and fast that companies like 23andMe and Ancestry have sprung up to offer consumer genetic testing services. That means millions of people have copies of their genetic profiles sitting on a server someplace for the first time, and police have taken notice. A Florida court has set a potentially troubling precedent by allowing police to access one of these online databases in full, even if users opted out of law enforcement searches. 

The warrant, signed by Florida’s Ninth Judicial Circuit Court in July, granted Orlando police full access to the genetic profiles stored on GEDmatch. You might remember that site from when police in California used it to track down the Golden State Killer in 2018. Police departments around the country began looking at ways to use online genetic databases to find new leads in cold cases, but GEDmatch faced intense criticism from users who didn’t know police could snoop through their DNA. Being part of police investigations wasn’t part of the deal.

GEDmatch is a bit different than companies like 23andMe in that it doesn’t provide DNA testing services. Instead, the site allows people to upload the genetic results they get from other companies to search for relatives among its user base. GEDmatch attempted to shield users from police sweeps following the Golden State Killer case by requiring officers to identify themselves on the site and only use it for serious criminal investigations. Users also had to opt-in to law enforcement access, which only 185,000 of the site’s 1.3 million users did. 

Orlando Police Department Detective Michael Fields talked about the first-of-its-kind warrant at the recent International Association of Chiefs of Police conference in Chicago. The warrant allowed the department complete access to GEDmatch’s genealogy database, regardless of whether or not users opted into police investigations. The site complied with the warrant within 24 hours. Fields says the search hasn’t yielded any new arrests, but the department has gotten some potential leads.

Privacy advocates now worry that the Florida court’s decision could encourage departments to go after larger testing sites like 23andMe, which has 10 million users, and Ancestry with its 15 million users. Whatever happens, you should keep in mind what might happen to your DNA if you put it online.

Now read:

November 6th 2019, 5:05 pm

Review Roundup: ARM-Powered Surface Pro X Nails Form Factor But Offers Weak Performance

ExtremeTech

Microsoft’s Surface Pro family has been due for a redesign, and the Surface Pro X delivers one. Unfortunately, the Pro X lags behind other tablets already on the market, held back by its use of an ARM chip as its primary SoC, the Qualcomm-powered SQ1. That’s the general conclusion from three reviews: CNET, Engadget, and The Verge.

High Marks for Beauty

Everyone loves the physical design of this hybrid machine. CNET calls the redesign “spot on.” The Verge waxes poetic, writing: “It is a beautiful, well-made hybrid tablet device that looks better than any other computer I’ve tried in at least the past year.” Engadget says “The Surface Pro X is a beautiful piece of hardware and the best Snapdragon-powered PC around.” Keep that last comment in mind regarding the long-term viability of ARM in Windows as we walk through the review.

It’s fanless and while there’s no headphone jack, Bluetooth audio reportedly works well. Connectivity is limited to two USB-C ports, a microSD port, and a Surface Connect port. There’s no Thunderbolt 3 (not surprising in an ARM device), but either USB-C port can be used for charging. The basic Surface keyboard is $139, while the new Surface “Signature Keyboard” costs $270. If you buy the Signature Keyboard you get the new Surface Slim Pen in the box, otherwise it’s a $144 add-on.

The look, feel, and design of the Surface Pro X are clearly quite popular. It’s the other aspects of the system that fail to impress reviewers.

Weak Performance, Poor Compatibility

The Surface Pro X’s performance and software limitations make it clear that any dream of ARM replacing x86 in Windows PCs is still a long way off. x86 emulation is limited to 32-bit apps, and most games and benchmarks won’t run on the system. Engadget reports encountering a significant number of bugs and crashes. The core of Windows runs just fine on the Surface Pro X, but performance doesn’t match Intel systems at the same price point, according to The Verge. The site writes:

“More intense apps like Photoshop do technically run on this computer, but they’re so slow they may as well not. Microsoft says that Adobe is committed to creating 64-bit ARM versions of its Creative Cloud apps, but there’s no timeline for when that’ll happen.”

Most games you’d find on Steam or the Epic Games Store are incompatible. 64-bit x86 apps will not run in emulation, and apparently even the Windows Store has a habit of recommending apps for the Surface Pro X that aren’t actually compatible with it. The Verge also notes that its reporters bought apps to test, only to discover that, despite being advertised as compatible, the apps don’t actually run.

That’s something of a problem, seeing as we’re talking about a product with limited app compatibility in the first place. The price starts to bite the Surface Pro X here, because this is a system that starts at $999, with the keyboard-and-stylus combo a further $270 on top of that. Intel-powered Surface systems start at $749 and offer much higher performance and no app compatibility issues, writes CNET. There are virtually no published benchmark results because it’s so hard to find benchmarks that will run on the system in the first place. Best-case performance in web browsing seems to be “Perfectly acceptable, though not as fast as Intel.”

Battery Life

Battery life claims are all over the map. CNET measured 8 hours, 50 minutes; The Verge reports 5-6 hours of active use, which it figures is 9-10 hours by Microsoft’s metrics. Engadget came the closest to hitting Microsoft’s claimed battery life of 13 hours, with a measured runtime of 11 hours, 45 minutes. That’s a fairly significant gap between the tested systems, but nobody is seeing the massive 20-hour battery runtimes that were originally promised for Windows systems on ARM. Hopefully benchmark compatibility will improve in the future and we’ll be able to describe system performance in more detail.

Ultimately, the Surface Pro X appears to be a beautiful system no reviewer is happy with. Everyone praises the design, keyboard, and LTE modem. Battery life is acceptable, if perhaps a little low compared to what people hoped for. But the performance, bugs, and app compatibility issues are all problems. CNET writes that the system is ideal “only for a small segment of the tablet-toting population.” The Verge writes: “This is a CEO’s computer, not an engineer’s computer, and certainly not a computer for the rest of us.” The most positive review is from Engadget, who writes: “If you’re in the sliver of the population that needs access to a small handful of Windows apps, then maybe the Pro X is sufficient. But bear in mind you’ll be paying a hefty premium for Windows, an LTE connection and gorgeous hardware.”

Now Read:

November 6th 2019, 3:34 pm

Nvidia’s new Jetson Xavier NX Adds Horsepower to AI at the Edge

ExtremeTech

Whether Nvidia’s bold claim that its new Xavier NX ($399) is “the world’s smallest supercomputer for AI at the Edge” is true or not, there is no doubt it will be an impressive accomplishment when it ships in March. Packing similar power to the current Xavier into the tiny form factor of a Jetson Nano, it will greatly multiply the amount of inferencing and training that can be performed in small, low-power, devices.

Nvidia expects the new module to find a home in a wide variety of AI and vision-centric applications, including autonomous machines, high-resolution sensors for robots, video analytics, and a variety of embedded devices. Basically, Xavier NX should be a solid upgrade for any application wanting to move up from a Nano, or shrink its form factor from a full-sized Xavier.

Nvidia’s Jetson Xavier NX By the Numbers

Featuring an Nvidia Volta GPU with 384 CUDA cores, 48 tensor cores, and two Nvidia Deep Learning Accelerators, the Xavier NX is capable of 21 TOPS (INT8) or 6 TFLOPS (FP16) of AI performance while consuming only 15 watts of power. When limited to 10 watts, it can still deliver 14 TOPS. That compares to the 0.5 TFLOPS (FP16) of the similarly sized Nano ($129) and almost matches the specs of the much larger Jetson AGX Xavier ($599).

Nvidia’s Jetson family showing relative size, price, and specs

The Jetson Xavier NX is pin compatible with the current Jetson Nano, which should mean that current embedded devices will quickly be able to benefit from an order of magnitude increase in performance in essentially the same power and size envelope.

As its main processor, the Xavier NX features a 6-core Carmel Arm 64-bit CPU with 6MB of L2 and 4MB of L3 cache. It can directly support up to six CSI cameras over 12 MIPI CSI-2 lanes. It comes equipped with 8GB of 128-bit LPDDR4x RAM capable of 51.2GB/s of data transfer. The 70mm x 45mm module also supports Gigabit Ethernet and, like other products in the Jetson family, runs an Ubuntu-based Linux.
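Two quick checks on those numbers: the efficiency the 15W and 10W power modes imply, and where the 51.2GB/s memory figure comes from. The 3,200MT/s transfer rate below is inferred from the stated bandwidth and bus width rather than quoted by Nvidia, so treat it as an assumption.

```python
# Efficiency implied by Nvidia's stated figures for the Xavier NX.
print(f"{21 / 15:.2f} TOPS/W at 15 W, {14 / 10:.2f} TOPS/W at 10 W")  # 1.40 both

# Memory bandwidth: a 128-bit bus moves 16 bytes per transfer, so 51.2 GB/s
# implies a ~3,200 MT/s data rate (the transfer rate is inferred, not quoted).
bus_bytes = 128 // 8
transfer_rate_mts = 3200
print(f"{bus_bytes * transfer_rate_mts / 1000:.1f} GB/s")  # 51.2 GB/s
```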

Xavier NX Poised to be a Benchmark Leader in Edge-based Inferencing

In related news, Nvidia announced today that its current Xavier SoC was the top edge-computing performer in the newly released MLPerf Inference 0.5 benchmark. As a lower-power module based on the same Xavier design, the Xavier NX should prove to be an even more competitive entrant in the battle for edge computing design wins.

Xavier NX packs most of the power of a Xavier into a Nano form factor with more performance than a TX2

Xavier NX Fits Into Nvidia’s AI Ecosystem

One of the strongest points in favor of using Nvidia’s products for AI development and applications is the company’s extensive AI ecosystem, and how well it scales up and down across its product line. The Xavier NX should be no exception. It is supported by Nvidia’s popular JetPack SDK, and runs on the same CUDA-X AI software architecture as existing devices. Developers who want to get a head start on developing for the Xavier NX can work with the Jetson AGX Xavier Development Kit after applying a patch to emulate the Xavier NX.

Showcasing the broad adoption of its Jetson platform, Nvidia says there are over 400,000 registered developers using Jetson, and over 3,000 customers using its modules in applications. Examples the company provided included Postmates’ delivery robot, Skydio’s new drone, and an AI microscope developed at Stanford University. Speaking from personal experience developing on and for a Jetson Nano and a JetBot, the tools, platform, and support are excellent, so there is every reason to believe that the Xavier NX will be a big hit with those looking to push AI at the edge to the next level.

Now Read:

 

November 6th 2019, 2:47 pm

ET Deals: Save on Apple iPad and iPad Pro, $300 off XPS 8930

ExtremeTech

If you want to buy a new high-end tablet then today is your lucky day. Apple’s 2018 iPads are on sale from Walmart with significant discounts that will save you hundreds of dollars.

Apple iPad Pro 10.5-Inch 512GB Wi-Fi Tablet ($699.00)

Apple built this version of its iPad Pro with its in-house designed A10X processor that offers excellent performance. This model also has a large 512GB storage capacity, which will let you store a large amount of MP3s and videos. Right now it’s marked down from $999.00 to $699.00 on Amazon.

Apple iPad 6th Gen 128GB Wi-Fi ($299.00)

Measuring just 9.7 inches diagonally with an LED-backlit IPS display, this is one of Apple’s smallest tablets currently on the market. It also features a fast A10 64-bit processor and can last up to 10 hours on a single charge. You can get it now from Walmart marked down from $429.99 to $299.00.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 6th 2019, 2:47 pm

Scammers Using Firefox Bug to Lock Down Browsers

ExtremeTech

Some of the most effective online attacks rely on social engineering as much as clever coding. Scammers have started exploiting a bug in Firefox that causes the browser to lock up, pushing users to call a phone number for support. Mozilla is reportedly working on a fix for this issue, but users with less technical knowledge may be unable to get rid of the locked page even after restarting the browser. 

This attack is, at its heart, a classic tech support scam. The new wrinkle here is the scam page uses the Firefox bug to make it impossible to ignore. When a user lands on the boobytrapped page while using Firefox, the browser shows a login box that cannot be dismissed. Attempting to close the browser also doesn’t work. 

Below the popup, the fraudsters display a message in broken English that reads: 

Please stop and do not close the PC… The registry key of your computer is locked. Why did we block your computer? The Windows registry key is illegal. The Windows desktop is using pirated software. The Window desktop sends viruses over the Internet. This Windows desktop is hacked. We block this computer for your safety.

If the victim calls the phone number on the page, they’ll connect with a person claiming to be with Microsoft tech support. However, that’s a scammer who will attempt to trick them into paying money for some product or service they don’t need — in this case, a non-existent Windows license. The only way to close the browser once it’s been locked is to end the process (both Windows and macOS). If you have the misfortune of running Firefox with tab restore enabled, the offending page will just come back the next time you open the browser. In that case, you’d need to disconnect from the network or reset the browser preferences to free yourself. 

Unfortunately, many people won’t have enough technical knowledge to clear a browser lock like this, and the odd behavior will convince more users that there is something genuinely wrong, and maybe they ought to call that mysterious number. 

Mozilla developers have stated they’re working on a fix that will roll out soon. For now, it’s a good idea to disable tab restore to speed recovery in case you run into a page exploiting this bug.

Now read:

November 6th 2019, 12:34 pm

Google Ends Regular Updates for 2016 Pixel Phones, Promises One Last OTA in December

ExtremeTech

Google Pixel


One of the primary selling points for Google’s Pixel phones is that they get monthly security updates and new Android versions for three years. The original Pixel phones launched in 2016, and that means they’ve hit their end of life. Google released new monthly patches for its phones yesterday, and indeed, the 2016 Pixels were left out. However, Google will offer owners of those devices one final OTA as a farewell. 

The launch of the Pixel and Pixel XL in 2016 marked a major shift for Google’s phone strategy. In the past, it focused on promoting new versions of Android with a powerful but low-cost piece of “Nexus” hardware manufactured by an established smartphone OEM. Over the years, it partnered with HTC, Samsung, LG, Asus, Motorola, and Huawei to make Nexus devices. The 2016 Pixels might have been manufactured by HTC, but you’d never know from looking at them. These were the first truly Google-branded smartphones.

When it launched the original Pixels, Google only promised two years of major OS updates. More recent phones are guaranteed three years of support for the OS version and security updates. So, Google has already gone above and beyond by offering an update to Android 10 on the 2016 Pixels. However, the monthly updates are arguably more important as they keep devices safe from new online threats that cannot be patched in other ways. 

Some owners of the original Pixel have expressed annoyance that Google would update the devices to the new OS and then immediately end support. After all, any major OS update is bound to have some bugs. Well, there’s still some hope that Google will leave these devices in a good place. The company will push one more update to the phones in December, hopefully giving it a chance to clean up the rough edges in Android 10. After that, Google promises no further update support. That’s a bummer, but the Pixels have already gotten much more attention than other Android phones. 

Going forward, users will be on their own if they want to stick with their aging smartphones. Google would probably prefer everyone upgraded to new devices, but the non-Verizon Pixels have unlockable bootloaders. You can choose to flash custom software like Lineage OS on the 2016 Pixels to eke a little more life out of them.

Now read:

November 6th 2019, 10:42 am

Save 96 Percent On This Complete Linux System Administrator Bundle

ExtremeTech

Ever wanted to become a Linux pro and you just weren’t sure if you had the time to devote to such a task? With The Complete Linux System Administration Bundle, you’ll have access to everything you need to become a pro with all things Linux on your own time. From installation to administration, everything included will make you a pro and keep you up to date.

This bundle comes packed with over 118 hours of content and training within its 7 courses, so you’ll be able to learn a bit of everything Linux related. You’ll learn concepts like how to administrate Red Hat Linux, learn app development by coding in a virtual machine, and even the fundamentals of Unix and Linux. The great part about this bundle is that you’ll be able to go at your own pace and do things as you wish. With all 7 courses and their content, you’ll have lifetime access, so you can truly take your time.

Start learning to become a Linux pro in your spare time thanks to The Complete Linux System Administration Bundle while it’s on sale. Right now, you can grab this whole bundle for $69, saving you 96 percent off the original price, so stop waiting to become a Linux pro.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 6th 2019, 10:12 am

You Can Now Request Some of the Secret Data Companies Compile About Your Online Activities

ExtremeTech

It is not a secret that US companies assemble towering mountains of data on their customers. Over the years that we’ve covered the intersection of privacy and the internet, it’s been openly acknowledged that there are vast data brokers operating behind the scenes, selling information about us to other companies that want the data. But the very nature of these firms is that they operate in silence and secrecy. Companies historically haven’t wanted to admit that they collect data, and if they do collect data, they don’t want to acknowledge just how much.

The New York Times, it seems, has had some luck prying open the data vaults and discovering exactly what’s going on behind the curtain. Each of us has “secret scores” — scores used to calculate everything from how likely you are to return a product to whether or not you should be allowed to borrow money. The NYT’s author requested a copy of her own data file from one such data broker, a company named Sift, and received the following:

More than 400 pages long, it contained all the messages I’d ever sent to hosts on Airbnb; years of Yelp delivery orders; a log of every time I’d opened the Coinbase app on my iPhone. Many entries included detailed information about the device I used to do these things, including my IP address at the time.

Sift knew, for example, that I’d used my iPhone to order chicken tikka masala, vegetable samosas and garlic naan on a Saturday night in April three years ago. It knew I used my Apple laptop to sign into Coinbase in January 2017 to change my password. Sift knew about a nightmare Thanksgiving I had in California’s wine country, as captured in my messages to the Airbnb host of a rental called “Cloud 9.”

Companies have begun to offer customers a look at their own data records after the EU passed the GDPR and California passed its own Consumer Privacy Act. In June, the Consumer Education Foundation also asked the FTC to investigate the use of these shadow scores, which impact how people are allowed to shop and what offers they receive in ways people generally are not aware of. One point the article makes is that simply having these voluminous reports is insufficient — we know absolutely nothing about the algorithms and data analytics being used to track our own behavior. It’s not clear if these companies have even found useful ways to measure the traits they claim to measure, while the damage caused by being misclassified by algorithms can impact everything from the interest rates you’re offered to the customer service you receive. The problem of bias in algorithms is no longer a theory; corporations like Amazon and Google have acknowledged the current limits of these technologies when forced to do so. That hasn’t stopped companies from rushing to implement these ideas, however.

We’re at the point where companies are starting to show what data they use to make decisions, but not how they weight or use it. But those things matter. If we’re going to invent HumanMark as a society, we humans who will be rated by it ought to have some say in how those ratings take place and how companies are allowed to use the information.

The NYT article steps through the specific process for requesting your own personal data file from Sift (consumer trustworthiness), Zeta Global (identifies high rollers with money to burn), Retail Equation (helps companies decide whether or not to accept a return), Riskified (develops fraud scores), and Kustomer, which promises “unprecedented insights into a customer’s past experiences and current sentiment.” It also notes that just because companies promise to provide data doesn’t mean they actually will, using the example of Kustomer, which has apparently been rather hard to reach.

Now Read:

November 6th 2019, 8:08 am

AT&T Rewarded With $60M Fine After Lying to Customers About Mobile Data Throttling

ExtremeTech

The FTC has announced a resolution in a case we first covered back in 2014. At the time, we reported that AT&T was throttling users who exceeded 22GB of data, which flies in the face of the common meaning of “Unlimited.” The problem was actually much more severe. The FTC found that AT&T was throttling some users after they used as little as 2GB of data, without notifying them that it had done so. By October 2014, more than 3.5 million AT&T customers had been affected by these practices. From 2011 – 2014, those 3.5M customers were throttled more than 25 million times. Throttling was applied on a monthly basis and, according to the 2014 suit, reduced network performance by 80-90 percent for affected customers, making it impossible to use devices for common tasks.

While it’s good to see the FTC taking action against AT&T for this kind of problem, the size of the fine is small compared to the revenue AT&T would have earned off those customers during the years it took their money and provided an abysmal service experience. In 2012, the average cell phone bill for an AT&T customer was $80 per month, and we know that AT&T applied throttling by the month. If it applied throttling more than 25 million times, that’s more than 25 million customer-months at roughly $80 each, or about $2B in revenue from a customer group that was receiving exceptionally poor service.

I’m one of the AT&T customers that was caught by this issue.

AT&T’s operating margin in 2012 was 10.2 percent. Operating margin is the profit a company earns before paying interest or taxes, but after accounting for all of the variable and fixed costs of sales. If we apply the 10.2 percent margin to the $2B in revenue AT&T earned off throttling customers after lying to them about the terms and conditions of their plans, AT&T earned $204M in profit off its throttling scheme. The government is “punishing” AT&T by only allowing them to keep two-thirds of it.
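Reproducing that back-of-envelope math in a few lines, using only the figures cited above (the result is an approximation, as noted below):

```python
# Reproducing the article's rough estimate with the figures cited above.
throttle_months = 25_000_000    # times throttling was applied, 2011-2014
avg_monthly_bill = 80           # average AT&T bill in 2012, in dollars
operating_margin = 0.102        # AT&T's 2012 operating margin
fine = 60_000_000

revenue = throttle_months * avg_monthly_bill        # $2.0B
operating_profit = revenue * operating_margin       # ~$204M

print(f"Revenue from throttled months: ${revenue / 1e9:.1f}B")
print(f"Implied operating profit:      ${operating_profit / 1e6:.0f}M")
print(f"Left over after the $60M fine: ${(operating_profit - fine) / 1e6:.0f}M")
```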

These numbers are simplistic, but they make the point. AT&T is a company that sells cellular service. In 2011, AT&T began lying to customers about the product it sold. It choked the performance customers saw from the cellular service they paid for. Slowing network performance by 80-90 percent is how you prevent people from using a service that they technically pay for, and people were clearly angry — hundreds of thousands of complaints poured into the FTC. This was not an accident, or an oversight, or a mistake. This was a deliberate and calculated attempt to screw people out of service they had paid for.

This was during the same period of time when people were converting to smartphones and changing over to data-heavy plans in the first place, and it was part of a scheme to push early adopters off of unlimited plans and to force them to start paying for data at prices AT&T found more appealing. This was the same era when companies were pushing $10/GB plan pricing and beginning to “encourage” customers to step away from unlimited data in the first place. In short, it was a bad faith practice from start to finish. A statement from FTC Commissioner Rohit Chopra dives into more detail on the findings of fact, including the fact that AT&T simultaneously locked people into unlimited plans via grandfathering (to keep from losing subscribers to Verizon) while simultaneously advertising new unlimited plans that were not remotely unlimited. The explicit goal was to force customers to sign up for per-GB plans with high overage fees. Chopra criticizes aspects of how the FTC has dealt with this case, including its willingness to give big businesses the benefit of the doubt, the limited amount of compensation to affected customers, and the need to perform better economic analysis to approximate losses to customers.

Based on the public facts of the case, it seems extremely likely that AT&T made far more than $60M in profit selling unlimited wireless service to customers that they then couldn’t use. Definitionally, it would have made far more profit off those customers than anyone else. Tying wireless network pricing to data consumption as opposed to time-of-access and total bandwidth demanded is a bass-ackwards model in the first place, but if you sell people a product they are then prevented from using, they’re obviously not consuming very much of it. It’s the cellular network equivalent of patent trolling, only instead of making money from worthless IP no one should have to license, AT&T was making money off building a wireless network it wouldn’t let its own customers use.

I’m not claiming my simplistic math is literally accurate, because it probably isn’t. But the point stands. The operating profit AT&T would have expected to earn off an average customer with an $80 bill would be a bit over $8 per month, based on its operating margin. Having throttled 3.5M accounts more than 25M times, that’s roughly $204M in profit. The fine is a fraction of the profit AT&T earned from this practice, not counting the money it saved by not having to build out its network to actually handle peak traffic. The company has been effectively rewarded for its practice, and the $60M fine is simply a cost of doing business.

The $60M will be deposited into a fund for users who were throttled between 2011 – 2014. Affected customers will receive a bill credit and will not have to apply. The vote in favor of the resolution was 4-0-1, with one recusal. As part of the settlement, AT&T is prohibited from advertising services as “Unlimited” without prominently stating the restrictions those services carry.

Now Read:

November 5th 2019, 4:13 pm

ET Deals: 15 Percent off $50 Apple iTunes Gift Card, Dell 34-inch UltraSharp 3440×1440 Curved Monito

ExtremeTech

If you love Apple then you won’t want to miss today’s best deal. For a limited time you can get a $50 Apple iTunes gift card for a reduced price.

Apple iTunes $50 Gift Card ($42.50)

For a limited time you can get an Apple iTunes gift card for just $42.50 from Amazon with promo code ITUNES.

Dell UltraSharp U3415 34-Inch Curved 3,440×1,440 Monitor ($579.99)

Dell’s high-end UltraSharp U3415 display features an ultra-wide IPS curved display panel with a resolution of 3,440×1,440. The display also has excellent color support that extends to cover 91 percent of the CIE1976 color gamut, and it’s currently marked down from $799.99 to just $579.99 from Amazon.

Featured Deals

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

 

November 5th 2019, 3:13 pm

NASA Confirms Voyager 2 Has Left the Solar System

ExtremeTech

Image Credit: NASA / JPL Tech

Humanity first left the solar system in 2012 when the Voyager 1 probe passed into interstellar space decades after leaving the planets behind. Now, there’s a second spacecraft beyond the limits of our solar system: Voyager 2. Luckily, Voyager 2’s instruments are in somewhat better shape than Voyager 1’s, so scientists were able to observe the transition from the heliosphere, which is dominated by the sun, to the interstellar medium (ISM). 

Both Voyager probes launched in 1977, with Voyager 2 heading into space a few weeks before Voyager 1. The two probes are physically identical, but they took different paths through the solar system, taking advantage of the “Grand Tour,” an alignment of the planets that occurs only once every 175 years. Voyager 1 visited and got gravity assists from Jupiter and Saturn before heading off toward the edge of the solar system. Voyager 2 swung past Jupiter, Saturn, Uranus, and Neptune, making its last planetary observation, of Neptune, in 1989, almost a decade after Voyager 1 had started its long march toward the edge of the solar system.

When Voyager 1 reached the edge of our solar system, known as the heliopause, it no longer had a functional plasma spectrometer. As a result, there was some debate about when, exactly, the probe left our solar system. So, we missed the expected transition from warm solar plasma to the denser cold plasma of the ISM. Eventually, measurements of local electrons and magnetic field shifts confirmed it was in interstellar space. 

Voyager 2 has just sent back data proving that it has also crossed the heliopause, and it had a fully functional plasma spectrometer. The transition happened about a year ago in November 2018, and the changeover was roughly in-line with what scientists expected based on Voyager 1’s indirect readings. As Voyager 2 crossed from the heliosphere to the ISM, it detected a 20-fold increase in plasma density. 

The approximate positions of Voyager 1 and 2.

Voyager 1 and 2 crossed the heliopause at roughly the same distance from the sun, 121.6 AU and 119 AU, respectively. However, their exit points were about 150 AU apart. Scientists are studying the discrepancies in the data in hopes of gaining a better understanding of the boundary between our solar system and the wider galaxy. For example, Voyager 2 detected a continuous change in magnetic field directions as it crossed into the ISM, whereas Voyager 1 did not. Voyager 2 has also continued to see low-energy particles from the sun in the ISM, but Voyager 1 didn’t. 

It will be some time before we have more data to study. The only functional probe that has any hope of reaching the heliopause is New Horizons, which is currently flying through the Kuiper Belt. It could leave the solar system around 2040, but we don’t know if it will maintain communication with Earth that long.

Now read:

November 5th 2019, 1:56 pm

Overclocking Results Show We’re Hitting the Fundamental Limits of Silicon

ExtremeTech

Silicon Lottery, a website that specializes in selling overclocked Intel and AMD parts, has some 9900KS chips available for sale. The company is offering a 9900KS verified at 5.1GHz for $749 and a 9900KS verified at 5.2GHz for $1199. What’s more interesting to us is the number of chips that qualify at each frequency: 31 percent of Intel 9900KS chips can hit 5.1GHz, while just 3 percent can hit 5.2GHz. The 5.2GHz option was available earlier on 11/4 but is listed as sold out as of this writing.
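Put another way, the last 100MHz is expensive in both dollars and yield; a quick sketch using the figures above:

```python
# What the last 100MHz costs, per Silicon Lottery's 9900KS listings above.
bins = {"5.1GHz": {"price": 749, "qualify": 0.31},
        "5.2GHz": {"price": 1199, "qualify": 0.03}}

premium = bins["5.2GHz"]["price"] - bins["5.1GHz"]["price"]        # $450
print(f"+100MHz costs ${premium}; roughly 1 in "
      f"{1 / bins['5.2GHz']['qualify']:.0f} chips qualifies, versus 1 in "
      f"{1 / bins['5.1GHz']['qualify']:.0f} at 5.1GHz")
```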

The 9900KS is an optimized variant of Intel’s 9900K. The 9900K is Intel’s current top-end CPU. Given the difficulties Intel has had moving to 10nm and the company’s need to maintain competitive standing against a newly-resurgent AMD, it’s safe to assume that Intel has optimized its 14nm++ process to within an inch of its life. The fact that Intel can ship a chip within ~4 percent of its apparent maximum clock in sufficient volume to launch it at all says good things about the company’s quality control and the state of its 14nm process line.

What I find interesting about the Silicon Lottery results is what they say about the overall state of clock rates in high performance desktop microprocessors. AMD is scarcely having an easier time of it. While new AGESA releases have improved overall clocking on 7nm chips, AMD’s engineers told us they were surprised to see clock improvements on the Ryzen 7 3000 family at all, because of the expected characteristics of the 7nm node.

AMD and Intel have continued to refine the clocking and thermal management systems they use and to squeeze more headroom out of silicon that they weren’t previously monetizing, but one of the results of this has been the gradual loss of high-end overclocking. Intel’s 10nm process is now in full production, giving us some idea of the trajectory of the node. Clocks on mobile parts have come down sharply compared to 14nm++. IPC improvements helped compensate for the loss in performance, but Intel still pushed TDPs up to 25W in some of the mobile CPU comparisons it did.

I think we can generally expect Intel to improve 10nm clocks with 10nm+ and 10nm++ when those nodes are ready. Similarly, AMD may be able to leverage TSMC’s 7nm EUV node for some small frequency gains of its own. It’s even possible that both Intel and TSMC will clear away the problems currently keeping them from hitting slightly higher CPU clocks; Intel’s 10nm has had severe growing pains, and TSMC has never built big-core x86 processors like the Ryzen and Epyc chips it’s now shipping. I’m not trying to imply that CPU clocks have literally peaked at 5GHz and will never, ever improve. But the scope for gains past 5GHz looks limited indeed.

Power per unit area versus throughput (that is, the number of 32-bit ALU operations per unit time and unit area, in units of tera-integer operations per second, TIOPS) for CMOS and beyond-CMOS devices. The constraint of a power density no higher than 10 W/cm² is implemented, when necessary, by inserting empty area into the optimally laid out circuits. Caption from the original Intel paper.

The advent of machine learning, AI, and the IoT has collectively ensured that the broader computer industry will feel no pain from these shifts, but those of us who prize clock speed and single-threaded performance may have to find other aspects of computing to focus on long-term. The one architecture I’ve seen proposed as a replacement for CMOS is MESO, a spintronics approach Intel is researching. MESO could open up new options as far as compute power density and efficiency are concerned. Both are critical goals in their own right, but so far, what we know about MESO suggests it would be more useful for low-power computing than for pushing the high-power envelope, though it may gain some utility in this respect in time. One of the frustrating things about being a high performance computing fan these days is how few options for improving single-threaded performance seem to exist.

This might seem a bit churlish to write in 2019. After all, we’ve seen more movement in the CPU market in the past 2.5 years, since AMD launched Ryzen, than in the previous six. Both AMD and Intel have made major changes to their product families and introduced new CPUs with higher performance and faster clocks. Density improvements at future nodes ensure both companies will be able to introduce CPUs with more cores than previous models, should they choose to do so. Will they be able to keep cranking the clocks up? That’s a very different question. The evidence thus far is not encouraging.

Now Read:

November 5th 2019, 9:39 am

Researchers Hack Smart Speakers with Lasers

ExtremeTech

Consumers are increasingly outfitting their homes with smart speakers and displays that use wide-field microphones to pick up voice commands. These devices can be incredibly convenient, but you have to wonder about the privacy implications. That’s not just because the companies hosting your data might do something untoward; it may also be possible to exploit these devices to steal data or access connected smart home platforms. Researchers from UEC Tokyo and the University of Michigan have demonstrated one approach to hacking smart speakers using lasers.

Many of today’s smart home devices, smartphones, and tablets use compact and highly sensitive MEMS microphones. Because of the nature of the MEMS components, they respond to light as if it were sound. The researchers found they could direct a laser at smart speakers to take advantage of that, an attack they call Light Command.

Smart speakers like Google Home (Nest), Apple HomePod, and Amazon Echo are constantly listening using local audio processing, but they only “wake up” when someone says the trigger phrase. For example, that’s “Hey Google” or “OK Google” for Google Nest. The key to the attack developed by the international team is that the speakers don’t much care who says the trigger phrase. While some features require voice authentication, most don’t need to make sure you are the account owner to start issuing commands. 

To trick the smart speakers, the researchers simply shined a laser directly at the microphone from as far as 360 feet (110m) away. From that distance you need a very powerful laser, but up close, a standard laser pointer has enough power. By modulating the intensity of the light, the team could match the signal of their chosen voice command. You can see the technique working in the above video.
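The core idea is simple amplitude modulation: the laser’s brightness is varied in step with the audio waveform of the command, and the MEMS diaphragm converts those brightness changes back into an electrical signal as if they were sound pressure. The sketch below is a minimal, hypothetical illustration of that signal mapping in Python; the function and parameter names are ours rather than the researchers’, and it only generates a drive signal, it doesn’t control any real laser hardware.

```python
import numpy as np

def audio_to_laser_drive(audio, depth=0.8):
    """Map an audio waveform in [-1, 1] to a normalized laser-intensity
    drive signal in [0, 1] using simple amplitude modulation.

    `depth` stands in for the current range a real laser driver would allow;
    the laser always stays on (positive intensity), and the audio swings the
    brightness around a mid-level bias point.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return 0.5 + 0.5 * depth * audio

# Example: a 1kHz tone standing in for a recorded voice command.
sample_rate = 44_100                      # samples per second
t = np.arange(441) / sample_rate          # 10ms worth of samples
command_audio = np.sin(2 * np.pi * 1_000 * t)

drive = audio_to_laser_drive(command_audio)
print(drive.min(), drive.max())           # stays within [0.1, 0.9] for depth=0.8
```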

While this is a troubling development for fans of smart home technology, Light Command isn’t going to make all your smart speakers and displays easily hackable. For one, the laser needs to hit the microphone at just the right angle, and the mics are rarely in a convenient location. For example, the Google Home has mics facing up and slightly back. Users will also see the light reflecting off the speaker and hear the audio feedback when the speaker activates. Still, we wouldn’t be surprised if companies start adding hoods or filters of some sort to make the mics harder to target with lasers.

Now read:

November 5th 2019, 7:38 am

Samsung Confirms Layoffs, Cancels Custom Chip Development

ExtremeTech

Last month, we covered rumors that Samsung’s custom chip and R&D teams in Austin, TX were being targeted for layoffs following the disappointing performance of Samsung’s own M3 and M4 CPU cores. That news has now been confirmed in a letter Samsung provided to the Texas Workforce Commission (companies are required by law to notify the state of mass layoffs).

The letter Samsung sent can be viewed here. It states that Samsung will close its CPU project at SARC and that the cancellation is expected to be permanent. The exact number of affected employees is unknown, but there are supposedly ~300 people working on the CPU dev team at SARC, and it isn’t clear whether any of them will transfer to other teams or be reassigned.

Samsung launched SARC in 2010 and made building its own custom CPU architecture a major prong of its long-term ARM strategy. Developing one’s own CPU cores is considerably more expensive than simply licensing ARM’s designs, and Samsung put in a major effort on these chips, shepherding the family through multiple products and strategies. In a statement to Android Authority, Samsung said:

“Based upon a thorough assessment of our System LSI [large scale integration – ed] business and the need to stay competitive in the global market, Samsung has decided to transition part of our U.S.-based R&D teams in Austin and San Jose.”

There will still be an M5 architecture to come, given the long lead times of these projects, but it seems unlikely to represent much of an improvement over the M4 if Samsung is pulling the plug on the entire endeavor. The M6 was supposedly an SMT architecture, though it’s not clear why Samsung went this route for mobile. Apart from Intel with its ill-fated Medfield, no one else has built an SMT-capable chip for a mobile CPU, at least not yet.

SARC will stay open, since Samsung develops multiple product lines at that facility, but Samsung itself will presumably transition back to using ARM’s own licensed cores, possibly with some custom IP baked into other aspects of the design. We know Samsung has licensed GPU technology from AMD, so we’ll likely still see custom silicon rolling out, just nothing related to full-on custom CPU cores. ARM’s licensees do have some flexibility in how they design and build CPU cores, with allowances for differently sized caches and different I/O blocks to serve different markets, but Samsung won’t be doing nearly as much of its own customization in the future. Then again, going the customization route doesn’t seem to have served Samsung all that well in the first place.

A few years ago, it seemed as though everyone was going the customization route, with Qualcomm, Samsung, and Apple all fielding their own hardware in at least some SKUs. Now Qualcomm and Samsung are both back to ARM-licensed cores. Apple, meanwhile, continues designing and building its own silicon.

Now Read:

November 4th 2019, 4:39 pm

ET Deals: Samsung Evo 512GB MicroSDXC $77, Logitech MX Master Wireless Mouse $47, Dell Vostro Intel

ExtremeTech

Today you can get a microSDXC card with an enormous amount of high-speed storage for just $77.

Samsung Evo Select 512GB MicroSDXC ($77.99)

In addition to its large 512GB capacity, this microSDXC card is also fairly fast and able to transfer data at up to 100MB/s. Currently, you can get it from Amazon marked down from $94.99 to $77.99.

Logitech MX Master Wireless Mouse ($47.00)

Logitech’s versatile MX Master mouse can be used either wired or wirelessly. It also has an exceptional battery that can last up to 40 days, and it can gain enough charge for a full day of use in just four minutes. Right now Amazon is offering it with a hefty discount that drops its price from $99.99 to $47.00.

Dell Vostro 3670 Intel Core i5-9400 Desktop w/ 8GB DDR4 RAM and 256GB SSD ($499.00)

This desktop comes equipped with a six-core processor that operates at up to 4.1GHz and offers solid performance for everyday tasks. This model also has 8GB of RAM and a 256GB SSD. Right now it’s marked down from $998.57 to $499.00 from Dell with promo code BFPEEK499.

Western Digital Elements 8TB USB 3.0 External HDD ($124.99)

This drive lets you carry around 8TB of data in a compact package and transfers data relatively quickly over USB 3.0. Right now it’s marked down from $179.99 to $124.99 on Amazon.

Dell UltraSharp 25 U2518D 25-Inch 2K IPS Monitor + $100 Dell eGift Card ($329.99)

This display was designed as an office solution with a native resolution of 2,560×1,440 and a built-in four-port USB 3.0 hub. Though it’s best suited for work, its 5ms response time also makes it a decent solution for after-hours gaming. Right now you can get it from Dell marked down from $449.99 to $329.99, and at that price it also comes with a $100 Dell eGift card.

23andMe DNA Test – Health + Ancestry Personal Genetic Service w/ Lab Fee Included ($99.00)

Taking a 23andMe DNA test can help you learn about your ancestry and also teach you about how your genetics affects your health. Right now you can get a test kit from Walmart marked down from $199.00 to $99.00.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.

Now read:

November 4th 2019, 3:28 pm

Microsoft’s Chromium-Based Edge Browser Officially Launches January 15, 2020

ExtremeTech

Microsoft has moved with surprising speed ever since it announced the switch to the Chromium browser engine. It spent years toiling in an attempt to make EdgeHTML work, but that engine could never compete with the speed and near-universal compatibility of Chromium. Microsoft launched test versions of Chromium Edge earlier this year, and now we have a final date (and logo) for the new browser. It’s coming your way January 15, 2020.

Microsoft is taking a more Chrome-like approach to releasing its new browser. The version from early 2019 was akin to Chrome’s Canary build, and Microsoft later added developer and beta channels. You can download the beta right now, and you’ll continue getting updates every six weeks. However, it’s the stable version that most users will see because it’ll be part of Windows 10 in the future. You’ll be able to download the final build when it’s done, but Microsoft’s early 2020 update will probably have the browser bundled. 

For years, Microsoft tried to sell Edge and its custom engine as the most power-efficient way to browse on Windows. However, Chrome has continued to dominate the market because it’s just faster. The new Chromium Edge will be faster, of course, but it’ll also bring features like a dark UI mode, web “Collections,” and tight integration with Microsoft services. If you’ve been hoping for easier access to Bing search, the new Edge has you covered. Microsoft isn’t completely dropping legacy support, though. The new Edge will include an Internet Explorer mode that lets you run old web apps that won’t work on more modern browsers. 

While Edge uses the same underlying code as Google’s Chrome, Microsoft hopes to attract users who want more privacy. In the default “balanced” mode, Edge will block trackers from sites you haven’t visited, plus anything Microsoft identifies as potentially harmful. You can turn it down to “basic” to only block those malicious trackers. On the “strict” setting, Edge will block almost all trackers, but that might break some sites. Microsoft also promises to better obfuscate tracking in private browsing mode. 

With its new browser, Microsoft is also creating a new icon. Until now, Chromium Edge has been using the classic “e” icon, which was carried over from Internet Explorer. To fully break with the past, Microsoft has designed an “e” logo that looks like an Internet Explorer x Tide Pods collaboration. It’s a big improvement over the classic logo, though. You’ll start seeing that icon appear in your taskbar next year.

Now read:

November 4th 2019, 3:28 pm

Diablo 4 Is Coming to PC, Xbox One, and PS4 Soon

ExtremeTech

Fresh off the PR firestorm surrounding its suspension of a Hearthstone player for supporting the Hong Kong protests, Blizzard has finally announced the long-awaited Diablo 4 at Blizzcon 2019. We don’t have many details on the game, but the 9-minute cinematic trailer should get Diablo fans sufficiently hyped. 

The Diablo franchise stretches all the way back to 1996, when the original game marked a significant advance in the hack-and-slash RPG genre. Diablo pioneered many elements of the modern RPG that we now take for granted, and both sequels have proved to be major successes in their own right.

For the uninitiated, the Diablo games take place in the world of Sanctuary, a realm created by an archangel for angels and devils who were sick of the eternal conflict between the High Heavens and the Burning Hells. 

In the newest installment, players will face the archdemon Lilith, who arrives in Sanctuary in grand fashion following the bloody sacrifice of three unlucky adventurers in the new trailer. The trailer, which you can see above, is over nine minutes long and features no gameplay footage. Instead, it’s a startlingly high-quality 3D animated prelude to the plot of the game. 

Following the cinematic trailer, Blizzard showed a much shorter gameplay trailer (below). Yes, it looks very pretty. It features just three classes: Sorceress, Druid, and Barbarian. There were many more classes in Diablo 3 at launch, and we expect to hear about more for Diablo 4 later on. 

Blizzard’s director for Diablo 4, Luis Barriga, took the stage at Blizzcon to talk about the game, but he didn’t get too specific. Barriga confirmed that the grim style of the trailer reflects the full game: you can expect a darker, more sinister experience in contrast to the brighter style of Diablo 3. The game will take players to never-before-explored corners of the Diablo universe, and there will be special PVP areas at launch. Combat has also been tightened up considerably compared to previous installments of the series.

Diablo 4 will launch on PC, Xbox One, and PS4 — sorry, no Switch version of this one. Blizzard doesn’t have a specific launch date in mind, but it shouldn’t be too far off. We’re getting on toward the end of this console generation, and Barriga specifically called out the current platforms.

Now read:

November 4th 2019, 1:24 pm