Microsoft Pulls Update Slowing Windows 10 PCs, Fall Update Coming Soon

I’ve updated several Windows 10 PCs to Chromium Edge, including testbeds, and I haven’t run across any issues doing so — but that doesn’t mean folks haven’t been having problems. Update KB4559309, which delivers the new Chromium-based Edge, apparently caused performance slowdowns for a number of people. That’s particularly relevant because KB4559309 can’t be uninstalled in the normal way.

Microsoft has released a new update, KB4576754, to resolve the boot-time issues introduced by its previous update. KB4576754 replaces KB4541301, KB4541302, and KB4559309, and we haven’t seen any reports of problems after swapping to the newer version. The specific problem people have been reporting relates to machine boot-up speed, so if you’ve noticed your machine restarting more slowly than it used to, this may be the cause.

Microsoft is prepping a Windows 10 update for October of this year, after pushing back the spring update due to COVID-19. The changes expected in the (presumably named) 2010 update are small; Microsoft has moved to a model in which the larger update arrives early in the year, while the smaller one lands in the fall.

This time around, we know users should expect a tweaked Start menu with semi-transparent backgrounds and new icons for built-in applications. Devices with detachable screens will now automatically switch to tablet mode when you remove the keyboard. The Your Phone app is being updated and will support a larger variety of Samsung products. You’ll also have the option to alt-tab through open Edge browser tabs. Finally, as previously reported, the “System” page is moving out of Control Panel and into Settings.

This type of shift has a lot of IT pros concerned, with good reason. Settings often doesn’t offer the same options as Control Panel, and not all advanced functions are available from within the newer app. It’s not that people have a love affair with Control Panel, as such, but we do need the replacement to actually provide the full functionality of the original.

As for reports of the update causing problems, we haven’t heard of any yet. That doesn’t mean it won’t — Microsoft’s twice-yearly updates always cause a few problems for somebody — but the smaller updates should theoretically cause fewer issues on the whole. Microsoft will probably have an update for the Xbox Live app, at least, to coincide with the actual launch of the Xbox Series X and Xbox Series S.


from ExtremeTech https://www.extremetech.com/computing/315243-microsoft-pulls-update-slowing-windows-10-pcs-fall-update-coming-soon

from Blogger http://componentplanet.blogspot.com/2020/09/microsoft-pulls-update-slowing-windows.html

Coroner calls for review into ‘toxic’ DNP diet pills after man’s death

Vaidotas Gerbutavicius told his dad he would be “dead in an hour” after taking 20 DNP pills.

from BBC News – London https://www.bbc.co.uk/news/uk-england-london-54234896

from Blogger http://componentplanet.blogspot.com/2020/09/coroner-calls-for-review-into-toxic-dnp.html

There’s No Such Thing as ‘Huang’s Law’

Over the past decade, Nvidia has more or less invented the modern AI and machine-learning market. The company continues to make remarkable strides generation-on-generation and Ampere’s performance per dollar is very good. Nvidia currently has no serious competition in the GPU AI market.

But — having said all that — there’s no such thing as “Huang’s Law.” That’s the appellation the Wall Street Journal‘s Christopher Mims has coined in honor of Nvidia CEO Jensen Huang.

So, what is Huang’s Law? Well, it’s a misunderstood definition of Moore’s Law, but with the name “Huang” in front of it instead of “Moore.” Specifically:

“I call it Huang’s Law, after Nvidia Corp. chief executive and co-founder Jensen Huang. It describes how the silicon chips that power artificial intelligence more than double in performance every two years.”

Why the Definition Doesn’t Work

Mims begins his explanation by conflating Moore’s Law with Dennard scaling. Moore’s Law predicted that the number of transistors on a chip would double every two years. Dennard scaling predicted that building smaller transistors closer together would reduce their power consumption and allow for faster clocks. Moore’s Law is a measure of density. Dennard scaling measures performance per watt. It’s true that these two distinct discoveries are often combined in colloquial conversation, but in this specific case, conflating the two obfuscates the truth of the situation.
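To make the difference concrete, here’s a minimal back-of-the-envelope sketch in Python. The scaling factor and the three-node loop are illustrative assumptions, not process data; the point is simply that classic Dennard scaling let density climb while power density stayed flat, and it’s the second half of that bargain that broke.

# Illustrative sketch of classic Dennard scaling (assumed numbers, not real process data).
# Shrinking linear dimensions by a factor k cuts capacitance C and voltage V by 1/k
# while frequency f rises by k. Dynamic power per transistor is roughly C * V^2 * f.
def dennard_step(c, v, f, k=1.4):
    return c / k, v / k, f * k

c, v, f, density = 1.0, 1.0, 1.0, 1.0
for node in range(1, 4):
    c, v, f = dennard_step(c, v, f)
    density *= 1.4 ** 2  # the Moore's Law side: ~2x transistors per unit area per node
    power_density = (c * v ** 2 * f) * density
    print(f"node {node}: density x{density:.1f}, power density x{power_density:.2f}")

# Density roughly doubles each node while power density holds steady. Once voltage
# stopped scaling (circa 2004), that second property vanished: clocks and TDPs
# stalled even though density kept climbing.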

Mims writes: “Moore’s law has slowed, and some say it’s over. But a different law, potentially no less consequential for computing’s next half century, has arisen.”

As we’ve discussed a few times on this website, the meaning of Moore’s Law is complex and prone to periodic shifts. If you mistakenly conflate Moore’s Law and Dennard scaling, Moore’s Law has slowed a great deal. If you strictly consider Moore’s Law as a measure of transistor density, it’s actually kept close to its long-term historical pace. This chart from 1970 – 2018 makes that quite clear. What broke was Dennard scaling, which ended in roughly 2004.

Absolute CPU clock speeds have increased very slowly since 2006, as have TDPs. Transistor scaling, in contrast, has continued at a brisk pace.

To back up his argument, Mims turns to Bill Dally, Senior VP of research at Nvidia:

Between November 2012 and this May, performance of Nvidia’s chips increased 317 times for an important class of AI calculations, says Bill Dally, chief scientist and senior vice president of research at Nvidia. On average, in other words, the performance of these chips more than doubled every year, a rate of progress that makes Moore’s Law pale in comparison.

Image by Nvidia via WSJ

I’m the one who labeled this image — the original lacks labels — but based on the timeline, these are the GPUs the graph is likely referring to. Pascal launched in May 2016, Volta was announced in May 2017, and Turing shipped in the back half of 2018.

I’m going to ignore the fact that “an important class of AI calculations” is literally not a metric and treat the 317x performance claim as truthful. That’s an enormous increase in performance. The only trouble is, Huang’s Law is self-evidently dependent on Moore’s Law + the remains of Dennard scaling + the chum bucket of additional technologies like FinFET that engineers dump into every node to squeeze reasonable improvements out of it.
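For what it’s worth, the back-of-the-envelope math behind “more than doubled every year” checks out, assuming the WSJ’s 317x figure and its November 2012 to May 2020 window:

# Sanity-check the quoted figure (the 317x number and the dates are the WSJ's, not mine).
years = 7.5                   # November 2012 to May 2020
annual = 317 ** (1 / years)   # implied average yearly gain, roughly 2.15x
moore = 2 ** (1 / 2)          # ~1.41x per year if Moore's Law means "2x every two years"
print(f"implied gain: {annual:.2f}x per year vs. {moore:.2f}x for a strict two-year doubling")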

If you check the chart, most of Nvidia’s performance improvements are tied specifically to node transitions. Nvidia has significantly improved performance without a node transition twice in recent history: first from Kepler to Maxwell (the first tiny bump just before 2015) and then from Volta to Turing. But as good as Nvidia is at wringing additional performance from the same node, you can also see how important new process nodes have been to Nvidia’s overall performance. Huang’s Law, if it existed, could not be a replacement for Moore’s Law. Huang’s Law is enabled by Moore’s Law.

As the benefits of node transitions shrink, the rate of AI performance improvement is going to slow.

Why Huang’s Law Doesn’t Exist

First, the existence of an independent Huang’s Law is an illusion. Despite Dally’s comments about moving well ahead of Moore’s Law, it would be far more accurate to say “Nvidia has taken advantage of Moore’s Law to boost transistor density, while simultaneously improving total device performance at an effectively faster rate than Dennard scaling alone would have predicted.”

Huang’s Law can’t exist independently of Moore’s Law. If Moore’s Law is in trouble, whether in terms of transistor scaling or the loosely defined performance improvements bundled into it, then Huang’s Law is, too. TSMC has forecast only limited performance improvements at 5nm and below, and that’s going to have an impact on how much performance each new generation of product can deliver. This is going to put more pressure on Nvidia’s engineers to squeeze better performance out at a per-transistor level, and humans aren’t actually very good at that.

Second, it’s too early to make this kind of determination. When Gordon Moore published his first paper in 1965, he examined the time period from 1959 – 1964. Later, in 1975, he revised his projection, increasing the expected doubling time from one year to two. That same year, Caltech professor Carver Mead popularized the term “Moore’s Law.” By the time he did, the “law” had been in effect for about 16 years. If we look at the WSJ’s representation of Nvidia’s timeline, either Pascal or Volta was the first GPU to really offer any kind of useful AI/ML performance. “Huang’s Law” is all of 3-4 years old. Even if we use Dally’s 2012 starting point, it’s just eight years old. It’s a premature declaration.

Third, it’s not clear that AI/ML improvement can continue to grow at its present rate, even if we assume Moore’s Law improvements continue to deliver substantial benefits. Adding support for features like FP16 and INT8 allows AMD, Nvidia, and Intel to increase AI performance by executing more instructions in a single clock cycle, but not every type of workload delivers suitable results this way and there aren’t an infinite number of useful, ever-smaller low-precision targets to choose from. Over the last few years, manufacturers have been very busy picking low-hanging fruit. Eventually, we’re going to run out. We can’t subdivide a floating-point standard down to “FP0.0025” in an attempt to build a hyper-efficient neural net. Amazon, Google, Facebook, and similar companies do not have an infinite amount of space to devote to building ever-larger AI networks.
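As a rough illustration of why those low-precision formats help, and why the ladder runs out, here’s a small Python/NumPy sketch. The array size and the naive quantization scheme are arbitrary choices for demonstration, not anything a particular GPU actually does:

import numpy as np

# Each step down in precision roughly doubles how many values fit in the same registers,
# caches, and memory transactions, which is where much of the "free" AI speedup comes from.
weights_fp32 = np.random.randn(1024).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

# Naive symmetric INT8 quantization: scale into [-127, 127] and round.
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.clip(np.round(weights_fp32 / scale), -127, 127).astype(np.int8)

for name, arr in [("FP32", weights_fp32), ("FP16", weights_fp16), ("INT8", weights_int8)]:
    print(f"{name}: {arr.nbytes} bytes for 1,024 weights")

# There's no "FP0.0025" waiting below INT8; the useful low-precision targets thin out fast.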

Consider the smartphone. Ten years ago, it was not unusual for a new smartphone to double or nearly-double the performance of its predecessor, to say nothing of the visual upgrade once “Retina” displays hit the market. That doesn’t happen any longer. The rate of improvement, which was meteoric in the beginning, has slowed.

If a person had proposed a “Jobs’ Law” of smartphone performance improvement back in late 2010 based on the rate of improvement from the iPhone -> iPhone 3G -> iPhone 3GS -> iPhone 4, they would look pretty silly in 2020.

Again, this is not some knock against Nvidia. The AI/ML market has exploded quickly, with dozens of companies working on silicon, and Nvidia has led the entire industry. Jensen Huang is an incredibly successful CEO. But with Dennard scaling gone, low-hanging fruit being quickly gathered, and TSMC warning of fewer performance improvements on future nodes, it’s premature to declare that anyone has established any kind of law governing long-term performance growth. Dennard scaling lasted for decades. Moore’s Law (again, strictly defined in terms of density) is still chugging along 61 years later.

I say we give it a decade. If Huang’s Law is a real thing now, it’ll still be a real thing in 2030. If it isn’t, it never existed in the first place. No matter what the answer is, Jensen Huang will still go down as one of the business leaders who pioneered artificial intelligence and machine learning.


from ExtremeTech https://www.extremetech.com/computing/315277-theres-no-such-thing-as-huangs-law

from Blogger http://componentplanet.blogspot.com/2020/09/theres-no-such-thing-as-huangs-law.html

Astronomers Find ‘Pi Planet’ With 3.14-Day Orbit

The longer we study the universe, the more exoplanets we find. Many of these discoveries are notable because of how Earth-like they are or because of the number of planets crammed into a single solar system. The rocky planet K2-315b, on the other hand, is notable because of its orbital period. It takes 3.14 Earth days to complete an orbit of its star. Astronomers have therefore dubbed it “pi planet.”

The planet’s name gives a hint of its origins. This is the 315th exoplanet discovered in the data from the Kepler K2 mission. K2 was the second phase of Kepler’s life after two of its reaction wheels failed, limiting its ability to remain pointed in any one direction. That was a problem for its planet-hunting activities, but NASA managed to partially revive the telescope by using the pressure of sunlight to stabilize it along several parts of its orbit.

Kepler used the transit method to find planets, which requires scanning distant stars for long periods of time to monitor for dips in light. Those dips can signal that a planet has passed in front of the star (in this case, one known as EPIC 249631677). Because larger planets block more light and close-in planets transit more often, the transit method is best at detecting big planets that orbit close to their star. Even though Kepler shut down some time ago, teams like the one at MIT are still poring over its data in search of new planets like K2-315b.
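To give a sense of how small those dips are for a planet this size, here’s a rough Python estimate of the transit depth. The host star’s radius below is my assumption (a cool dwarf around a fifth the Sun’s size), not a figure from the discovery paper:

# The fractional dip in starlight during a transit is roughly (R_planet / R_star) squared.
R_SUN_KM = 696_000
R_EARTH_KM = 6_371

r_planet = 0.95 * R_EARTH_KM   # K2-315b's reported radius, about 0.95 Earth radii
r_star = 0.20 * R_SUN_KM       # assumed: a cool dwarf roughly 1/5 the Sun's radius

depth = (r_planet / r_star) ** 2
print(f"transit depth: {depth:.3%} of the star's light")

# For comparison, a Jupiter-sized planet crossing a Sun-like star blocks about 1 percent,
# which is why the transit method favors big planets on tight orbits.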

This new world is closer to its star than Mercury is to the sun, but it’s otherwise potentially Earth-like. The upshot, of course, is that its year is just 3.14 Earth days. Yes, it’s an arbitrary human-centric measurement, but it’s still fun. 

Image: How to use sunlight (photon pressure) as Kepler's third reaction wheel

Astronomers estimate K2-315b to be 0.95 Earth radii, but the team hasn’t determined the mass yet. Regardless, it’s not looking like a pleasant environment. Because K2-315b is so close to the star, it has a surface temperature of about 350 degrees Fahrenheit (176 degrees Celsius). As MIT points out, that’s hot enough to bake real pies. 

Data from the K2 mission is often not enough to confirm a planet on its own. The MIT researchers used the SPECULOOS telescope array, which consists of five 1-meter telescopes (four in Chile and one on the largest of the Canary Islands). After nailing down a time when they were likely to catch a transit, the team pointed the array at EPIC 249631677. Sure enough, they spotted the pi planet with its coincidental orbit. 

The star is about 185 light-years away, which isn’t far in the grand scheme. Future instruments like the James Webb Space Telescope might be able to get a better look at this rocky mathematical happenstance.


from ExtremeTech https://www.extremetech.com/extreme/315256-astronomers-find-pi-planet-with-3-14-day-orbit

from Blogger http://componentplanet.blogspot.com/2020/09/astronomers-find-pi-planet-with-314-day.html

Gareth Bale could stay longer than season at Tottenham, says forward’s agent

Gareth Bale could stay at Tottenham for longer than the season-long loan the club have agreed with Real Madrid, says his agent.

from BBC News – London https://www.bbc.co.uk/sport/football/54243742

from Blogger http://componentplanet.blogspot.com/2020/09/gareth-bale-could-stay-longer-than.html

ET Deals: Samsung Q60 75-Inch QLED 4K TV for $1,199, Half-off Dell Vostro 14 5401 Intel Core i7 Laptop

Today only, you can save $300 on a large 75-inch 4K Samsung TV. There’s also an incredible deal on a Dell Vostro laptop with a Core i7 processor, available for half price.

Samsung QN75Q60RAFXZA Q60 QLED 75-Inch 4K TV ($1,199.99)

Samsung designed this TV with a large-format 75-inch 4K panel, giving you plenty of screen to enjoy your favorite shows on. The TV is also equipped with Samsung’s Quantum AI Processor 4K, which helps to upscale content to 4K, and it can stream content wirelessly from dozens of sources. For today only, you can get one on sale from Amazon marked down from $1,499.99 to $1,199.99.

Dell Vostro 14 5401 Intel Core i7-1065G7 14-Inch 1080p Laptop w/ Nvidia GeForce MX330 GPU, 8GB DDR4 RAM and 256GB NVMe SSD ($737.09)

Dell’s Vostro 14 5401 is currently an exceptional deal, as Dell has heavily discounted the system from $1,498.57 to just $737.09 with promo code STAND4SMALL. For this price, you get a very capable PC with an Intel Core i7-1065G7 processor and an Nvidia GeForce MX330 GPU. The processor offers excellent performance for everyday tasks, while the GPU is less powerful but should work well for a little light gaming.

Gigabyte G27F 27-Inch 1080p 144Hz Gaming Monitor ($199.99)

Gigabyte designed this gaming monitor with support for FreeSync technology, which can help create a smooth gaming experience. The display can also run at a fast 144Hz refresh rate, making it an excellent option for gaming, and it covers 125 percent of the sRGB color space, making it a fitting solution for image editing work as well. Overall it’s a very well-rounded display, and it’s now on sale from Newegg marked down from $249.99 to just $199.99 with promo code 9SMARTHM54 and a $30 mail-in rebate.

Dell Inspiron 3880 Intel Core i5-10400 Desktop w/ 8GB DDR4 RAM and 512GB NVMe SSD ($522.89)

This desktop features a fast Core i5 processor and 8GB of DDR4 RAM, which makes it an excellent option for a speedy office PC. It should be able to run multiple applications at the same time with ease, and it also has a fast 512GB SSD. Currently, you can get this system from Dell marked down from $634.98 to $522.89 with promo code SAVE17.

Western Digital Black SN750 500GB M.2 NVMe SSD ($69.99)

This WD M.2 SSD has a capacity of 500GB, and it can transfer data at a rate of up to 3,430MB/s. This makes it significantly faster than a 2.5-inch SSD, and it’s also fairly inexpensive, marked down at Amazon from $129.99 to $69.99.

Sega Genesis Mini Console ($47.51)

Sega’s miniature Genesis console comes loaded with 42 classic Sega Genesis games, including hits like Sonic The Hedgehog, Castlevania, and Altered Beast. The system also comes with two controllers for playing games with a friend, and right now you can get it with a discount that drops the price from $79.99 to just $47.51. At a time when mainstream game consoles like the Xbox, PlayStation, and Nintendo Switch are selling out everywhere, this is an exceptional deal.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.


from ExtremeTech https://www.extremetech.com/deals/315265-et-deals-samsung-q60-75-inch-qled-4k-tv-for-1199-half-off-dell-vostro-14-5401-intel-core-i7-laptop

from Blogger http://componentplanet.blogspot.com/2020/09/et-deals-samsung-q60-75-inch-qled-4k-tv.html