Google

20 Years of Gmail (theverge.com) 86

Victoria Song reports via The Verge: When Gmail launched with a goofy press release 20 years ago next week, many assumed it was a hoax. The service promised a gargantuan 1 gigabyte of storage, an excessive quantity in an era of 15-megabyte inboxes. It claimed to be completely free at a time when many inboxes were paid. And then there was the date: the service was announced on April Fools' Day, portending some kind of prank. But soon, invites to Gmail's very real beta started going out -- and they became a must-have for a certain kind of in-the-know tech fan. At my nerdy high school, having one was your fastest ticket to the cool kids' table. I remember trying to track one down for myself. I didn't know whether I actually needed Gmail, just that all my classmates said Gmail would change my life forever.

Teenagers are notoriously dramatic, but Gmail did revolutionize email. It reimagined what our inboxes were capable of and became a central part of our online identities. The service now has an estimated 1.2 billion users -- about 1/7 of the global population -- and these days, it's a practical necessity to do anything online. It often feels like Gmail has always been here and always will be. But 20 years later, I don't know anyone who's champing at the bit to open up Gmail. Managing your inbox is often a chore, and other messaging apps like Slack and WhatsApp have come to dominate how we communicate online. What was once a game-changing tool sometimes feels like it's been sidelined. In another 20 years, will Gmail still be this central to our lives? Or will it -- and email -- be a thing of the past?

Software

Proxmox Import Wizard Makes for Easy VMware VM Migrations (storagereview.com) 39

Lyle Smith reports via StorageReview.com: Proxmox has introduced a new import wizard for Proxmox Virtual Environment (VE), aiming to simplify the process of importing VMware ESXi VMs. The new feature comes at an important time for the industry, easing the transition for organizations looking to move away from VMware's vSphere due to high renewal costs.

The new import wizard is integrated into Proxmox VE's existing storage plugin system, allowing for direct integration into the platform's API and web-based user interface. It offers users the ability to import VMware ESXi VMs in their entirety, translating most of the original VM's configuration settings to Proxmox VE's configuration model (all while minimizing downtime). Currently, the import wizard is in a technical preview state, having been added during the Proxmox VE 8.2 development cycle. Although it is still under active development, early reports suggest the wizard is stable and holds considerable promise for future enhancements, including the planned addition of support for other import sources like OVF/OVA files. [...]

This tool represents Proxmox's commitment to providing accessible, open-source virtualization solutions. By leveraging the official ESXi API and implementing a user space filesystem with optimized read-ahead caching in Rust (a safe, fast, and modern programming language ideal for system-level tasks), Proxmox aims to ensure that this new feature can be integrated smoothly into its broader ecosystem.
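
Proxmox hasn't published the importer's internals beyond that description, but the read-ahead idea it mentions is easy to illustrate. Below is a minimal, hypothetical sketch in Python (the real implementation is a user-space filesystem written in Rust, not this code): on a cache miss, fetch one large aligned chunk so that the many small sequential reads typical of a VM disk import are served from memory.

    # Illustrative only -- not Proxmox code. Shows the read-ahead caching idea:
    # on a cache miss, pull in a large aligned chunk so subsequent sequential
    # reads of a remote disk image hit memory instead of the network.
    class ReadAheadFile:
        def __init__(self, fileobj, chunk_size=4 * 1024 * 1024):
            self.f = fileobj                # underlying (slow, remote) file object
            self.chunk_size = chunk_size
            self.cache_offset = None        # start offset of the cached chunk
            self.cache = b""

        def read(self, offset, length):
            end = offset + length
            missed = (self.cache_offset is None
                      or offset < self.cache_offset
                      or end > self.cache_offset + len(self.cache))
            if missed:
                self.cache_offset = (offset // self.chunk_size) * self.chunk_size
                self.f.seek(self.cache_offset)
                self.cache = self.f.read(max(self.chunk_size, end - self.cache_offset))
            start = offset - self.cache_offset
            return self.cache[start:start + length]

Wrapping a remote-backed disk image in something like this means a reader issuing 64 KB requests only touches the backing store once per 4 MB chunk.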

Cloud

Cloud Server Host Vultr Rips User Data Ownership Clause From ToS After Web Outage (theregister.com) 28

Tobias Mann reports via The Register: Cloud server provider Vultr has rapidly revised its terms-of-service after netizens raised the alarm over broad clauses that demanded the "perpetual, irrevocable, royalty-free" rights to customer "content." The red tape was updated in January, as captured by the Internet Archive, and this month users were asked to agree to the changes by a pop-up that appeared when using their web-based Vultr control panel. That prompted folks to look through the terms, and there they found clauses granting the US outfit a "worldwide license ... to use, reproduce, process, adapt ... modify, prepare derivative works, publish, transmit, and distribute" user content.

It turned out these demands have been in place since before the January update; customers have only just noticed them now. Given Vultr hosts servers and storage in the cloud for its subscribers, some feared the biz was giving itself way too much ownership over their stuff, all in this age of AI training data being put up for sale by platforms. In response to online outcry, largely stemming from Reddit, Vultr in the past few hours rewrote its ToS to delete those asserted content rights. CEO J.J. Kardwell told The Register earlier today it's a case of standard legal boilerplate being taken out of context. The clauses were supposed to apply to customer forum posts, rather than private server content, and while, yes, the terms make more sense with that in mind, one might argue the legalese was overly broad in any case.

"We do not use user data," Kardwell stressed to us. "We never have, and we never will. We take privacy and security very seriously. It's at the core of what we do globally." [...] According to Kardwell, the content clauses are entirely separate to user data deployed in its cloud, and are more aimed at one's use of the Vultr website, emphasizing the last line of the relevant fine print: "... for purposes of providing the services to you." He also pointed out that the wording has been that way for some time, and added the prompt asking users to agree to an updated ToS was actually spurred by unrelated Microsoft licensing changes. In light of the controversy, Vultr vowed to remove the above section to "simplify and further clarify" its ToS, and has indeed done so. In a separate statement, the biz told The Register the removal will be followed by a full review and update to its terms of service.
"It's clearly causing confusion for some portion of users. We recognize that the average user doesn't have a law degree," Kardwell added. "We're very focused on being responsive to the community and the concerns people have and we believe the strongest thing we can do to demonstrate that there is no bad intent here is to remove it."
Facebook

Facebook Allegedly Killed Its Own Streaming Service To Help Sell Netflix Ads (gizmodo.com) 14

An anonymous reader quotes a report from Gizmodo: Do you remember Facebook Watch? Me neither. Mark Zuckerberg's short-lived streaming service never really got off the ground, but court filings unsealed in Meta's antitrust lawsuit claim "Watch" was kneecapped starting in 2018 to protect Zuckerberg's advertising relationship with Netflix CEO Reed Hastings. "For nearly a decade, Netflix and Facebook enjoyed a special relationship," said plaintiffs in filings (PDF) made public on Saturday. "It is no great mystery how this close partnership developed, and who was its steward: from 2011-2019, Netflix's then-CEO Hastings sat on Facebook's board and personally directed the companies' relationship."

The filings detail Hastings' uncomfortably close relationship with Meta's upper management, including Zuckerberg and Sheryl Sandberg. During these years, Netflix was allegedly granted special access to Facebook users' private message inboxes, among other privileged analytics tools, in exchange for hundred-million-dollar advertising deals. This gave Facebook greater dominance in its all-important ad division, plaintiffs allege, so the company was content to retreat from Netflix's streaming territory by shuttering Watch. In 2017, Facebook Watch began signing deals to populate its streaming service with original TV shows from movie stars such as Bill Murray. A year later, the service attempted to license the popular '90s TV show Dawson's Creek. Facebook Watch had meaningful reach on the home screen of the social media platform, and an impressive budget as well. Facebook and Netflix appeared ready to butt heads in the streaming world, and the Netflix cofounder found himself in the middle as a Facebook board member. [...]

Netflix was a large advertiser to Facebook, and plaintiffs allege Zuckerberg shuttered its promising Watch platform for the sake of the greater advertising business. In May 2018, Zuckerberg personally emailed the head of Facebook Watch, Fidji Simo, to tell her the division's budget was being slashed by $750 million, just two years after Watch's launch, according to court filings. The sudden pivot meant Facebook was now dismantling the streaming business it had spent the last two years growing. During this time period, Netflix increased its ad spend on Facebook to roughly $150 million a year and allegedly entered into agreements for increased data analytics. By early 2019, the ad spend increased to roughly $200 million a year. Hastings left Facebook's board later in 2019.

UPDATE: Meta (Again) Denies Netflix Read Facebook Users' Private Messenger Messages.
AI

The AI Boom is Sending Silicon Valley's Talent Wars To New Extremes (wsj.com) 26

Tech companies are serving up million-dollar-a-year compensation packages, accelerated stock-vesting schedules and offers to poach entire engineering teams to draw people with expertise and experience in the kind of generative AI that is powering ChatGPT and other humanlike bots. They are competing against each other and against startups vying to be the next big thing to unseat the giants. From a report: The offers stand out even by the industry's relatively lavish past standards of outsize pay and perks. And the current AI talent shortage stands out for another reason: It is happening as layoffs are continuing in other areas of tech and as companies have been reallocating resources to invest more in covering the enormous cost of developing AI technology.

"There is a secular shift in what talents we're going after," says Naveen Rao, head of Generative AI at Databricks. "We have a glut of people on one side and a shortage on the other." Databricks, a data storage and management startup, doesn't have a problem finding software engineers. But when it comes to candidates who have trained large language models, or LLMs, from scratch or can help solve vexing problems in AI, such as hallucinations, Rao says there might be only a couple of hundred people out there who are qualified.

Some of these hard-to-find, tier-one candidates can easily get total compensation packages of $1 million a year or more. Salespeople in AI are also in demand and hard to find. Selling at the beginning of a technology transition when things are changing rapidly requires a different skill set and depth of knowledge. Candidates with those skills are making around double what an enterprise software salesperson would. But that isn't the norm for most people working in AI, Rao says. For managerial roles in AI and machine learning, base-pay increases ranged from 5% to 11% from April 2022 to April 2023, according to a WTW survey of more than 1,500 employers. The base-pay increases of nonmanagerial roles ranged from 13% to 19% during the same period.

Bug

macOS Sonoma 14.4 Bug 'Destroys Saved Versions In iCloud Drive' (macrumors.com) 32

The macOS Sonoma 14.4 update introduces a bug affecting iCloud Drive's versioning system, where users with "Optimize Mac Storage" enabled can lose all previous versions of a file removed from local storage. MacRumors reports: Versions are normally created automatically when users save files using apps that work with the version system in macOS. According to The Eclectic Light Company's Howard Oakley, users running macOS 14.4 that have "Optimize Mac Storage" enabled should be aware that they are at risk of losing all previously saved versions of a file if they opt to remove it from iCloud Drive local storage: "In previous versions of macOS, when a file is evicted from local storage in iCloud Drive [using the Remove Download option in the right-click contextual menu], all its saved versions have been preserved. Download that file again from iCloud Drive, and versions saved on that Mac (but not other Macs or devices) have remained fully accessible. Do that in 14.4, and all previous versions are now removed, and lost forever."

Oakley said his own tests confirmed that this behavior does not happen in macOS Sonoma 14.3 or macOS Ventura, so it is exclusive to macOS 14.4. For users who have already updated, he suggests either not saving files to iCloud Drive at all, or turning off Optimize Mac Storage. To perform the latter in System Settings, click your Apple ID, select iCloud, and then toggle off the switch next to "Optimize Mac Storage." You may need to perform this action twice -- reports suggest it can turn back on by itself. For a more exhaustive account of the problem, see Oakley's subsequent post.

Piracy

BitTorrent Is No Longer the 'King' of Upstream Internet Traffic (torrentfreak.com) 37

An anonymous reader quotes a report from TorrentFreak: Back in 2004, in the pre-Web 2.0 era, research indicated that BitTorrent was responsible for an impressive 35% of all Internet traffic. At the time, file-sharing via peer-to-peer networks was the main traffic driver as no other services consumed large amounts of bandwidth. Fast-forward two decades and these statistics are ancient history. With the growth of video streaming, including services such as YouTube, Netflix, and TikTok, file-sharing traffic is nothing more than a drop in today's data pool. [...]

This week, Canadian broadband management company Sandvine released its latest Global Internet Phenomena Report which makes it clear that BitTorrent no longer leads any charts. The latest data show that video and social media are the leading drivers of downstream traffic, accounting for more than half of all fixed access and mobile data worldwide. Needless to say, BitTorrent is nowhere to be found in the list of 'top apps'. Looking at upstream traffic, BitTorrent still has some relevance on fixed access networks where it accounts for 4% of the bandwidth. However, it's been surpassed by cloud storage apps, FaceTime, Google, and YouTube. On mobile connections, BitTorrent no longer makes it into the top ten. The average of 46 MB upstream traffic per subscriber shouldn't impress any file-sharer. However, since only a small percentage of all subscribers use BitTorrent, the upstream traffic per actual BitTorrent user is of course much higher.
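
To make that last point concrete, here is the back-of-the-envelope version; the share of subscribers actually running BitTorrent is an assumption for illustration, not a figure from Sandvine's report.

    # Average upstream is spread across all subscribers, so traffic per actual
    # BitTorrent user scales with 1 / (share of users). The 2% share below is
    # an assumed value, not from the report.
    avg_upstream_mb = 46
    assumed_bt_share = 0.02

    per_bt_user_mb = avg_upstream_mb / assumed_bt_share
    print(f"~{per_bt_user_mb:.0f} MB upstream per BitTorrent user")   # ~2300 MB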

Math

Pi Calculated to 105 Trillion Digits. (Stored on 1 Petabyte of SSDs) (solidigm.com) 95

Pi was calculated to 100 trillion decimal places in 2022 by a Google team led by cloud developer advocate Emma Haruka Iwao.

But 2024's "pi day" saw a new announcement... After successfully breaking the speed record for calculating pi to 100 trillion digits last year, the team at StorageReview has taken it up a notch, calculating Pi out to 105 trillion digits! Spoiler: the 105 trillionth digit of Pi is 6!

Owner and Editor-in-Chief Brian Beeler led the team that used 36 Solidigm SSDs (nearly a petabyte) for the capacity and reliability required to store the calculated digits of Pi. Although there is no practical application for this many digits, the exercise underscores the astounding capabilities of modern hardware and marks an achievement in computational and storage technology...

For an undertaking of this size, which took 75 days, the role of storage cannot be overstated. "For the Pi computation, we're entirely restricted by storage," says Beeler. "Faster CPUs will help accelerate the math, but the limiting factor to many new world records is the amount of local storage in the box. For this run, we're again leveraging Solidigm D5-P5316 30.72TB SSDs to help us get a little over 1P flash in the system."

"These SSDs are the only reason we could break through the prior records and hit 105 trillion Pi digits."

"Leveraging a combination of open-source and proprietary software, the team at StorageReview optimized the algorithmic process to fully exploit the hardware's capabilities, reducing computational time and enhancing efficiency," Beeler says in the announcement.

There's a video on YouTube where the team discusses their effort.
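
The record runs depend on specialized arbitrary-precision software and that petabyte of flash for scratch space. As a toy-scale illustration of the same kind of computation (using Python's mpmath library, which is not what StorageReview used), you can print Pi to a hundred-odd digits in a few lines:

    # Toy-scale version of the same idea: arbitrary-precision computation of Pi.
    # The record run needed ~1 PB of SSD scratch space; this prints 105 digits
    # rather than 105 trillion.
    from mpmath import mp

    mp.dps = 105       # decimal digits of working precision
    print(mp.pi)       # 3.14159265358979... (105 digits)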
Power

Caffeine Makes Fuel Cells More Efficient, Cuts Cost of Energy Storage (theregister.com) 40

Dan Robinson reports via The Register: Adding caffeine can enhance the efficiency of fuel cells, reducing the need for platinum in electrodes and significantly cutting the cost of making them, according to researchers in Japan. [...] The study, published in the journal Communications Chemistry, concerns the catalysis process at the cathode of a fuel cell and making this reaction more efficient. Fuel cells work somewhat like batteries. They generate power by converting the chemical energy of a fuel and an oxidizing agent into electricity. This is typically hydrogen as a fuel and oxygen as an oxidizer. Unlike batteries with limited lifespans, fuel cells can generate power as long as fuel is supplied. The hydrogen undergoes oxidation at the anode, producing hydrogen ions and electrons. The ions move through the electrolyte to the cathode, while the electrons flow through an external circuit, generating electricity. At the cathode, oxygen combines with the hydrogen ions and electrons, resulting in water as a by-product. However, this water impacts the performance of the fuel cell, reacting with the platinum (Pt) to form a layer of platinum hydroxide (PtOH) on the electrode and interfering with the catalysis of the oxygen reduction reaction (ORR), according to the researchers.

To maintain efficient operation, fuel cells require a high Pt loading (greater platinum content), which significantly ups their cost. A quick look online found market prices for platinum of $29.98 per gram, or $932.61 per ounce, at the time of writing. The researchers found that adding caffeine can improve the ORR activity of platinum electrodes 11-fold, making the reaction more efficient. If you are wondering (as we were) how they came to be experimenting with this, the paper explains that modifying electrodes with hydrophobic material is known to be an effective method for enhancing ORR. Caffeine is less toxic than other hydrophobic substances, and it activates the hydrogen evolution and oxidation reactions of Pt nanoparticles and caffeine-doped carbons. Got that?
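
As an aside, the two platinum prices quoted are consistent once you remember precious metals are priced per troy ounce; a quick check (the troy-ounce conversion factor is the only assumption here):

    # Precious metals are quoted per troy ounce (31.1035 g), not the 28.35 g
    # avoirdupois ounce.
    price_per_gram = 29.98
    grams_per_troy_oz = 31.1035

    print(price_per_gram * grams_per_troy_oz)   # ~932.5, roughly the quoted $932.61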

Chiba University's work was led by Professor Nagahiro Hoshi at the Department of Applied Chemistry and Biotechnology. He explained that the researchers found a notable improvement in the electrode's ORR activity with an increase in caffeine concentration in the electrolyte. This forms a thin layer on the electrode's surface, effectively preventing the formation of PtOH, but the effect depends on the orientation of the platinum atoms on the electrode's surface. The paper refers to these as Pt(100), Pt(110) and Pt(111), with the latter two showing increased ORR activity, while there was no noticeable effect with Pt(100). The researchers do not explain if this latter effect might be a problem, but instead claim that their discovery has the potential to improve the designs of fuel cells and lead to more widespread adoption.

Software

Ethereum Network Completes Cost-Cutting 'Dencun' Software Upgrade (bloomberg.com) 15

Ethereum has successfully completed a major software upgrade that should make using the blockchain network ecosystem cheaper. The update enables transactions that previously cost $1 on linked Layer 2 networks such as Arbitrum, Polygon, and Coinbase's Base to be executed for just a cent.

The Dencun upgrade, a combination of the "Deneb" and "Cancun" portions of the update, introduces a new data storage system for Ethereum. Currently, most Layer 2 blockchains store data on Ethereum, and because this data is stored permanently on every Ethereum node, storage costs often account for around 90% of Layer 2 expenses. These costs are typically passed on to applications, which in turn charge consumers. With Dencun, Layer 2s can now store data in a new type of repository called blobs, which will be cheaper as the data will only be stored for approximately 18 days.
AI

"We Asked Intel To Define 'AI PC.' Its reply: 'Anything With Our Latest CPUs'" (theregister.com) 35

An anonymous reader shares a report: If you're confused about what makes a PC an "AI PC," you're not alone. But we finally have something of an answer: if it packs a GPU, a processor that boasts a neural processing unit and can handle VNNI and DP4a instructions, it qualifies -- at least according to Robert Hallock, Intel's senior director of technical marketing. As luck would have it, that combo is present in Intel's current-generation desktop processors -- 14th-gen Core, aka Core Ultra, aka "Meteor Lake." All models feature a GPU and an NPU, and can handle Vector Neural Network Instructions (VNNI) that speed some -- surprise! -- neural networking tasks, as well as the DP4a instructions that help GPUs to process video.

Because AI PCs are therefore just PCs with current processors, Intel doesn't consider "AI PC" to be a brand that denotes conformity with a spec or a particular capability not present in other PCs. Intel used the "Centrino" brand to distinguish Wi-Fi-enabled PCs, and did likewise by giving home entertainment PCs the "Viiv" moniker. Chipzilla still uses the tactic with "vPro" -- a brand that denotes processors that include manageability and security for business users. But AI PCs are neither a brand nor a spec. "The reason we have not created a category for it like Centrino is we believe this is simply what a PC will be like in four or five years time," Hallock told The Register, adding that Intel's recipe for an AI PC doesn't include specific requirements for memory, storage, or I/O speeds. "There are cases where a very large LLM might require 32GB of RAM," he noted. "Everything else will fit comfortably in a 16GB system."
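
There is no "AI PC" badge to query for, but on Linux you can at least check whether a CPU advertises VNNI by reading its feature flags. This is only a rough heuristic sketch -- flag names vary by CPU generation, and DP4a is a GPU instruction that won't appear in /proc/cpuinfo at all.

    # Rough heuristic: look for AVX-VNNI / AVX512-VNNI feature flags on Linux.
    # Says nothing about the NPU or the GPU's DP4a support, so it covers only
    # part of Intel's informal definition.
    def cpu_has_vnni(path="/proc/cpuinfo"):
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags = set(line.split(":", 1)[1].split())
                    return bool({"avx_vnni", "avx512_vnni"} & flags)
        return False

    print("VNNI flags present:", cpu_has_vnni())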

Transportation

Automakers Are Sharing Consumers' Driving Behavior With Insurance Companies (nytimes.com) 229

An anonymous reader quotes a report from the New York Times: Kenn Dahl says he has always been a careful driver. The owner of a software company near Seattle, he drives a leased Chevrolet Bolt. He's never been responsible for an accident. So Mr. Dahl, 65, was surprised in 2022 when the cost of his car insurance jumped by 21 percent. Quotes from other insurance companies were also high. One insurance agent told him his LexisNexis report was a factor. LexisNexis is a New York-based global data broker with a "Risk Solutions" division that caters to the auto insurance industry and has traditionally kept tabs on car accidents and tickets. Upon Mr. Dahl's request, LexisNexis sent him a 258-page "consumer disclosure report," which it must provide per the Fair Credit Reporting Act. What it contained stunned him: more than 130 pages detailing each time he or his wife had driven the Bolt over the previous six months. It included the dates of 640 trips, their start and end times, the distance driven and an accounting of any speeding, hard braking or sharp accelerations. The only thing it didn't have is where they had driven the car. On a Thursday morning in June for example, the car had been driven 7.33 miles in 18 minutes; there had been two rapid accelerations and two incidents of hard braking.
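
None of the companies involved disclose how these events are actually detected or scored. Purely as a hypothetical illustration of how "rapid acceleration" and "hard braking" counts can fall out of periodic speed samples -- the data format, thresholds, and logic below are invented, not GM's or LexisNexis's method:

    # Hypothetical illustration only. Thresholds are in mph per second and are
    # assumed values; real telematics systems are not documented publicly.
    def count_events(speeds_mph, interval_s=1.0, accel_thresh=9.0, brake_thresh=-9.0):
        rapid_accel = hard_brake = 0
        for prev, cur in zip(speeds_mph, speeds_mph[1:]):
            rate = (cur - prev) / interval_s
            if rate >= accel_thresh:
                rapid_accel += 1
            elif rate <= brake_thresh:
                hard_brake += 1
        return rapid_accel, hard_brake

    # One invented trip: speed sampled once per second, in mph.
    print(count_events([0, 10, 22, 30, 30, 20, 8, 0]))   # -> (2, 2)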

According to the report, the trip details had been provided by General Motors -- the manufacturer of the Chevy Bolt. LexisNexis analyzed that driving data to create a risk score "for insurers to use as one factor of many to create more personalized insurance coverage," according to a LexisNexis spokesman, Dean Carney. Eight insurance companies had requested information about Mr. Dahl from LexisNexis over the previous month. "It felt like a betrayal," Mr. Dahl said. "They're taking information that I didn't realize was going to be shared and screwing with our insurance." In recent years, insurance companies have offered incentives to people who install dongles in their cars or download smartphone apps that monitor their driving, including how much they drive, how fast they take corners, how hard they hit the brakes and whether they speed. But "drivers are historically reluctant to participate in these programs," as Ford Motor put it in a patent application (PDF) that describes what is happening instead: Car companies are collecting information directly from internet-connected vehicles for use by the insurance industry.

Sometimes this is happening with a driver's awareness and consent. Car companies have established relationships with insurance companies, so that if drivers want to sign up for what's called usage-based insurance -- where rates are set based on monitoring of their driving habits -- it's easy to collect that data wirelessly from their cars. But in other instances, something much sneakier has happened. Modern cars are internet-enabled, allowing access to services like navigation, roadside assistance and car apps that drivers can connect to their vehicles to locate them or unlock them remotely. In recent years, automakers, including G.M., Honda, Kia and Hyundai, have started offering optional features in their connected-car apps that rate people's driving. Some drivers may not realize that, if they turn on these features, the car companies then give information about how they drive to data brokers like LexisNexis. Automakers and data brokers that have partnered to collect detailed driving data from millions of Americans say they have drivers' permission to do so. But the existence of these partnerships is nearly invisible to drivers, whose consent is obtained in fine print and murky privacy policies that few read. Especially troubling is that some drivers with vehicles made by G.M. say they were tracked even when they did not turn on the feature -- called OnStar Smart Driver -- and that their insurance rates went up as a result.

Data Storage

Study Finds That We Could Lose Science If Publishers Go Bankrupt (arstechnica.com) 66

A recent survey found that academic organizations are failing to preserve digital material -- "including science paid for with taxpayer money," reports Ars Technica, highlighting the need for improved archiving standards and responsibilities in the digital age. From the report: The work was done by Martin Eve, a developer at Crossref. That's the organization that organizes the DOI system, which provides a permanent pointer toward digital documents, including almost every scientific publication. If updates are done properly, a DOI will always resolve to a document, even if that document gets shifted to a new URL. But it also has a way of handling documents disappearing from their expected location, as might happen if a publisher went bankrupt. There are a set of what's called "dark archives" that the public doesn't have access to, but should contain copies of anything that's had a DOI assigned. If anything goes wrong with a DOI, it should trigger the dark archives to open access, and the DOI updated to point to the copy in the dark archive. For that to work, however, copies of everything published have to be in the archives. So Eve decided to check whether that's the case.

Using the Crossref database, Eve got a list of over 7 million DOIs and then checked whether the documents could be found in archives. He included well-known ones, like the Internet Archive at archive.org, as well as some dedicated to academic works, like LOCKSS (Lots of Copies Keeps Stuff Safe) and CLOCKSS (Controlled Lots of Copies Keeps Stuff Safe). The results were... not great. When Eve broke down the results by publisher, less than 1 percent of the 204 publishers had put the majority of their content into multiple archives. (The cutoff was 75 percent of their content in three or more archives.) Fewer than 10 percent had put more than half their content in at least two archives. And a full third seemed to be doing no organized archiving at all. At the individual publication level, under 60 percent were present in at least one archive, and over a quarter didn't appear to be in any of the archives at all. (Another 14 percent were published too recently to have been archived or had incomplete records.)
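
The Crossref REST API that underpins this kind of audit is public. A minimal sketch of looking up a single DOI's metadata is below; the actual study then had to check millions of such records against the archives, which is the hard part.

    # Fetch Crossref metadata for one DOI via the public REST API.
    import requests

    def crossref_metadata(doi):
        resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
        resp.raise_for_status()
        return resp.json()["message"]

    meta = crossref_metadata("10.1038/nature12373")   # any valid DOI works here
    print(meta.get("title"), meta.get("publisher"))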

The good news is that large academic publishers appear to be reasonably good about getting things into archives; most of the unarchived issues stem from smaller publishers. Eve acknowledges that the study has limits, primarily in that there may be additional archives he hasn't checked. There are some prominent dark archives that he didn't have access to, as well as things like Sci-hub, which violates copyright in order to make material from for-profit publishers available to the public. Finally, individual publishers may have their own archiving system in place that could keep publications from disappearing. The risk here is that, ultimately, we may lose access to some academic research.

Microsoft

Microsoft Sends OneDrive URL Upload Feature To the Cloud Graveyard (theregister.com) 13

Microsoft has abruptly pulled a feature from OneDrive that allowed users to upload files to the cloud storage service directly from a URL. From a report: The feature turned up as a preview in 2021 and was intended for scenarios "where the file contents aren't available, or are expensive to transfer," according to Microsoft. It was particularly useful for mobile users, for whom uploading files directly through their apps could be costly. Much better to simply point OneDrive at a given URL and let it handle the upload itself.

However, the experimental feature never made it past the consumer version of OneDrive. It also didn't fit with Microsoft's "vision for OneDrive as a cloud storage service that syncs your files across devices." Indeed, the idea of hosing data into OneDrive from a remote source sits at odds with the file synchronization model being championed by Microsoft and conveniently available from macOS and Windows.

Cloud

Proposed Class Action Alleges Apple's Cloud Storage is an 'Illegal Monopoly' (thehill.com) 169

"Apple faces a proposed class action lawsuit alleging the company holds an illegal monopoly over digital storage for its customers," reports the Hill: The suit, filed Friday, claims "surgical" restraints prevent customers from effectively using any service except its iCloud storage system. iCloud is the only service that can host certain data from the company's phones, tablets and computers, including application data and device settings. Plaintiffs allege the practice has "unlawfully 'tied'" the devices and iCloud together... "As a result of this restraint, would-be cloud competitors are unable to offer Apple's device holders a full-service cloud-storage solution, or even a pale comparison."
The suit argues that there are "no technological or security justifications for this limitation on consumer choice," according to PC Magazine.

The class action's website argues that "Consumers may have paid higher prices than they allegedly would have in a competitive market."
Windows

Microsoft Begins Adding 'Copilot' Icon to Windows 11 Taskbars (techrepublic.com) 81

Microsoft is "delighted to introduce some useful new features" for its "Copilot Preview for Windows 11," according to a recent blog post.

TechRepublic adds that "most features will be enabled by default... rolling out from today until April 2024." Windows 11 users will be able to change system settings through prompts typed directly into Copilot in Windows, currently accessible in the Copilot Preview via an icon on the taskbar, or by pressing Windows + C. Microsoft Copilot will be able to perform the following actions:

- Turn on/off battery saver.
- Show device information.
- Show system information.
- Show battery information.
- Open storage page.
- Launch Live Captions.
- Launch Narrator.
- Launch Screen Magnifier.
- Open Voice Access page.
- Open Text size page.
- Open contrast themes page.
- Launch Voice input.
- Show available Wi-Fi network.
- Display IP Address.
- Show Available Storage.

The new third-party app integrations for Copilot will give Windows 11 users new ways to interact with various applications. For example, making business lunch reservations through OpenTable...

Other new AI features for Windows 11 rolling out today include a new, AI-powered Generative Erase tool, which sounds reminiscent of Google's Magic Eraser tool for Google Photos. Generative Erase allows users to remove unwanted objects or artifacts from their photos in the Photos app.

Likewise, Microsoft's video editing tool Clipchamp is receiving a Silence Removal tool, which functions much as the name implies -- it allows users to remove gaps in conversation or audio from a video clip.

Voice access is another focal point of Microsoft's latest Windows 11 update, detailed in a separate blog post by Windows Commercial Product Marketing Manager Harjit Dhaliwal. Users can now use voice controls to navigate between multiple displays, aided by number and grid overlays that provide easy switching between screens.

A Copilot icon has already started appearing in the taskbar of some Windows systems. If you Google "microsoft installs copilot preview windows," Google adds these helpful suggestions.

People also ask: Why is Copilot preview on my computer?

How do I get rid of Copilot preview on Windows 10?


"Apparently there was some sort of update..." writes one Windows users. "Anyway, there is a logo at the bottom of the screen that is distracting and I'd like to get rid of it."

Lifehacker has already published an article titled "How to Hide (or Disable) Copilot in Windows 11."

"Artificial intelligence is feeling harder and harder to avoid," it begins, "but you still have options."
Programming

Stack Overflow To Charge LLM Developers For Access To Its Coding Content (theregister.com) 32

Stack Overflow has launched an API that will require all AI models trained on its coding question-and-answer content to attribute sources linking back to its posts. And it will cost money to use the site's content. From a report: "All products based on models that consume public Stack Overflow data are required to provide attribution back to the highest relevance posts that influenced the summary given by the model," it confirmed in a statement. The Overflow API is designed to act as a knowledge database to help developers build more accurate and helpful code-generation models. Google announced it was using the service to access relevant information from Stack Overflow via the API and integrate the data with its latest Gemini models, and for its cloud storage console.
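
The new OverflowAPI is a paid offering aimed at AI model builders; for comparison, Stack Overflow's long-standing public Stack Exchange API already returns question data with the canonical links needed for attribution. A minimal sketch using that existing API (not the new one):

    # Uses the existing public Stack Exchange API (api.stackexchange.com),
    # not the new paid OverflowAPI. Each item carries a canonical link
    # suitable for attribution.
    import requests

    resp = requests.get(
        "https://api.stackexchange.com/2.3/questions",
        params={"site": "stackoverflow", "pagesize": 5, "order": "desc", "sort": "activity"},
        timeout=30,
    )
    resp.raise_for_status()
    for q in resp.json()["items"]:
        print(q["title"], "->", q["link"])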
Bitcoin

SBF Asks For 5-Year Prison Sentence, Calls 100-Year Recommendation 'Grotesque' (arstechnica.com) 189

An anonymous reader quotes a report from Ars Technica: Convicted FTX fraudster Sam Bankman-Fried pleaded for a lenient prison sentence in a court filing yesterday, saying that he isn't motivated by greed and "is already being punished." Bankman-Fried requested a sentence of 63 to 78 months, or 5.25 to 6.5 years. Because of "Sam's charitable works and demonstrated commitment to others, a sentence that returns Sam promptly to a productive role in society would be sufficient, but not greater than necessary, to comply with the purposes of sentencing," the court filing (PDF) said. Bankman-Fried's filing also said that he maintains his innocence and intends to appeal his convictions.

A presentence investigation report (PSR) prepared by a probation officer recommended that Bankman-Fried be sentenced to 100 years in prison, according to the filing. "That recommendation is grotesque," SBF's filing said, arguing that it is based on an erroneously calculated loss of $10 billion. The $10 billion loss asserted in the PSR is "illusory" because the "victims are poised to recover -- were always poised to recover -- a hundred cents on the dollar" in bankruptcy proceedings, SBF's filing said. The filing urged the court to "reject the PSR's barbaric proposal" of 100 years, saying that such sentences should only be for "heinous conduct" like terrorism and child sexual abuse.

The founder and ex-CEO of cryptocurrency exchange FTX, Bankman-Fried was convicted on seven charges with a combined maximum sentence of 110 years after a monthlong trial in US District Court for the Southern District of New York. The charges included wire fraud and conspiracy to commit wire fraud, securities fraud, commodities fraud, and money laundering. US government prosecutors are required to make a sentencing recommendation by March 15, and US District Judge Lewis Kaplan is scheduled to issue a sentence on March 28.

China

China Breakthrough Promises Optical Discs That Store Hundreds of Terabytes (theregister.com) 38

Optical discs that can store up to 200 TB of data could be possible with a new technology developed in China. If commercialized, it could revive optical media as an alternative to hard disk or tape for cost-effective long-term storage. The Register: Researchers at the University of Shanghai for Science and Technology (USST) and Shanghai Institute of Optics and Fine Mechanics (SIOM) say they have demonstrated that optical storage is possible up to the petabit level by using hundreds of layers, while also claiming to have broken the optical diffraction barrier limiting how close together recorded features can be.

In an article published in Nature titled "A 3D nanoscale optical disk memory with petabit capacity," the researchers detail how they developed a novel optical storage medium they call dye-doped photoresist (DDPR) with aggregation-induced emission luminogens (AIE-DDPR). When applied as a recording layer, this is claimed to outperform other optical systems and hard drives in terms of areal density -- the amount of storage per unit of area. To be specific, the researchers claim it to be 125 times that of a multi-layer optical disk based on gold nanorods, and 24 times that of the most advanced hard drives (based on data from 2022). The proposed recording and retrieval processes for this medium call for two laser beams each. For optical writing, a 515 nm femtosecond Gaussian laser beam and a doughnut-shaped 639 nm continuous wave laser beam are focused on the recording area.
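
For scale, the paper's "petabit capacity" and the "hundreds of terabytes" framing are the same claim in different units; the layer count below is an illustrative assumption rather than a figure from the paper.

    # 1 petabit = 1e15 bits = 125 terabytes (decimal). The per-layer figure
    # assumes 100 layers purely for illustration.
    petabit_bits = 1e15
    capacity_tb = petabit_bits / 8 / 1e12
    assumed_layers = 100

    print(f"{capacity_tb:.0f} TB per disc")                    # 125 TB
    print(f"{capacity_tb / assumed_layers:.2f} TB per layer")  # 1.25 TB per layer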

Cellphones

OnePlus Watch 2 Launches With Wear OS 4, 100-Hour Battery (9to5google.com) 14

Almost 3 years after launching the first OnePlus Watch, the Chinese smartphone company is launching a successor -- this time powered by Wear OS 4. Utilizing a "hybrid interface," the OnePlus Watch 2 is able to offer 100 hours of battery life, or just over four full days of use. 9to5Google reports: To achieve that goal, the OnePlus Watch 2 actually runs two separate operating systems. Wear OS handles things like apps and watchfaces, while an RTOS powered by a secondary chipset handles more lightweight tasks. A "smart mode" on the watch allows it to swap back and forth between its two operating systems and two chipsets. Wear OS 4 is powered by the Snapdragon W5 Gen 1, while the RTOS is powered by a BES 2700 MCU Efficiency chipset.

Switching between the two OS's is something you're likely to not even notice, OnePlus claims: "The BES2700 Efficiency Chipset runs RTOS and handles background activity and simple tasks, while the Snapdragon W5 handles more demanding tasks, like running your favorite Google apps. This optimized approach, enabled by the Wear OS hybrid interface seamlessly managing the transition between chips, means users will experience a smartwatch that effortlessly does it all while extending the time between charges."

Powering the Watch 2 is a 500 mAh battery which features 7.5W charging with a special charger that connects to a typical USB-C cable. The charger is magnetic, of course, and OnePlus claims a full charge in 60 minutes or less. The 1.43-inch AMOLED display of the OnePlus Watch 2 is covered in a slightly curved sapphire glass, while the watch chassis is built from stainless steel. You'll have the choice of black or silver colors with either black or green bands, respectively. The whole package is also 5ATM water resistant. Rounding out the main specs you'll find 32GB of storage and 2GB of RAM.
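
A rough sanity check on those battery numbers (the nominal cell voltage is an assumption; OnePlus doesn't state it): 100 hours from a 500 mAh cell implies a very low average power draw.

    # Back-of-the-envelope: what 100 hours from 500 mAh implies. Nominal
    # Li-ion voltage of 3.85 V is an assumed value.
    capacity_mah = 500
    nominal_v = 3.85
    runtime_h = 100

    energy_wh = capacity_mah / 1000 * nominal_v        # ~1.9 Wh
    avg_power_mw = energy_wh / runtime_h * 1000        # ~19 mW average draw
    print(f"{energy_wh:.2f} Wh total, ~{avg_power_mw:.0f} mW average draw")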
The OnePlus Watch 2 goes on sale today at $299.
