Microsoft

Security Threat Analyst Accuses Microsoft of Hosting Malware on Office365's OneDrive (itwire.com) 48

Slashdot reader juul_advocate quotes ITWire: A British tech researcher, Kevin Beaumont, who quit working as a security threat analyst with Microsoft a few months back, has called on his former employer to act speedily to remove links to ransomware on its Office365 platform. In a tweet sent on Friday, Beaumont said: "Microsoft cannot advertise themselves as the security leader with 8,000 security employees and trillions of signals if they cannot prevent their own Office365 platform being directly used to launch Conti ransomware. OneDrive abuse has been going on for years. Fix it...."

An overwhelming majority of ransomware targets only Windows, with an analysis by staff of the Google-owned VirusTotal database last Thursday showing that 95% of 80 million samples analysed — going all the way back to January 2020 — were aimed at Windows... Beaumont, who has a well-earned reputation as a researcher who is quick to admit faults in his own industry, acknowledged that other technology companies also played a big role in hosting malware. Quoting a tweet from a Swiss researcher [given below], he said: "And yes, it's not just Microsoft. Tech companies have got to do better."

Businesses

All-Remote GitLab Valued at $15B in NASDAQ's First-Ever Livestreamed IPO Day (forbes.com) 18

"Long before the pandemic, software business GitLab operated fully remotely, building its developer tools without any physical office..." remembers Forbes.

"The company went public on Thursday on Nasdaq under the ticker 'GTLB.' Priced at $77, shares of GitLab closed their first day of trading at $103.89, up 35%, giving GitLab a market cap of nearly $15 billion." In an interview, CEO Sid Sijbrandij (pronounced "see brandy") said that going public would help GitLab to remain a well-resourced, long-standing steward of the open-source project on top of which its business software is built. "This had to happen sometime," Sijbrandij says. "We knew we were ready, the markets were ready, so why not take the step today?" At its closing price on Thursday, GitLab's IPO has made Sijbrandij a new tech billionaire, with an equity stake valued at $2.3 billion; he also sold about $150 million worth of GitLab shares as part of the company's offering.

With revenue of $58 million in its previous quarter, up 69% year over year, on losses of $40 million, GitLab fits the mold of a classic high-growth, unprofitable business-to-business software provider — cloud players that have in recent years proven popular, and able to command high multiples, with Wall Street. While its losses have narrowed recently, GitLab still generates about $1.50 in new business for each $1 spent by customers previously on its tools, putting it in elite company in the category...

The company was the first-ever on Nasdaq to livestream its entire IPO day, with about 18,000 people stopping by over the course of the broadcast, it says.... Sijbrandij also became known as one of remote work's leading evangelists, advocating a no-hybrid, radically transparent office culture he says is fairer and more productive. That purist view remains a hiring advantage, he tells Forbes now; it also helps explain the livestream, part marketing opportunity and part way to include GitLab employees and advocates across the globe. "It's always behind closed doors," Sijbrandij says of the IPO festivities and listing process. Originally from the Netherlands, Sijbrandij added his parents were two of the viewers. "It was awesome to share it with the world."

China

Hundreds of Banned Crypto Miners Were Siphoning Power at China's State Firms (bloomberg.com) 41

China's drive to root out cryptocurrencies has uncovered hundreds of miners who were using electricity at public institutions, a development that comes as the nation struggles with a power crunch. From a report: Zhejiang and Jiangsu provinces recently started targeting miners who were consuming the resources of state-owned enterprises, government agencies, and universities and research institutes, according to a government statement and media reports that did not name the entities. Jiangsu found about one-fifth of some 4,500 internet protocol addresses associated with illegal mining activity belonged to public institutions, according to the media outlet The Paper, which cited provincial communications authorities. Some 260,000 kilowatt hours of electricity were being used per day, the newspaper added. Cryptominers typically link their equipment to cloud services called mining pools to verify transactions on blockchains, allowing their physical locations to be traced. That would lead investigators to accounts with electric companies.

The Zhejiang government published a statement on an official social account that included photos of equipment seized in raids, adding that 184 IP addresses were suspected of involvement in illegal mining exploiting public resources. "The rapid upgrading of mining hardware and fierce competition in computing power have resulted in massive energy usage, which is contrary to the carbon peak and carbon neutralization goals of the whole province as a major energy importer," the statement said.

Cloud

Cisco Wants To Climb Back the Way Microsoft Did (bloomberg.com) 61

The networking giant says it has turned a corner in its attempt to adapt to the cloud era. From a report: Cisco is hardly a failure. It produces billions of dollars in annual profits and is generally regarded as stable and well-run. But investors feared that its steady operations could lead to a slow-motion descent into obsolescence in an industry that can be brutal to anyone who falls a half-step behind. The best example of a tech giant stumbling then regaining its dominance is probably Microsoft, and analysts regularly hold it up as a role model for Cisco. Microsoft's decline, which began about the same time as Cisco's, was largely the result of a progression of disappointing products. That began to change in 2014, when new Chief Executive Officer Satya Nadella started selling tons of copies of popular software such as Excel and Word as subscription services rather than one-time purchase products and built a formidable cloud computing division. Microsoft is now the only U.S. company other than Apple with a market value of more than $2 trillion.

Chuck Robbins has held his job as Cisco's CEO just one year less than Nadella. In recent months, he's begun to insist that his company has finally reached its inflection point. Cisco acknowledged years ago that it had failed to capitalize on the chance to build the initial infrastructure for cloud computing, says Robbins, and responded with a significant, if slow-developing, overhaul of its strategy. "We were going to build technology for the next transition," he says. "We did that. Now we're seeing the benefit." Cisco's initial problem was partially a lack of flexibility. When Amazon, Google, and Microsoft began building cloud computing data centers, they wanted components, software, and machines that were tailored to their needs. Cisco insisted on selling the same expensive, uncustomizable equipment that was always the core of its business. The burgeoning cloud companies were only too happy to take their business elsewhere. Robbins can point to significant changes during his six-year tenure. Cisco has made a string of acquisitions that have turned it into one of the top 10 software companies in the world by revenue. Software and services have surpassed hardware and now make up more than half of Cisco's revenue. Its expected future revenue from outstanding contracts for these products totals $30 billion.

Cloud

Google Cloud Will Now Show Its Users Their Carbon Footprint In the Cloud (techcrunch.com) 41

Google Cloud today announced a new (and free) feature that will provide its users with custom carbon footprint reports that detail the carbon emissions their cloud usage generates. TechCrunch reports: "Customers can leverage this data for reporting as well as internal audits and carbon reduction efforts. Built in collaboration with customers like HSBC, L'Oreal and Atos, our carbon footprint reporting introduces a new level of transparency to support customers in meeting their climate goals," said Jenn Bennett, who leads Google Cloud's data and technology strategy for sustainability in the Office of the CTO. "Customers can monitor their cloud emissions over time by project, by product and by region, empowering IT teams and developers with metrics that help them reduce their carbon footprint. Digital infrastructure emissions are really just one part of their environmental footprint, but accounting for carbon emissions is necessary to measure progress against the carbon reduction targets that they all have."

As Bennett noted, once a company has accurate reporting in place, providing recommendations for how to reduce their climate impact is a natural next step. Specifically, this means adding carbon estimates to Google Cloud's Unattended Project Recommender, which helps customers reduce their number of idling resources, and adding a sustainability impact category to its Active Assist Recommender.

AI

Microsoft's Project Turing is Building AI to Rival Google and OpenAI (businessinsider.com) 11

An anonymous reader shares a report: Since 2017, Microsoft has pursued this goal under the name Project Turing, a team that's tasked with building these large language models and figuring out how they can be used in the company's vast suite of products. While Project Turing might not be a visible name outside the company, its AI can already be found generating text inside Microsoft Office products and powering much of the curated information provided when searching with Bing. If Turing succeeds, the strategy could amplify the research dollars that Microsoft has poured into AI research over previous decades. Notably, Microsoft isn't only using Turing-NLG, the project's flagship model, internally: It's already begun selling the tech to select partners, hinting at the cloud giant's ambitions for the AI market. Insider spoke with AvePoint and Volume.ai, both of whom are using Turing in their own products.

"Our job is to further the frontier of AI innovation as much as possible," Ali Alvi, group program manager of Project Turing, told Insider. Alvi tells Insider that the Turing team was assembled from within the company by Microsoft chief technology officer Kevin Scott, in recognition of the ongoing deep learning boom. Scott encouraged the team to think bigger and work with the Azure infrastructure team to make the models exponentially larger. When CEO Satya Nadella saw the team's progress, he decided to get it into the hands of customers, Alvi says. AvePoint, a Microsoft partner that resells and builds applications on top of Microsoft products, has launched two products so far using the Turing model: An education platform for teachers that will automatically create quiz questions using material that's been uploaded for a specific course, and a corporate training platform that uses Turing to test employees on internal material.

Microsoft

Microsoft Says It Mitigated a 2.4 Tbps DDoS Attack, the Largest Ever (therecord.media) 39

Microsoft said its Azure cloud service mitigated a 2.4 terabits per second (Tbps) distributed denial of service attack at the end of August this year, representing the largest DDoS attack recorded to date. From a report: Amir Dahan, Senior Program Manager for Azure Networking, said the attack was carried out using a botnet of approximately 70,000 bots located primarily across the Asia-Pacific region -- in countries such as Malaysia, Vietnam, Taiwan, Japan, and China -- as well as the United States. Dahan identified the target of the attack only as "an Azure customer in Europe."

The Microsoft exec said the record-breaking DDoS attack came in three short waves, in the span of ten minutes, with the first at 2.4 Tbps, the second at 0.55 Tbps, and the third at 1.7 Tbps. Dahan said Microsoft successfully mitigated the attack without Azure going down. Prior to Microsoft's disclosure today, the previous DDoS record was held by a 2.3 Tbps attack that Amazon's AWS division mitigated in February 2020.

Google

Google Unveils Cybersecurity Programs and Action Team (venturebeat.com) 21

An anonymous reader shares a report: By the end of 2021, cybercrime is expected to cost the world $6 trillion. And by 2025, this figure will climb to $10.5 trillion, according to Cybersecurity Ventures. There's been a rash of recent high-profile cyberattacks, including Colonial Pipeline, the SolarWinds breach, and JBS USA. That's perhaps why 80% of senior IT employees believe that their companies lack sufficient protection against cyberattacks, despite increased security investments made in 2020.

To address the challenges, Google today at Google Cloud Next 2021 debuted Work Safer, a program to help organizations, employees, and partners collaborate in hybrid work environments. It also unveiled a new security-focused task force -- the Cybersecurity Action Team -- and a security and resilience framework, in addition to enhanced security capabilities in Workspace. The announcements come after research showing that companies want cloud providers to increase their security efforts. According to a recent Tripwire survey, while the majority of enterprises believe that public cloud providers are doing enough to ensure security for users, that security is "just barely adequate."

Science

Can Nuclear Fusion Put the Brakes on Climate Change? (newyorker.com) 292

Amid an escalating crisis, the power source offers a dream -- or a pipe dream -- of limitless clean energy. From a report: Let's say that you've devoted your entire adult life to developing a carbon-free way to power a household for a year on the fuel of a single glass of water, and that you've had moments, even years, when you were pretty sure you would succeed. Let's say also that you're not crazy. This is a reasonable description of many of the physicists working in the field of nuclear fusion. In order to reach this goal, they had to find a way to heat matter to temperatures hotter than the center of the sun, so hot that atoms essentially melt into a cloud of charged particles known as plasma; they did that. They had to conceive of and build containers that could hold those plasmas; they did that, too, by making "bottles" out of strong magnetic fields. When those magnetic bottles leaked -- because, as one scientist explained, trying to contain plasma in a magnetic bottle is like trying to wrap a jelly in twine -- they had to devise further ingenious solutions, and, again and again, they did. Over decades, in the pursuit of nuclear fusion, scientists and engineers built giant metal doughnuts and Gehryesque twisted coils, they "pinched" plasmas with lasers, and they constructed fusion devices in garages. For thirty-six years, they have been planning and building an experimental fusion device in Provence. And yet commercially viable nuclear-fusion energy has always remained just a bit farther on.

As the White Queen, in "Through the Looking Glass," said to Alice, it is never jam today, it is always jam tomorrow. The accelerating climate crisis makes fusion's elusiveness more than cutely maddening. Solar energy gets more efficient and affordable each year, but it's not continuously available, and it still relies on gas power plants for distribution. The same is true for wind power. Conventional nuclear power has extremely well-known disadvantages. Carbon capture, which is like a toothbrush for the sky, is compelling, but after you capture a teraton or two of carbon there's nowhere to put it. All these tools figure extensively in decarbonization plans laid out by groups like the Intergovernmental Panel on Climate Change, but, according to those plans, even when combined with one another the tools are insufficient. Fusion remains the great clean-energy dream -- or, depending on whom you ask, pipe dream. Fusion, theoretically, has no scarcity issues; our planet has enough of fusion's primary fuels, heavy hydrogen and lithium, which are found in seawater, to last thirty million years.

Fusion requires no major advances in batteries, it would be available on demand, it wouldn't cause the next Fukushima, and it wouldn't be too pricey -- if only we could figure out all the "details." (A joke I heard is that fusion operates according to the law of the "conservation of difficulty": when one problem is solved, a new one of equal difficulty emerges to take its place.) The details are tremendously complex, and the people who work to figure them out have for years been dealing with their own scarcities -- scarcities of funding and scarcities of faith. Fusion, as of now, has no place in the Green New Deal. In 1976, the U.S. Energy Research and Development Administration published a study predicting how quickly nuclear fusion could become a reality, depending on how much money was invested in the field. For around nine billion a year in today's dollars -- described as the "Maximum Effective Effort" -- it projected reaching fusion energy by 1990. The scale descended to about a billion dollars a year, which the study projected would lead to "Fusion Never." "And that's about what's been spent," the British physicist Steven Cowley told me. "Pretty close to the maximum amount you could spend in order to never get there."

Cloud

Is It Time to Stop Paying For a VPN? (bdnews24.com) 113

"I'm done with paying for a virtual private network," writes the New York Times' lead consumer technology writer. [Alternate URLs here and here.] The reality is that web security has improved so much in the last few years that VPN services, which charge monthly subscription fees that cost as much as Netflix, offer superfluous protection for most people concerned about privacy, some security researchers said.

Many of the most popular VPN services are now also less trustworthy than in the past because they have been bought by larger companies with shady track records. That's a deal-breaker when it comes to using a VPN service, which intercepts our internet traffic. If you can't trust a product that claims to protect your privacy, what good is it? "Trusting these people is really critical," Matthew Green, a computer scientist who studies encryption, said about VPN providers. "There's no good way to know what they're doing with your data, which they have huge amounts of control over...."

As a mainstream privacy tool, it's no longer an ideal solution. This sent me down a rabbit hole of seeking alternatives to paying for a VPN. I ended up using some web tools to create my own private network [on the cloud] for free, which wasn't easy... Not only is it free to use, but I no longer have to worry about trust because the operator of the technology is me.

"But I also learned that many casual users may not even need a VPN anymore," the article concludes. (Unless you're living in an authoritarian country and trying to reach information beyond its firewall.) One cybersecurity firm tells the Times that journalists with sensitive contacts or business executives carrying trade secrets might also still benefit from a VPN. But (according to the firm) the rest of us can just try two-factor authentication and keeping all of our software up-to-date. (And if you'd rather not use a public wifi network — use your phone as a mobile hot spot.)

The article also notes that 95% of the top 1,000 websites are now already encrypted with HTTPS, according to W3Techs.

It also points out that one VPN company accused of developing malware nonetheless spent close to a billion dollars to buy at least four other VPN services — and then also bought several VPN review sites, which then give top ratings to VPN services it owns...

XBox (Games)

Xbox's Cloud Streaming Upgrade Means You Might Not Need A Series X (kotaku.com) 16

Speaking to The Verge, representatives for Microsoft confirmed that Xbox cloud gaming now runs entirely on Xbox Series X hardware, wrapping up a backend upgrade that quietly kicked off over the summer. Kotaku reports: So, what's that mean? For one, the Xbox Series X can output games in 4K resolution. Though cloud gaming currently streams games at 60 frames per second, the service only outputs games at a resolution of 1080p. Moving the whole operation to an Xbox Series X framework could theoretically bring that in line with what's available on Microsoft's highest-end model. But the switch could also open up access to games that are otherwise inaccessible to gamers gated out of next-gen hardware. Obviously, it's still tough as hell to get your hands on an Xbox Series X these days (and, to a lesser extent, its computationally scrawnier cousin, the Xbox Series S). Don't expect it to get any easier any time soon. Just last week, Xbox boss Phil Spencer said as much.

Apple

Epic Games CEO Tim Sweeney Calls Out Apple for Promoting Its Services in the iPhone Settings Screen (techcrunch.com) 59

Epic Games CEO Tim Sweeney, whose high-profile antitrust lawsuit against Apple is now under appeal, is today calling out the iPhone maker for giving itself access to an advertising slot its competitors don't have: the iPhone's Settings screen. From a report: Some iOS 15 users noticed Apple is now advertising its own services at the top of their Settings, just below their Apple ID. The services being suggested are personalized to the device owner, based on which ones they already subscribe to, it appears. For example, those without an Apple Music subscription may see an ad offering a free six-month trial. However, current Apple Music subscribers may instead see a prompt to add on a service they don't yet have, like AppleCare coverage for their devices.

Sweeney suggests this sort of first-party advertising is an anticompetitive risk for Apple, as some of the services it's pushing here are those that directly compete with third-party apps published on its App Store. But those third-party apps can't gain access to the iPhone's Settings screen, of course -- they can only bid for ad slots within the App Store itself. Writes Sweeney: "New from the guys who banned Fortnite: settings-screen ads for their own music service, which come before the actual settings, and which aren't available to other advertisers like Spotify or Sound Cloud."

Hardware

D-Wave Announces New Hardware, Compiler, and Plans For Quantum Computing (arstechnica.com) 23

On Tuesday, D-Wave released its roadmap for upcoming processors and software for its quantum annealers. The company is also announcing that it's going to be developing its own gate-based hardware, which it will offer in parallel with the quantum annealer. Ars Technica's John Timmer talked with company CEO Alan Baratz to understand all the announcements. An anonymous reader shares an excerpt from the report: The simplest part of the announcement to understand is what's happening with D-Wave's quantum-annealing processor. The current processor, called Advantage, has 5,000 qubits and 40,000 connections among them. These connections play a major role in the chip's performance as, if a direct connection between two qubits can't be established, others have to be used to act as a bridge, resulting in a lower effective qubit count. Starting this week, users of D-Wave's cloud service will have access to an updated version of Advantage. The qubit and connection stats will remain the same, but the device will be less influenced by noise in the system (in technical terms, its qubits will maintain their coherence longer). [...] Further out in the future is the follow-on system, Advantage 2, which is expected late next year or the year after. This will see another boost to the qubit count, going up to somewhere above 7,000. But the connectivity would go up considerably as well, with D-Wave targeting 20 connections per qubit.

D-Wave provides a set of developer tools it calls Ocean. In previous iterations, Ocean has allowed people to step back from directly controlling the hardware; instead, if a problem could be expressed as a quadratic unconstrained binary optimization (QUBO), Ocean could produce the commands needed to handle all the hardware configuration and run the problem on the optimizer. D-Wave referred to this as a hybrid problem solver, since Ocean would use classical computing to optimize the QUBO prior to execution. The only problem is that not everyone who might be interested in trying D-Wave hardware knows how to express their problem as a QUBO. So, the new version of Ocean will allow an additional layer of abstraction by allowing problems to be sent to the system in the format typically used by people who tend to solve these sorts of problems. "You will now be able to specify problems in the language that data scientists and data analysts understand," Baratz promised.
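A QUBO is just a cost function over binary variables -- minimize x^T Q x -- so the formulation step Ocean automates can be illustrated with a tiny brute-force solver. This is a hypothetical sketch in plain Python, not D-Wave's actual Ocean API; the variable names and the toy problem are invented for illustration:

```python
from itertools import product

def solve_qubo(Q):
    """Brute-force minimizer for a QUBO: find the binary vector x
    minimizing sum over (i, j) of Q[(i, j)] * x[i] * x[j].
    Q maps index pairs to coefficients; diagonal entries (i, i)
    act as linear biases, since x*x == x for binary x."""
    n = 1 + max(k for pair in Q for k in pair)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(c * x[i] * x[j] for (i, j), c in Q.items())
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Toy example: reward choosing each variable (-1 on the diagonal),
# but penalize choosing both (+2 off-diagonal), so the minimum-energy
# solution picks exactly one of the two.
x, e = solve_qubo({(0, 0): -1, (1, 1): -1, (0, 1): 2})
```

On real hardware, the exhaustive search over 2^n assignments is replaced by the annealer; the point of the sketch is only the shape of the input -- a dictionary of pairwise coefficients -- which is what Ocean's new abstraction layer generates on the user's behalf.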

The biggest part of today's announcement, however, may be that D-Wave intends to also build gate-based hardware. Baratz explained that he thinks that optimization is likely to remain a valid approach, pointing to a draft publication that shows that structuring some optimization problems for gate-based hardware may be so computationally expensive that it would offset any gains the quantum hardware could provide. But it's also clear that gate-based hardware can solve an array of problems that a quantum annealer can't. He also argued that D-Wave has solved a number of problems that are currently limiting advances in gate-based hardware that uses electronic qubits called transmons. These include the amount and size of the hardware that's needed to send control signals to the qubits and the ability to pack qubits in densely enough so that they're easy to connect but not close enough that they start to interfere with each other. One of the problems D-Wave faces, however, is that the qubits it uses for its annealer aren't useful for gate-based systems. While they're based on the same bit of hardware (the Josephson junction), the annealer's qubits can only be set as up or down. A gate-based qubit needs to allow manipulations in three dimensions. So, the company is going to try building flux qubits, which also rely on Josephson junctions but use them in a different way. So, at least some of the company's engineering expertise should still apply.

Oracle

Oracle Appeal Over JEDI Contract Turned Away by Supreme Court (bloomberg.com) 14

The U.S. Supreme Court turned away a lingering appeal by Oracle stemming from its challenge to the now-scrapped $10 billion cloud-computing contract the Pentagon awarded to Microsoft in 2019. From a report: The rejection was a formality given the Defense Department's decision in July to drop the contract and divide the work among multiple bidders, potentially between Microsoft and Amazon. Oracle's appeal centered on alleged conflicts of interest involving Amazon, and on claims that the Pentagon violated its own rules when it set up the contract to be awarded to a single firm.

Security

Millions Experience Browser Problems After Long-Anticipated Expiration of 'Let's Encrypt' Certificate (zdnet.com) 94

"The expiration of a key digital encryption service on Thursday sent major tech companies nationwide scrambling to deal with internet outages that affected millions of online users," reports the Washington Examiner.

The expiring certificate was issued by Let's Encrypt — though ZDNet notes there's been lots of warnings about its pending expiration: Digital Shadows senior cyber threat analyst Sean Nikkel told ZDNet that Let's Encrypt put everyone on notice back in May about the expiration of the Root CA Thursday and offered alternatives and workarounds to ensure that devices would not be affected during the changeover. They have also kept a running forum thread open on this issue with fairly quick responses, Nikkel added.

Thursday night, the Washington Examiner described what happened when the big day arrived: Tech giants — such as Amazon, Google, Microsoft, and Cisco, as well as many smaller tech companies — were still battling with an endless array of issues by the end of the night... At least 2 million people have seen an error message on their phones, computers, or smart gadgets in the past 24 hours detailing some internet connectivity problems due to the certificate issue, according to Scott Helme, an internet security researcher and well-known cybersecurity expert. "So many people have been affected, even if it's only the inconvenience of not being able to visit certain websites or some of their apps not working," Helme said.

"This issue has been going on for many hours, and some companies are only just getting around to fixing it, even big companies with a lot of resources. It's clearly not going smoothly," he added.

There was an expectation before the certificate expired, Helme said, that the problem would be limited to gadgets and devices bought before 2017 that use the Let's Encrypt digital certificate and haven't updated their software. However, many users faced issues on Thursday despite having the most cutting-edge devices and software on hand. Dozens of major tech products and services have been significantly affected by the certificate expiration, such as cloud computing services for Amazon, Google, and Microsoft; IT and cloud security services for Cisco; sellers unable to log in on Shopify; games on Rocket League; and workflows on Monday.com.

Security researcher Scott Helme also told ZDNet he'd confirmed issues at many other companies, including Guardian Firewall, Auth0, QuickBooks, and Heroku — but there might be many more beyond that: "For the affected companies, it's not like everything is down, but they're certainly having service issues and have incidents open with staff working to resolve. In many ways, I've been talking about this for over a year since it last happened, but it's a difficult problem to identify. It's like looking for something that could cause a fire: it's really obvious when you can see the smoke...!"

Digital certificates expert Tim Callan added that the popularity of DevOps-friendly architectures like containerization, virtualization and cloud has greatly increased the number of certificates the enterprise needs while radically decreasing their average lifespan. "That means many more expiration events, much more administration time required, and greatly increased risk of a failed renewal," he said.
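Catching an expiring certificate before it bites is straightforward in principle: compare the certificate's notAfter timestamp against a warning window. A minimal sketch using only Python's standard library (the helper function is hypothetical; `ssl.cert_time_to_seconds` is the real stdlib parser for the notAfter strings returned by `getpeercert()`):

```python
import ssl
import time

def days_until_expiry(not_after, now=None):
    """Days remaining before a certificate expires.
    `not_after` is the notAfter string from getpeercert(),
    e.g. 'Sep 30 14:01:15 2021 GMT'. A negative result means
    the certificate has already expired."""
    expires = ssl.cert_time_to_seconds(not_after)  # epoch seconds, UTC
    now = time.time() if now is None else now
    return (expires - now) / 86400.0

# The DST Root CA X3 certificate at the center of Thursday's outages
# expired at this timestamp (Sep 30 2021, 14:01:15 UTC). Checked from
# the start of September, roughly 29.6 days of runway remained.
remaining = days_until_expiry(
    "Sep 30 14:01:15 2021 GMT",
    now=ssl.cert_time_to_seconds("Sep 01 00:00:00 2021 GMT"))
```

In practice, monitoring like this is exactly the "administration time" Callan describes: as certificate lifespans shrink and counts grow, the check has to run continuously against every certificate in the fleet, not once against one.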

Cloud

Alliance Including Amazon, Google, Microsoft, and IBM Vows to Protect Rights and Privacy With 'Trusted Cloud Principles' (zdnet.com) 33

ZDNet reports: Some of the world's largest tech giants — Amazon, Google, Microsoft, IBM, Salesforce/Slack, Atlassian, SAP, and Cisco — have joined forces to establish the Trusted Cloud Principles in what they are claiming is their commitment to protecting the rights of their customers... Some of the specific principles set out by the signatories include: governments should seek data directly from enterprise customers first, rather than from cloud providers, other than in "exceptional circumstances"; customers should have a right to notice when governments seek to access customer data directly from cloud service providers; and there should be a clear process for cloud providers to challenge government access requests for customers' data, including notifying relevant data protection authorities, to protect customers' interests.

Also outlined in the principles is the point that governments should create mechanisms to raise and resolve conflicts with each other such that cloud service providers' legal compliance in one country does not amount to a violation of law in another; and governments should support cross-border data flows. At the same time, the cloud service providers acknowledge that under the principles they recognise international human rights law enshrines a right to privacy, and the importance of customer trust and customers' control and security of their data. The signatories also said they commit to supporting laws that allow governments to request data through a transparent process that abides by human rights standards; international legal frameworks to resolve conflicting laws related to data access, privacy, and sovereignty; and improved rules and regulations at the national and international levels that protect the safety, privacy, and security of cloud customers and their ownership of data...

The Trusted Cloud Principles come days after a separate data cloud framework was established by Amazon Web Services, Google, IBM, Microsoft and other major tech giants, together with the EDM Council, a cross-industry trade association for data management and analytics. The Cloud Data Management Capabilities (CDMC) framework comprises six components, 14 capabilities, and 37 sub-capabilities that set out cloud data management capabilities, standards, and best practices for cloud, multi-cloud, and hybrid-cloud implementations, while also incorporating automated key controls for protecting sensitive data.

Businesses

Zoom and Five9 Abandon $14.7 Billion Acquisition (cnbc.com) 5

Cloud contact center software company Five9 and video calling software maker Zoom said Thursday they will not go forward with Zoom's plan to acquire Five9 for $14.7 billion. From a report: Five9 shares fell 2% in extended trading following the statement from the companies, which said the acquisition didn't receive enough votes from Five9 shareholders. A branch of the U.S. Department of Justice had been reviewing the deal over potential foreign participation, according to an Aug. 27 letter sent to the Federal Communications Commission. But Zoom said last week, when news of the review was reported, that it still expected the deal to close in the first half of 2022.

Data Storage

Cloudflare To Enter Infrastructure Services Market With New R2 Storage Product (techcrunch.com) 19

Cloudflare, which has a network of data centers in 250 locations around the world, announced its first dalliance with infrastructure services today, an upcoming cloud storage offering called R2. From a report: Company co-founder and CEO Matthew Prince says that the idea for moving into storage as a service came from the same place as other ideas the company has turned into products: it was something they needed in-house, so they built it for themselves before offering it to customers too. "When we build products, the reason that we end up building them is usually because we need them ourselves," Prince told me. He said that the storage component grew out of the need to store object components like images on the company's network. Once they built it and looked around at the cloud storage landscape, they decided it would make sense to offer it as a product to customers too. [...] The R2 name is a little swipe at Amazon's S3 storage product, and obviously a play on the name. The difference, according to Prince, is that they have found a way to reduce storage costs by up to 10% by eliminating egress fees. Cloudflare plans to price storage at $0.015 per GB of data stored per month. That compares with S3 pricing that starts at $0.023 per GB for the first 50 TB per month.

Ben Thompson, writing at Stratechery: The reason that Cloudflare can pull this off is the same reason why S3's margins are so extraordinary: bandwidth is a fixed cost, not a marginal one. To take the most simplified example possible, if I were to have two computers connected by a cable, the cost of bandwidth is however much I paid for the cable; once connected I can transmit as much data as I would like for free -- in either direction.

That's not quite right, of course: I am constrained by the capacity of the cable; to support more data transfer I would have to install a higher capacity cable, or more of them. What, though, if I already had built a worldwide network of cables for my initial core business of protecting websites from distributed denial-of-service attacks and offering a content delivery network, the value of which was such that ISPs everywhere gave me space in their facilities to place my servers? Well, then I would have massive amounts of bandwidth already in place, the use of which has zero marginal costs, and oh-by-the-way locations close to end users to stick a whole bunch of hard drives.

In other words, I would be Cloudflare: I would charge marginal rates for my actual marginal costs (storage, and some as-yet-undetermined-but-promised-to-be-lower-than-S3 rate for operations), and give away my zero marginal cost product for free. S3's margin is R2's opportunity.
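The economics Thompson describes can be sketched with back-of-the-envelope arithmetic. The storage rates below are the ones quoted above; the S3 egress rate of $0.09/GB is an assumption for illustration, not a figure from the article:

```python
# Quoted storage prices ($/GB-month) and an assumed S3 egress rate ($/GB).
S3_STORAGE = 0.023   # S3, first 50 TB tier
R2_STORAGE = 0.015   # R2
S3_EGRESS = 0.09     # assumed S3 data-transfer-out rate
R2_EGRESS = 0.0      # R2's pitch: no egress fees

def monthly_cost(gb_stored, gb_egress, storage_rate, egress_rate):
    """Simple monthly bill: storage held plus data transferred out."""
    return gb_stored * storage_rate + gb_egress * egress_rate

# 1 TB stored and read out in full twice a month:
s3_bill = monthly_cost(1000, 2000, S3_STORAGE, S3_EGRESS)  # storage + egress
r2_bill = monthly_cost(1000, 2000, R2_STORAGE, R2_EGRESS)  # storage only
```

Under these assumptions the S3 bill is dominated by egress, not storage, which is exactly the margin Thompson argues R2 is targeting: the heavier the read traffic, the wider the gap.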

IOS

iOS 15 Messages Bug Causes Saved Photos to Be Deleted (macrumors.com) 37

A serious bug in the iOS 15 Messages app can cause some saved photos to be deleted, according to multiple complaints reported by MacRumors readers and Twitter users. From the report: If you save a photo from a Messages thread and then go on to delete that thread, the next time an iCloud Backup is performed, the photo will disappear. Even though the image is saved to your personal iCloud Photo Library, it appears to still be linked to the Messages app in iOS 15, and saving it does not persist through the deletion of the thread and an iCloud backup. This is a concern because most users keep the iCloud Backup feature enabled and it happens automatically. If you regularly delete message threads, a photo you saved from one of them won't be kept with iCloud Backup turned on.

To replicate this bug, the following steps must be taken:
1. Save a photo from a Messages conversation to your Camera Roll.
2. Check to see that the photo has been saved.
3. Delete the Messages conversation the photo came from. The photo will still be in your iCloud Photo Library at this point.
4. Perform an iCloud Backup, and the photo disappears.

Wireless Networking

Ring Puts An Eero Router Inside Its New Home Alarm System (engadget.com) 28

Eero and Ring -- two Amazon-owned companies -- have teamed up to produce a home security system with an Eero router built in. Engadget reports: Ring COO Mike Harris said that the decision to work with Eero was not one handed down from on high by Amazon. Instead, Harris said, both companies saw the opportunity to work together and leverage their individual skills in tandem. To take advantage of the technology, you'll need to sign up to Ring's new subscription product, dubbed Protect Pro. The package offers cloud video storage, professional monitoring, Alexa Guard Plus, 24/7 backup internet for your security devices (via an LTE module in the Ring Pro base station) and Eero's cybersecurity subscription product for network protection. This, at least in the US as it launches, will set you back $20 a month, or $200 per year per location up front.

At the same time, Ring is launching a system dubbed "Virtual Security Guard," which connects users to third-party security guards. You'll need to pay for that separately, but you can hand over access to select Ring camera feeds to those companies, which can keep watch over your property. Only when motion is detected can an operator access your feed and speak to whoever is there to determine their intentions. Ring adds that third parties can't view motion events when the camera is disarmed, and can't download, share or save the clips of what's going on in your front yard. The first company to sign up for the program is Rapid Response, with others expected to join in the near future.
The Virtual Security Guard service will require you to apply for early access, but the Ring Alarm Pro can be pre-ordered today for $250. (Since this isn't a Slashvertisement, we won't include a link to the product; you'll have to search for it yourself. Sorry not sorry.)

Slashdot Top Deals