Hardware

US Focuses on Invigorating 'Chiplet' Production in the US (nytimes.com) 19

More than a decade ago engineers at AMD "began toying with a radical idea," remembers the New York Times. Instead of designing one big microprocessor, they "conceived of creating one from smaller chips that would be packaged tightly together to work like one electronic brain."

But with "diminishing returns" from Moore's Law, packaging smaller chips has suddenly become more important. As much as 80% of microprocessors will be using these designs by 2027, according to an estimate from the market research firm Yole Group cited by the Times: The concept, sometimes called chiplets, caught on in a big way, with AMD, Apple, Amazon, Tesla, IBM and Intel introducing such products. Chiplets rapidly gained traction because smaller chips are cheaper to make, while bundles of them can top the performance of any single slice of silicon. The strategy, based on advanced packaging technology, has since become an essential tool to enabling progress in semiconductors. And it represents one of the biggest shifts in years for an industry that drives innovations in fields like artificial intelligence, self-driving cars and military hardware. "Packaging is where the action is going to be," said Subramanian Iyer, a professor of electrical and computer engineering at the University of California, Los Angeles, who helped pioneer the chiplet concept. "It's happening because there is actually no other way."

The catch is that such packaging, like making chips themselves, is overwhelmingly dominated by companies in Asia. Although the United States accounts for around 12 percent of global semiconductor production, American companies provide just 3 percent of chip packaging, according to IPC, a trade association. That issue has now landed chiplets in the middle of U.S. industrial policymaking. The CHIPS Act, a $52 billion subsidy package that passed last summer, was seen as President Biden's move to reinvigorate domestic chip making by providing money to build more sophisticated factories called "fabs." But part of it was also aimed at stoking advanced packaging factories in the United States to capture more of that essential process... The Commerce Department is now accepting applications for manufacturing grants from the CHIPS Act, including for chip packaging factories. It is also allocating funding to a research program specifically on advanced packaging...

Some chip packaging companies are moving quickly for the funding. One is Integra Technologies in Wichita, Kan., which announced plans for a $1.8 billion expansion there but said that was contingent on receiving federal subsidies. Amkor Technology, an Arizona packaging service that has most of its operations in Asia, also said it was talking to customers and government officials about a U.S. production presence... Packaging services still need others to supply the substrates that chiplets require to connect to circuit boards and one another... But the United States has no major makers of those substrates, which are primarily produced in Asia and evolved from technologies used in manufacturing circuit boards. Many U.S. companies have also left that business, another worry that industry groups hope will spur federal funding to help board suppliers start making substrates.

In March, Mr. Biden issued a determination that advanced packaging and domestic circuit board production were essential for national security, and announced $50 million in Defense Production Act funding for American and Canadian companies in those fields. Even with such subsidies, assembling all the elements required to reduce U.S. dependence on Asian companies "is a huge challenge," said Andreas Olofsson, who ran a Defense Department research effort in the field before founding a packaging start-up called Zero ASIC. "You don't have suppliers. You don't have a work force. You don't have equipment. You have to sort of start from scratch."

Open Source

Despite Layoffs, Open Source and Linux Skills are Still in Demand (zdnet.com) 36

ZDNet reports that Jim Zemlin, executive director at the Linux Foundation, recently noted rounds of tech-industry layoffs "in the name of cost-cutting." But then Zemlin added that "open source is countercyclical to these trends. The Linux Foundation itself, for instance, had its best first quarter ever."

As Hilary Carter, SVP of research and communications at the Linux Foundation, said in her keynote speech at Open Source Summit North America in Vancouver, Canada: "In spite of what the headlines are saying, the facts are 57% of organizations are adding workers this year." Carter was quoting figures from the Linux Foundation's latest job survey, which was released at the event.

Other research also points to brighter signs in tech employment trends. CompTIA's recent analysis of the latest Bureau of Labor Statistics (BLS) data suggests the tech unemployment rate climbed to just 2.3% in April. In fact, more organizations plan to increase their technical staff levels than to decrease them.

The demand for skilled tech talent remains strong, particularly in fast-developing areas, such as cloud and containers, cybersecurity, and artificial intelligence and machine learning. So, what do all these areas of technology have in common? The answer is they're all heavily dependent on open source and Linux technologies.

While layoffs are happening at Microsoft, Amazon, Google, IBM, and even Red Hat, "the Linux Foundation found senior technical roles are seeing the biggest cuts," the article points out. "New hiring is focused on developers and IT managers." And companies are also spending more on training for existing technical staff, "driven by the fact that there aren't enough experts in hot technologies, such as Kubernetes and generative AI, to go around." Interestingly, a college degree is no longer seen as such a huge benefit. Businesses responding to the Linux Foundation's research felt upskilling (91%) and certifications (77%) are more important than a university education (58%) when it comes to addressing technology needs.
Science

Qubits 30 Meters Apart Maintain Entanglement Across Refrigeration Systems (arstechnica.com) 40

"A new experiment uses superconducting qubits to demonstrate that quantum mechanics violates what's called local realism," reports Ars Technica, "by allowing two objects to behave as a single quantum system no matter how large the separation between them." The experiment wasn't the first to show that local realism isn't how the Universe works — it's not even the first to do so with qubits. But it's the first to separate the qubits by enough distance to ensure that light isn't fast enough to travel between them while measurements are made. And it did so by cooling a 30-meter-long aluminum wire to just a few millikelvin. Because the qubits are so easy to control, the experiment provides a new precision to these sorts of measurements.

And the hardware setup may be essential for future quantum computing efforts... Everyone working with superconducting qubits says that we will ultimately need to integrate thousands of them into a single quantum computer. Unfortunately, each of these qubits requires a considerable amount of space on a chip, meaning it gets difficult to make chips with more than a few hundred of them. So major players like Google and IBM ultimately plan to link multiple chips into a single computer (something the startup Rigetti is already doing).

For tens of thousands of qubits, however, we're almost certainly going to need so many chips that it gets difficult to keep them all in a single bit of cooling hardware. This means we're going to eventually want to link chips in different refrigeration systems — exactly what was demonstrated here. So this is an important demonstration that we can, in fact, link qubits across these sorts of systems.
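For the curious, the "local realism" test behind results like this is the CHSH inequality: any local-hidden-variable theory keeps a particular combination of measurement correlations at or below 2, while quantum mechanics allows up to 2√2. A minimal sketch of that quantum prediction, using the textbook singlet-state correlation function (not anything from the paper's actual data):

```python
import math

# Singlet-state correlation predicted by quantum mechanics for
# measurement angles a and b (in radians): E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices.
a1, a2 = 0.0, math.pi / 2              # Alice's two settings
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

# CHSH combination: local hidden variables require |S| <= 2.
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))      # ~2.828, i.e. 2*sqrt(2), the quantum maximum
print(abs(S) > 2)  # True: local realism is violated
```

The experiment's contribution isn't the inequality itself but closing the "locality loophole": with 30 meters between the qubits, the measurements complete before light could carry information from one side to the other.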

Or, as long-time Slashdot reader nounderscores puts it, "Imagine a beowulf cluster of these.

"The Qbits that Simon Storz et al at ETH Zurich entangled at the ends of 30m of cryogenically chilled wire not only put the last nail into the coffin of hidden variable theory by being so far apart, they also allow quantum computing to scale to multiple refrigeration systems."
AI

IBM Unveils New Watsonx, AI and Data Platform (reuters.com) 22

IBM on Tuesday launched watsonx, a new artificial intelligence and data platform to help companies integrate AI in their business. From a report: The new AI platform launch comes over a decade after IBM's software called Watson got attention for winning the game show Jeopardy. IBM at the time said Watson could "learn" and process human language. But Watson's high cost at the time made it a challenge for companies to use, according to Reuters reporting. Fast forward a decade, chatbot ChatGPT's overnight success is making AI adoption at companies a focus, and IBM is looking to grab new business. This time, the lower cost of implementing the large language AI models means the chances of success are high, IBM CEO Arvind Krishna told Reuters ahead of the company's annual Think conference.

"When something becomes 100 times cheaper, it really sets up an attraction that's very, very different," said Krishna. "The first barrier to create the model is high, but once you've done that, to adapt that model for a hundred or a thousand different tasks is very easy and can be done by a non-expert." Krishna said AI could reduce certain back office jobs at IBM in the coming years. "That doesn't mean the total employment decreases," he said about some media reports talking about IBM pausing hiring for thousands of jobs that AI could replace. "That gives the ability to plow a lot more investment into value-creating activities...We hired more people than were let go because we're hiring into areas where there is a lot more demand from our clients."

Businesses

IBM Chief's Message To Remote Workers: 'Your Career Does Suffer' (bloomberg.com) 184

IBM CEO Arvind Krishna said he's not forcing any of the company's remote workers to come into the office just yet, but warns that those who don't "would be hard-pressed to get promoted, especially into managerial roles," reports Bloomberg. From the report: "Being a people manager when you're remote is just tough because if you're managing people, you need to be able to see them once in a while," he said in an interview Monday in New York. "It doesn't need to be every minute. You don't need to function under those old 'Everybody's under my eye' kind of rules, but at least sometimes." "It seems to me that we work better when we are together in person," said Krishna, who described the company's return-to-office policy as "we encourage you to come in, we expect you to come in, we want you to come in." Three days a week is the number they encourage, he said.

While about 80% of IBM's employees work from home at least some of the time, Krishna said remote arrangements are best suited for specific "individual contributor" roles like customer service or software programming. "In the short term you probably can be equally productive, but your career does suffer," he said. "Moving from there to another role is probably less likely because nobody's observing them in another context. It will be tougher. Not impossible, but probably a lot tougher."

Krishna, who became CEO right after the pandemic hit in April 2020, said people make a choice to work remotely, but it need not be "a forever choice -- it could be a choice based on convenience or circumstance." Remote workers, he said, don't learn how to do things like deal with a difficult client, or how to make trade-offs when designing a new product. "I don't understand how to do all that remotely," he said.

Businesses

IBM To Pause Hiring In Plan To Replace 7,800 Jobs With AI 129

IBM CEO Arvind Krishna told Bloomberg that the company expects to pause hiring for roles as roughly 7,800 jobs could be replaced by AI in the coming years. Reuters reports: Hiring specifically in back-office functions such as human resources will be suspended or slowed, Krishna said, adding that 30% of non-customer-facing roles could be replaced by AI and automation in five years. The reduction could include not replacing roles vacated by attrition, the company told the publication.
Open Source

Red Hat's 30th Anniversary: How a Microsoft Competitor Rose from an Apartment-Based Startup (msn.com) 47

For Red Hat's 30th anniversary, North Carolina's News & Observer newspaper ran a special four-part series of articles.

In the first article Red Hat co-founder Bob Young remembers Red Hat's first big breakthrough: winning InfoWorld's "OS of the Year" award in 1998 — at a time when Microsoft's Windows controlled 85% of the market. "How is that possible," Young said, "that one of the world's biggest technology companies, on this strategically critical product, loses the product of the year to a company with 50 employees in the tobacco fields of North Carolina?" The answer, he would tell the many reporters who suddenly wanted to learn about his upstart company, strikes at "the beauty" of open-source software.

"Our engineering team is an order of magnitude bigger than Microsoft's engineering team on Windows, and I don't really care how many people they have," Young would say. "Like they may have thousands of the smartest operating system engineers that they could scour the planet for, and we had 10,000 engineers by comparison...."

Young was a 40-year-old Canadian computer equipment salesperson with a software catalog when he noticed what Marc Ewing was doing. [Ewing was a recent college graduate bored with his two-month job at IBM, selling customized Linux as a side hustle.] It's pretty primitive, but it's going in the right direction, Young thought. He began reselling Ewing's Red Hat product. Eventually, he called Ewing, and the two met at a tech conference in New York City. "I needed a product, and Marc needed some marketing help," said Young, who was living in Connecticut at the time. "So we put our two little businesses together."

Red Hat incorporated in March 1993, with the earliest employees operating the nascent business out of Ewing's Durham apartment. Eventually, the landlord discovered what they were doing and kicked them out.

The four articles capture the highlights. ("A visual effects group used its Linux 4.1 to design parts of the 1997 film Titanic.") And it doesn't leave out Red Hat's skirmishes with Microsoft. ("Microsoft was owned by the richest person in the world. Red Hat engineers were still linking servers together with extension cords.") "We were changing the industry and a lot of companies were mad at us," says Michael Ferris, Red Hat's VP of corporate development/strategy. Soon there were corporate partnerships with Netscape, Intel, Hewlett-Packard, Compaq, Dell, and IBM — and when Red Hat finally went public in 1999, its stock saw the eighth-largest first-day gain in Wall Street history, rising in value within days to over $7 billion and "making overnight millionaires of its earliest employees."

But there's also inspiring details like the quote painted on the wall of Red Hat's headquarters in Durham: "Every revolution was first a thought in one man's mind; and when the same thought occurs to another man, it is the key to that era..." It's fun to see the story told by a local newspaper, with subheadings like "It started with a student from Finland" and "Red Hat takes on the Microsoft Goliath."

Something I'd never thought of. 2001's 9/11 terrorist attack on the World Trade Center "destroyed the principal data centers of many Wall Street investment banks, which were housed in the twin towers. With their computers wiped out, financial institutions had to choose whether to rebuild with standard proprietary software or the emergent open source. Many picked the latter." And by the mid-2000s, "Red Hat was the world's largest provider of Linux..." according to part two of the series. "Soon, Red Hat was servicing more than 90% of Fortune 500 companies." By then, even the most vehement former critics were amenable to Red Hat's kind of software. Microsoft had begun to integrate open source into its core operations. "Microsoft was on the wrong side of history when open source exploded at the beginning of the century, and I can say that about me personally," Microsoft President Brad Smith later said.

In the 2010s, "open source has won" became a popular tagline among programmers. After years of fighting for legitimacy, former Red Hat executives said victory felt good. "There was never gloating," said Michael Tiemann, a longtime Red Hat executive.

"But there was always pride."

In 2017 Red Hat's CEO answered questions from Slashdot's readers.
Red Hat Software

Red Hat Begins Cutting 'Hundreds of Jobs' (phoronix.com) 49

According to Phoronix, citing multiple local North Carolina news outlets, Red Hat is cutting "hundreds of jobs" in an initial round of layoffs announced today. From the report: According to WRAL, Red Hat CEO Matt Hicks is said to have told employees in an email "we will not reduce roles directly selling to customers or building our products," which is hopefully good news for the many upstream Linux developers they employ that ultimately build Red Hat Enterprise Linux and associated software products. Red Hat will begin notifying affected employees today in some countries while the process will continue through the end of the quarter. IBM, which acquired Red Hat in 2019, has already slashed some five thousand positions so far in 2023.
Virtualization

QEMU 8.0 Released with More ARM and RISC-V Emulation (9to5linux.com) 23

There's a major new update of QEMU, the open-source machine emulator, reports 9to5Linux: Coming a year after QEMU 7.0, the QEMU 8.0 release is here to improve support for ARM and RISC-V architectures.

- For ARM, it adds emulation support for FEAT_EVT, FEAT_FGT, and AArch32 ARMv8-R, CPU emulation for Cortex-A55 and Cortex-R52, support for a new Olimex STM32 H405 machine type, as well as gdbstub support for M-profile system registers.

- For the RISC-V architecture, QEMU 8.0 brings updated machine support for OpenTitan, PolarFire, and OpenSBI, additional ISA and Extension support for smstateen, native debug icount trigger, cache-related PMU events in virtual mode, Zawrs/Svadu/T-Head/Zicond extensions, and ACPI support. Moreover, RISC-V received multiple fixes covering PMP propagation for TLB, mret exceptions, uncompressed instructions, and other emulation/virtualization improvements.

Improvements were also made for the s390x (IBM Z) platform, the HP Precision Architecture (HPPA) platform, and x86.
United States

A Tech Industry Pioneer Sees a Way for the US To Lead in Advanced Chips (nytimes.com) 17

Ivan Sutherland played a key role in foundational computer technologies. Now he sees a path for America to claim the mantle in "superconducting" chips. From a report: It has been six decades since Ivan Sutherland created Sketchpad, a software system that foretold the future of interactive and graphical computing. In the 1970s, he played a role in rallying the computer industry to build a new type of microchip with hundreds of thousands of circuits that would become the foundation of today's semiconductor industry. Now Dr. Sutherland, who is 84, believes the United States is failing at a crucial time to consider alternative chip-making technologies that would allow the country to reclaim the lead in building the most advanced computers.

By relying on supercooled electronic circuits that switch without electrical resistance and as a consequence generate no excess heat at higher speeds, computer designers will be able to circumvent the greatest technological barrier to faster machines, he claims. "The nation that best seizes the superconducting digital circuit opportunity will enjoy computing superiority for decades to come," he and a colleague recently wrote in an essay that circulated among technologists and government officials. Dr. Sutherland's insights are significant partly because decades ago he was instrumental in helping to create today's dominant approach to making computer chips.

In the 1970s, Dr. Sutherland, who was chairman of the computer science department at the California Institute of Technology, and his brother Bert Sutherland, then a research manager at a division of Xerox called the Palo Alto Research Center, introduced the computer scientist Lynn Conway to the physicist Carver Mead. They pioneered a design based on a type of transistor, known as complementary metal-oxide semiconductor, or CMOS, which was invented in the United States. It made it possible to manufacture the microchips used by personal computers, video games and the vast array of business, consumer and military products. Now Dr. Sutherland is arguing that an alternative technology that predates CMOS, and has had many false starts, should be given another look. Superconducting electronics was pioneered at the Massachusetts Institute of Technology in the 1950s and then pursued by IBM in the 1970s before being largely abandoned. At one point, it even made an odd international detour before returning to the United States.

IBM

GlobalFoundries Sues IBM, Says Trade Secrets Were Unlawfully Given To Japan's Rapidus (reuters.com) 4

Chip manufacturer GlobalFoundries said today it had filed a lawsuit against IBM, accusing it of unlawfully sharing confidential intellectual property and trade secrets. From a report: New York-based GlobalFoundries said in its complaint that IBM had shared IP and trade secrets with Rapidus, a new state-backed Japanese consortium that IBM is working with to develop and produce cutting-edge two-nanometre chips. It also asserted that IBM had unlawfully disclosed and misused its IP with Intel, noting that IBM had announced in 2021 it would collaborate with Intel on next-generation chip technology. "IBM is unjustly receiving potentially hundreds of millions of dollars in licensing income and other benefits," GlobalFoundries said in a statement. IBM pushed back in an emailed statement to Reuters saying: "GlobalFoundries filed this meritless lawsuit after a court rejected the company's attempt to dismiss IBM's legitimate fraud and breach of contract claims. Their allegations are entirely baseless, and we are confident that the court will agree."
Education

Should Managers Permanently Stop Requiring Degrees for IT Positions? (cio.com) 214

CIO magazine reports on "a growing number of managers and executives dropping degree requirements from job descriptions." Figures from the 2022 study The Emerging Degree Reset from The Burning Glass Institute quantify the trend, reporting that 46% of middle-skill and 31% of high-skill occupations experienced material degree resets between 2017 and 2019. Moreover, researchers calculated that 63% of those changes appear to be "'structural resets' representing a measured and potentially permanent shift in hiring practices" that could make an additional 1.4 million jobs open to workers without college degrees over the next five years.

Despite such statistics and testimony from Taylor and other IT leaders, the debate around whether a college education is needed in IT isn't settled. Some say there's no need for degrees; others say degrees are still preferred or required.... IBM is among the companies whose leaders have moved away from degree requirements; Big Blue is also one of the earliest, largest, and most prominent proponents of the move, introducing the term "new collar jobs" for the growing number of positions that require specific skills but not a bachelor's degree....

Not all are convinced that dropping degree requirements is the way to go, however. Jane Zhu, CIO and senior vice president at Veritas Technologies, says she sees value in degrees, value that isn't always replicated through other channels. "Though we don't necessarily require degrees for all IT roles here at Veritas, I believe that they do help candidates demonstrate a level of formal education and commitment to the field and provide a foundation in fundamental concepts and theories of IT-related fields that may not be easily gained through self-study or on-the-job training," she says. "Through college education, candidates have usually acquired basic technical knowledge, problem-solving skills, the ability to collaborate with others, and ownership and accountability. They also often gain an understanding of the business and social impacts of their actions."

The article notes an evolving trend of "more openness to skills-based hiring for many technical roles but a desire for a bachelor's degree for certain positions, including leadership." (Kelli Jordan, vice president of IBMer Growth and Development tells CIO that more than half of the job openings posted by IBM no longer require degrees.)

Thanks to Slashdot reader snydeq for sharing the article.
Red Hat Software

Biggest Linux Company of Them All Still Pushing To Become Cloud Power (theregister.com) 23

An anonymous reader shares a report: For Red Hat, which turned 30 on March 27, it was a cause for celebration. From a business that got started in the sewing room of a co-founder's wife, it became the first billion-dollar pure-play open-source company and then the engine driving IBM. It has been a long strange trip. Sure, today, the tech world is dominated by Linux and open source software, but in 1993, Linux was merely an obscure operating system known only to enthusiasts. Red Hat played a significant role in transforming the "just a hobby" operating system into today's major IT powerhouse. Red Hat co-founder Bob Young, who previously ran a rental typewriter business, was one of those who became intrigued by Linux. In 1993, he established ACC Corporation, a catalog company that distributed Slackware Linux CDs and open-source software.

[...] In 2003, Paul Cormier, then Red Hat's vice president of engineering and now the company's chairman, spearheaded the shift from the inexpensive prosumer Red Hat Linux distribution to the full business-oriented Red Hat Enterprise Linux (RHEL). At the time, many Linux users hated the idea. Even inside Red Hat, Cormier said that many engineers were initially opposed to the new business model, causing some to leave the company while others stayed. The change also upset many users who felt Red Hat was abandoning its original customers. However, enterprise clients had a different perspective. Jim Whitehurst, who became Red Hat CEO in 2008, said, "Once RHEL was in the market, we had to fully support it to make it truly consumable for the enterprise." They succeeded, and Red Hat continued to grow. This is the model that turned Red Hat into the first billion-dollar-a-quarter pure open-source company. Impressive for a business built around an operating system once considered suitable only for the "lunatic fringe." Then, in a deal announced in 2018 and closed in 2019, IBM acquired Red Hat for a cool $34 billion. There was nothing crazy about that move.

[...] Another change that was already present in Red Hat, a shift towards supporting the cloud, has accelerated. Today, while RHEL remains the heart of the business, the Linux-powered cloud has become increasingly important. In particular, Red Hat OpenShift, its Kubernetes-powered hybrid cloud application platform, is more important than ever. Where does Red Hat go from here? When I last talked to Cormier and Red Hat's latest CEO, Matt Hicks, they told me that they'd keep moving forward with the hybrid cloud. After all, as Cormier pointed out, "the cloud wouldn't be here" without Linux and open source. As for Red Hat's relationship with IBM, Cormier said, "The red lines were red, and the blue lines were blue, and that will stay the same."

IBM

IBM Explores Sale of Weather Business (wsj.com) 32

International Business Machines is exploring a sale of its weather operation, WSJ reported, citing people familiar with the matter, as the technology company seeks to streamline its operations. From a report: An auction of the business is at an early stage, the people said, and there may not be a deal. Should there be one, a private-equity firm is the most likely buyer, in a deal that could be valued at more than $1 billion, the people said. IBM agreed to buy the business in 2015, purchasing The Weather Company's business-to-business, mobile and cloud-based businesses including Weather.com, which provides weather forecasts around the globe. The deal price at the time was pegged at more than $2 billion.

The Weather Channel wasn't part of the deal, but agreed to license weather-forecast data and analytics from IBM. The deal was part of a push by IBM to use its cloud infrastructure to provide accurate weather forecasts and help companies control costs. The business issues more than 25 billion forecasts a day, according to the company's website. A sale of the weather unit would be a part of a broader push by IBM to streamline its operations as the once-dominant company's shares languish near levels they traded at more than 20 years ago.

IBM

New Models of IBM Model F Keyboard Mark II Incoming (theregister.com) 46

An anonymous reader quotes a report from The Register: What's even harder-core than the IBM Model M? The Model F, the keyboard that launched alongside the IBM PC in 1981. After a 2017 relaunch, new models with the original layout are here. The project, which back in 2017 relaunched a modern keyboard inspired by a compact space-saver version of IBM's classic Model F, is launching its second generation of brand-new premium input devices, and this time, various layouts will be available. [...]

Enter the New Model F Keyboards project. "Ellipse" launched it in 2017 and attracted over $300,000 worth of orders, even at $399 each. Aside from the not-inconsiderable price, what put the author off was the layout. Space-saving and reduced-footprint keyboards are very popular among serious keyboard collectors, and the project chose two space-saver layouts from IBM's 4704 terminal, dubbed the Kishsaver after the collector who described it. The F77 layout has a numeric keypad, but no function keys; the even smaller F62 layout omits the keypad as well. As the cool kids say, it's a TKL layout, which we are informed stands for tenkeyless, presumably because it has 15 fewer keys.

Which is why the FOSS desk's bank account would tremble in fear if it were not an inanimate table in a database somewhere, because the Model F project has announced a new range, including full-size and compact 104-key layouts and, most appealing to this large and heavy-handed vulture, a replica of the 122-key IBM Battleship, one of which we've been hunting for over a decade. The project occasionally has refurbished original IBM units. Now, though, a brand-new one is a $420 option. If that isn't exclusive enough, the project is also working on a model with beam springs, the mechanism from 1970s IBM business products. The first model of the brand-new beam spring units is a mere $579.

Security

Ransomware Crooks Are Exploiting IBM File-Exchange Bug With a 9.8 Severity (arstechnica.com) 18

Threat actors are exploiting a critical vulnerability in an IBM file-exchange application in hacks that install ransomware on servers, security researchers have warned. From a report: The IBM Aspera Faspex is a centralized file-exchange application that large organizations use to transfer large files or large volumes of files at very high speeds. Rather than relying on TCP-based technologies such as FTP to move files, Aspera uses IBM's proprietary FASP -- short for Fast, Adaptive, and Secure Protocol -- to better utilize available network bandwidth. The product also provides fine-grained management that makes it easy for users to send files to a list of recipients in distribution lists or shared inboxes or workgroups, giving transfers a workflow that's similar to email.

In late January, IBM warned of a critical vulnerability in Aspera versions 4.4.2 Patch Level 1 and earlier and urged users to install an update to patch the flaw. Tracked as CVE-2022-47986, the vulnerability makes it possible for unauthenticated threat actors to remotely execute malicious code by sending specially crafted calls to an outdated programming interface. The ease of exploiting the vulnerability and the damage that could result earned CVE-2022-47986 a severity rating of 9.8 out of a possible 10. On Tuesday, researchers from security firm Rapid7 said they recently responded to an incident in which a customer was breached using the vulnerability.
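That 9.8 falls out of the standard CVSS v3.1 base-score formula. As a sketch, assuming the vector typically published for an unauthenticated, network-reachable remote code execution flaw like this one (AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H), the arithmetic works out like so:

```python
import math

# CVSS v3.1 base-metric weights, from the FIRST specification
# (privileges-required weights shown for scope "unchanged").
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}   # attack vector
AC = {"L": 0.77, "H": 0.44}                         # attack complexity
PR = {"N": 0.85, "L": 0.62, "H": 0.27}              # privileges required
UI = {"N": 0.85, "R": 0.62}                         # user interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}              # C/I/A impact

def base_score(av, ac, pr, ui, c, i, a):
    """CVSS v3.1 base score, scope unchanged."""
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    raw = min(impact + exploitability, 10)
    return math.ceil(raw * 10) / 10  # spec says round *up* to one decimal

# Network / low complexity / no privileges / no interaction / high C-I-A:
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # 9.8
```

Nothing special about Aspera is encoded here; any pre-auth network RCE with total impact lands at 9.8, which is why that number shows up so often in advisories.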

AI

Developer Builds a ChatGPT Client for MS-DOS (yeokhengmeng.com) 54

"With the recent attention on ChatGPT and OpenAI's release of their APIs, many developers have developed clients for modern platforms to talk to this super smart AI chatbot," writes maker/retro coding enthusiast yeokm1 . "However I'm pretty sure almost nobody has written one for a vintage platform like MS-DOS."

They share a blog post with all the details — including footage of their client ultimately running on a vintage IBM PC from 1984 (with a black and orange monitor and those big, boxy keys). "3.5 years ago, I wrote a Slack client to run on Windows 3.1," the blog post explains. "I thought to try something different this time and develop for an even older platform as a challenge."

One challenge was just finding a networking API for DOS. But everything came together, with the ChatGPT-for-DOS app written using the Visual Studio Code text editor (and tested on a virtual machine running DOS 6.22), parsing the JSON output from OpenAI's Chat Completion API. "And before you ask, I did not use ChatGPT for help to code this app in any way," the blog post concludes. But after the app was working, he used it to ask ChatGPT how one would build such an app. ChatGPT breezily, and erroneously, suggested that he just try accessing OpenAI's Python API from the DOS command line.

"What is the AI smoking...?"

IBM

IBM Installs World's First Quantum Computer for Accelerating Healthcare Research (insidehpc.com) 44

It's one of America's best hospitals — a nonprofit "academic medical center" called the Cleveland Clinic. And this week it installed an IBM-managed quantum computer to accelerate healthcare research (according to an announcement from IBM). IBM is calling it "the first quantum computer in the world to be uniquely dedicated to healthcare research."

The clinic's CEO said the technology "holds tremendous promise in revolutionizing healthcare and expediting progress toward new cares, cures and solutions for patients." IBM's CEO added that "By combining the power of quantum computing, artificial intelligence and other next-generation technologies with Cleveland Clinic's world-renowned leadership in healthcare and life sciences, we hope to ignite a new era of accelerated discovery."

Inside HPC points out that "IBM Quantum System One" is part of a larger biomedical research program applying high-performance computing, AI, and quantum computing, with IBM and the Cleveland Clinic "collaborating closely on a robust portfolio of projects with these advanced technologies to generate and analyze massive amounts of data to enhance research." The Cleveland Clinic-IBM Discovery Accelerator has generated multiple projects that leverage the latest in quantum computing, AI and hybrid cloud to help expedite discoveries in biomedical research. These include:

- Development of quantum computing pipelines to screen and optimize drugs targeted to specific proteins;

- Improvement of a quantum-enhanced prediction model for cardiovascular risk following non-cardiac surgery;

- Application of artificial intelligence to search genome sequencing findings and large drug-target databases to find effective, existing drugs that could help patients with Alzheimer's and other diseases.


The Discovery Accelerator also serves as the technology foundation for Cleveland Clinic's Global Center for Pathogen & Human Health Research, part of the Cleveland Innovation District. The center, supported by a $500 million investment from the State of Ohio, Jobs Ohio and Cleveland Clinic, brings together a team focused on studying, preparing and protecting against emerging pathogens and virus-related diseases. Through the Discovery Accelerator, researchers are leveraging advanced computational technology to expedite critical research into treatments and vaccines.

Earth

Chipmakers Fight Spread of US Crackdowns on 'Forever Chemicals' 37

Intel and other semiconductor companies have joined together with industrial materials businesses to fight US clampdowns on "forever chemicals," substances used in myriad products that are slow to break down in the environment. From a report: The lobbying push from chipmakers broadens the opposition to new rules and bans for the chemicals known as PFAS. The substances have been found in the blood of 97 per cent of Americans, according to the US government. More than 30 US states this year are considering legislation to address PFAS, according to Safer States, an environmental advocacy group. Bills in California and Maine passed in 2022 and 2021, respectively.

"I think clean drinking water and for farmers to be able to irrigate their fields is far more important than a microchip," said Stacy Brenner, a Maine state senator who backed the state's bipartisan legislation. In Minnesota, bills would ban by 2025 certain products that contain added PFAS -- which is short for perfluoroalkyl and polyfluoroalkyl substances -- in legislation considered to be some of the toughest in the country. The Semiconductor Industry Association -- whose members include Intel, IBM and Nvidia -- has cosigned letters opposing the Minnesota legislation, arguing its measures are overly broad and could prohibit thousands of products, including electronics. Chipmakers also opposed the California and Maine laws.

Programming

Programming Pioneer Grady Booch on Functional Programming, Web3, and Conscious Machines (infoworld.com) 76

InfoWorld interviews Grady Booch, chief scientist for software engineering at IBM Research (who is also a pioneer of design patterns and agile methods, and one of the creators of UML).

Here are some of the highlights:

Q: Let me begin by asking something "of the moment." There has been an almost cultural war between object-oriented programming and functional programming. What is your take on this?

Booch: I had the opportunity to conduct an oral history with John Backus — one of the pioneers of functional programming — in 2006 on behalf of the Computer History Museum. I asked John why functional programming didn't enter the mainstream, and his answer was perfect: "Functional programming makes it easy to do hard things," he said, "but functional programming makes it very difficult to do easy things...."


Q: Would you talk a bit about cryptography and Web3?

Booch: Web3 is a flaming pile of feces orbiting a giant dripping hairball. Cryptocurrencies — ones not backed by the full faith and credit of stable nation states — have only a few meaningful use cases, particularly if you are a corrupt dictator of a nation with a broken economic system, or a fraud and scammer who wants to grow their wealth at the expense of greater fools. I was one of the original signatories of a letter to Congress in 2022 for a very good reason: these technologies are inherently dangerous, they are architecturally flawed, and they introduce an attack surface that threatens economies....


Q: What do you make of transhumanism?

Booch: It's a nice word that has little utility for me other than as something people use to sell books and to write clickbait articles....


Q: Do you think we'll ever see conscious machines? Or, perhaps, something that compels us to accept them as such?

Booch: My experience tells me that the mind is computable. Hence, yes, I have reason to believe that we will see synthetic minds. But not in my lifetime; or yours; or your children; or your children's children. Remember, also, that this will likely happen incrementally, not with a bang, and as such, we will co-evolve with these new species.
