The US Office of the Under Secretary of Defense for Acquisition & Sustainment (OUSD A&S) released its “Fiscal Year 2020 Industrial Capabilities Report to Congress” a few weeks ago. It breaks down the industrial base supporting the US armed forces by both industrial sector and technology area. Notably, the report discusses many critical “miniature” technologies that carry significant foreign-dependency risks.
A Super-Short History of Silicon Valley
We talked about semiconductors a few weeks ago, but a history lesson is needed to understand why we are facing such a shortage and shrinking industrial base. Let’s go back in time to understand how the semiconductor industry took hold in the US starting in the 1950s.
1950s - The transistor, the foundational component of integrated circuits, was brought to the Santa Clara Valley through the formation of the Shockley Semiconductor Laboratory by William Shockley in 1955. In 1957, the “traitorous eight,” including Gordon Moore (future co-founder of Intel), left the lab to start Fairchild Semiconductor. The US Navy, Air Force, and NASA took notice and set up nearby in order to respond to the Soviet Union’s Sputnik. Silicon Valley firms began to land large government contracts to supply the military with high-performance integrated circuits.
1960s - The military kept the pressure on improving process technology to make semiconductors more reliable and smaller for military applications. NASA was said to be buying 60 percent of all integrated circuits produced within the US in the mid-1960s. By the end of the decade the cost of an integrated circuit had fallen from ~$30 to ~$1.
1970s - Intel, co-founded by Gordon Moore, creates the world’s first commercial microprocessor chip in 1971 and makes subsequent advances throughout the decade. The venture capital industry begins to take hold with the arrival of Kleiner Perkins and Sequoia Capital, capped just after the decade’s close by the December 1980 IPO of Apple Computer. NASA, the US Air Force, and ARPA conceive of ARPANET, the precursor to the Internet.
1980s - Key software tools, designs, and protocols are developed to make use of microprocessors which have become smaller and use less power. Hardware (integrated circuits and microprocessors) designs and architectures continue to proliferate.
1990s - Venture capital starts flowing primarily to software firms that focus on Internet applications rather than hardware designs. US government involvement in the Silicon Valley microelectronics ecosystem largely evaporates.
2000s - The investment focus remains on software, as hardware companies struggle with the large capital investment required to build foundries and new semiconductor manufacturing processes, while pure-play software business models emerge as cheaper to scale.
2010s - The remaining national champions of semiconductor technology begin to show cracks (Intel) or outsource the manufacture of their chips completely (Qualcomm, NVIDIA, AMD).
Why Does this Matter?
The key takeaway is that, “The government’s willingness to take risks on new technology and to promote its use were significant drivers in creating a strong industrial base in microelectronics,” wrote Anna Slomovic while at the RAND Corporation in a 1988(!) paper. Government investments spurred the development of thousands of small businesses and led to the creation of a few global behemoths, and other countries quickly copied this model!
Now in the 2020s, nations such as Korea, Taiwan, and Japan have caught up to the US in semiconductor foundry manufacturing technology through decades of government investment. In fact, “Taiwan Semiconductor Manufacturing Company (TSMC), the world’s largest semiconductor foundry, has echoed the government’s push to localize supply chains. The company seeks to increase its procurement of raw materials from domestic suppliers to 64 percent by 2030, 40 percent for backend equipment, and 60 percent for components,” according to Taiwan News. The United States needs significantly more than a ‘Made in America’ Executive Order to maintain its manufacturing pre-eminence, as foreign direct investment into China eclipsed that into the US for the first time ever in 2020.
How Tesla Builds Batteries So Fast
AI in Manufacturing: How It's Used and Why It's Important for Future Factories
The fully autonomous factory has always been a provocative vision, much used in speculative fiction. It’s a place that’s nearly unmanned and run entirely by artificial intelligence (AI) systems directing robotic production lines. But this is unlikely to be the way AI will be employed in manufacturing within the practical planning horizon.
The realistic conception of AI in manufacturing looks more like a collection of applications for compact, discrete systems that manage specific manufacturing processes. They will operate more or less autonomously and respond to external events in increasingly intelligent and even humanlike ways, with events ranging from a tool wearing out to a system outage to a fire or natural disaster.
This Startup's Software Programs Industrial Robots, Without Coding
Singapore-based startup Augmentus, founded by IEEE Member Daryl Lim, Yong Shin Leong, and Chong Voon Foo, is trying to make automation more accessible with its intuitive robot-programming platform.
By making industrial robots easier to program, Lim says, the software can help businesses increase efficiency and reduce costs—which would in turn help retain local manufacturing jobs.
Industry 4.0 Solves The Billion-Dollar Misalignment Problem In Electronics Supply Chain
Electronics manufacturing loses billions of dollars every year due to misaligned incentives within the supply chain. These misalignments fester under the surface, leading to suboptimal results: lower margins, late shipments, and lower-trust relationships with suppliers.
But the most visionary supply chain and manufacturing leaders are realizing that Industry 4.0 and Smart Manufacturing technologies, traditionally billed as boosting productivity and Overall Equipment Effectiveness (OEE), are a secret weapon they can use to drive cultural change that corrects these misalignments. They are pushing these technologies to do double duty: driving core efficiency improvements while setting a new culture around them. By reevaluating the misaligned incentives that have developed in their supply chains over decades, these leaders are breaking the mold, empowering their employees, and driving results that save their companies tens of millions of dollars or more each year.
IoT Supply Chain Vulnerability Poses Threat to IIoT Security
Most companies that construct products with the aid of IIoT-based operations are likely to keep close tabs on the supply chain that provides a predictable stream of raw materials and services that allows them to crank out products and keep the business humming.
But a second, underlying supply chain receives less scrutiny. And if the security of that supply chain is somehow compromised, business could grind to a halt.
That overlooked supply chain delivers the components that build out an IIoT infrastructure. The purchaser of those devices sits at the end of a supply chain that, from a security perspective, lacks sufficient transparency. In fact, it would be a challenge to track the origins of the internal components that make up the delivered IIoT devices.
Boston Dynamics' Spot Robot Is Now Armed
The quadruped robot can now use an arm to interact with its environment semi-autonomously.
So the real question about this arm is whether Boston Dynamics has managed to get it to a point where it’s autonomous enough that users with relatively little robotics experience will be able to get it to do useful tasks without driving themselves nuts.
Edge-Inference Architectures Proliferate
What makes one AI system better than another depends on a lot of different factors, including some that aren’t entirely clear.
The new offerings exhibit a wide range of structure, technology, and optimization goals. All must be gentle on power, but some target wired devices while others target battery-powered devices, giving different power/performance targets. While no single architecture is expected to solve every problem, the industry is in a phase of proliferation, not consolidation. It will be a while before the dust settles on the preferred architectures.
Advantages of Migrating to Cloud for Enterprise Analytics Environment
We are a data team. We spend the bulk of our effort building data pipelines from operational systems into our Decision Support infrastructure. We synthesize analytical data assets from operational data flows and publish these assets for consumption across the enterprise. Our ETL pipelines are built on an in-house ETL framework, with workflows that run on MapReduce and are tuned with Apache Tez parameters, plus some workloads using Apache Spark. Data flows through a series of logical stages from various sources across the organization into “Raw,” “Cleansed,” and “Transformed” zones to build multiple fact tables suitable for enterprise use cases. The data is then flattened and loaded into consumption layers for ease of business analysis and reporting. This work is likely common across many companies today, and we hope that our story of overcoming a series of challenges through a cloud migration resonates with you and your teams.
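The staged flow the team describes (Raw → Cleansed → Transformed → flattened consumption layer) can be sketched in miniature. This is a hypothetical, framework-free illustration in plain Python; the team’s actual pipelines run on MapReduce/Tez and Spark, and every field name and rule below is invented for the example.

```python
# Hypothetical sketch of a staged ETL flow: Raw -> Cleansed -> Transformed -> consumption.
# Real pipelines of this shape would run on Spark or MapReduce/Tez, not in-memory lists.

raw_zone = [  # "Raw" zone: data as it arrives from source systems, messy strings included
    {"order_id": "A1", "amount": "100.5", "region": " east "},
    {"order_id": "A2", "amount": "bad", "region": "WEST"},
    {"order_id": "A3", "amount": "40.0", "region": "East"},
]

def cleanse(records):
    """Cleansed zone: drop unparseable rows and normalize string fields."""
    out = []
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine bad records, not silently drop them
        out.append({"order_id": r["order_id"],
                    "amount": amount,
                    "region": r["region"].strip().lower()})
    return out

def transform(records):
    """Transformed zone: aggregate cleansed rows into a fact-table-like structure."""
    facts = {}
    for r in records:
        facts[r["region"]] = facts.get(r["region"], 0.0) + r["amount"]
    return facts

def flatten(facts):
    """Consumption layer: flat rows that are easy to report and analyze against."""
    return [{"region": region, "total_amount": facts[region]} for region in sorted(facts)]

consumption = flatten(transform(cleanse(raw_zone)))
print(consumption)  # [{'region': 'east', 'total_amount': 140.5}]
```

Each stage reads only the previous zone’s output, which is what makes this layout easy to lift into cloud storage zone by zone during a migration.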