

What’s Happening to Technological Progress?

By cmdrda2leak   2019 Feb 21, 8:43am   2,231 views   17 comments

Barely a day goes by that we are not confronted with another warning of impending catastrophe. If it’s not another global financial crisis, a nuclear war or global warming, the next pandemic is surely going to get us and destroy all we know and hold dear. Until recently, one standard response to such fears was to express belief in the ever-increasing power of technology to get us out of any number of tight spots.

Will technological progress keep us going, helping us cope with the challenges that our over-complex societies seem to generate as a side effect? I’m not so sure anymore. Some of those ageing science fiction fans of the baby boomer generation are getting antsy: We were promised robots, as Shorvon Bhattacharya wrote a few years back. Where are those robots now?

We’re blinded by incremental progress in electronic gadgets of marginal utility—new smartphones, larger monitors, and more powerful computers. Yet we drive vehicles with internal combustion engines, our electricity is mostly generated in thermal power plants fueled by coal, we are far from curing solid cancers, and we are slowly losing the race against multi-resistant bacteria. The average Sci-Fi writer of the 50s, 60s and 70s would be very, very disappointed with the world of 2018. There is no colony on the moon, we don’t have fusion power, and there are no laser guns or sonic disruptors. No robots anywhere. We can’t even cure the common cold.

It is painfully obvious that technological progress between, say, 1968 and now can’t compare to the periods between 1818 and 1868, 1868 and 1918, or 1918 and 1968. This will very likely be recognized around the 50th anniversary of the first Moon landing in 2019, when lots of people will start to wonder why, 50 years after the greatest triumph of the Apollo programme, we don’t even have the capacity to repeat the technological feats of the late ’60s.

A particularly depressing aspect of this downturn in research productivity was recently analysed by Patrick Collison and Michael Nielsen in the pages of The Atlantic. They show that, despite an enormous increase in research activity over the last 50 years, there has been a marked decrease in return on investment: we are getting less bang for our buck. Major innovation is becoming increasingly scarce in the natural sciences, and most Nobel Prizes in the sciences are handed out for research that was done decades ago.

Something fundamental seems to have gone wrong. In fact, several issues are combining to slow down research and development. My argument is that the research enterprise is in serious trouble.


At its core, all civilization is about problem-solving. The more successful a society is at that, the more stable it is likely to be. When faced with new problems, we use methods that worked in the past. Traditionally, our problems were solved by creating new structures: committees, commissions of inquiry, new rules, laws, regulations and centralization. So why is it that we now commonly seem to make problems worse whenever that approach is used?

In my neck of the woods, the New South Wales public health system is a good example of a complex system beset by multiple problems. A decade ago, the Garland Report demonstrated the utter futility of the red-tape approach to problem-solving. We are investing in ways that generated positive returns in the past but are now useless; or if not useless, potentially harmful. Every year we spend more money on managers rather than on doctors or nurses, and even those doctors and nurses spend more and more time on non-productive tasks such as filling in paperwork. We now need seven or eight signatures to authorize advertising for a part-time admin assistant. Research governance has become a nightmare, and compliance costs are increasing steadily. This puts young colleagues off academic careers and has forced my unit to outsource research to Houston, San Francisco, Pretoria, Santiago de Chile, Prague and Las Palmas.

Investment in complexity that produces negative returns is a sure sign of a complex system that is ‘brittle’, or likely to fail catastrophically, as explained by Crawford Holling, one of the founders of the academic discipline of ‘social ecology’.

As if this wasn’t bad enough, there are clearly a number of societal factors that reinforce the effects of excessive complexity, and it is difficult not to see them as bound up with other shifts in cultural and moral norms. Western societies have become increasingly ‘feminine,’ with women achieving a much greater share of control over institutions and cultural activities. Naturally, this process has involved trade-offs. The last 200 years have seen a shift from what moral sociologists have described as the ‘honour’ or warrior culture that pre-dated the 19th century, to the ‘dignity’ culture of the 20th century, to the emergence of a ‘victimhood’ culture in the 21st.

Part of this shift includes an increasing aversion to risk. This risk aversion is sometimes crudely described as the ‘Nanny State,’ but at its core, it is defined by increasing regulation of all aspects of life, both public and private. Regulation, in principle, is about protecting the weak from the strong and the reduction of risk in general—the risk of drowning in a backyard pool, of poisoning from household chemicals, or the risk of dying in a crash due to faulty car components. Clearly, many of these incremental steps are positive advancements, and we have been making progress in this regard throughout the existence of our civilization. Yet increasing risk aversion also comes at a cost.

In Thinking, Fast and Slow, Daniel Kahneman provides an explanation of this fundamental property of the human mind. Risk aversion is the reluctance to trade any kind of risk for some other advantage. It is strong in most people, and possibly more so in women than in men. Of course, there have always been unusual individuals who approach risk rather than avoid it. People low in risk aversion drove the industrial revolution and have driven technological progress in general. Modern medicine is what it is today because researchers in the past took risks, sometimes for themselves, usually for their patients, and sometimes without telling them. If we were to apply today’s rules to the biomedical breakthroughs of the last 200 years, many of them would be considered unethical and illegal. We may condemn this history from our modern perspective, but that’s how we got to where we are today.

If current rules and regulations had been in existence in the 1900s and the first half of the 20th century, we would not have airplanes, air conditioning, antibiotics, automobiles, chlorine, the measles and smallpox vaccines, cardiac catheters, open heart surgery, radio, refrigeration or X-rays. In the past, risk aversion hampered only individuals, and if even one individual in a million was immune (that is, if one in a million did not share this risk aversion), that was enough for progress to occur. Now this risk aversion is firmly entrenched in legislation all over the world, and it is throttling innovation, leading to actions that Kahneman describes as “detrimental to the wealth of individuals, to the soundness of policy, and to the welfare of society.” The bottom line is that risk aversion is fundamentally bad for productivity, and especially for research and development.

At the same time that risk aversion is increasing, discrimination on the basis of ethnicity or gender seems to be becoming more and more acceptable in Western societies. Harvard is fighting a court case brought by Asian students accusing the Ivy League university of systematic discrimination on the basis of race. Melbourne University recently advertised for a Professor of Mathematics, stating that the School would “only consider applications from suitably qualified female candidates.” A senior physicist presenting empirical data on the hypothesis that female academics now tend to be appointed to senior positions with lower publication and citation counts—suggesting systemic anti-male discrimination—was recently suspended from his job at CERN in Switzerland. Clearly, discriminating against individuals for any reason other than their work capabilities is bad for productivity, in business as well as in the research enterprise.

Thirdly, the innovation incubators of our world, the universities and their counterparts in business, are increasingly subject to a societal climate that is hostile to free speech, viewpoint diversity and open inquiry. Karl Popper, the foremost philosopher of science, once stated that “the growth of knowledge depends entirely upon disagreement,” yet disagreement with a growing number of orthodoxies is becoming increasingly dangerous. The reaction to Professor Alessandro Strumia’s talk at CERN is an example of how the presentation of arcane empirical data can result in what is colloquially known as a ‘shit storm’. James Damore’s firing from Google is another example. Clearly, it is easy to mortally offend people by reporting bibliometric research today—something I would have considered absurdly, ludicrously unlikely in the past.

In fact, this trend is now most certainly affecting scholarly activity internationally. In his essay ‘The Institutionalisation of Social Justice,’ Uri Harris recently described several cases of activists affecting the publication or continued availability of research papers. Research that upsets feminist, LGBTIQ or ‘Black Lives Matter’ activists is high-risk for academics: witness the backlash to Theodore Hill’s paper on the “Greater Male Variability Hypothesis,” what happened to Lisa Littman’s research on Rapid Onset Gender Dysphoria, or Bruce Gilley’s publication on colonialism. It doesn’t matter how obscure the journal or how technical the paper: self-appointed guardians of morality feel entitled to suppress opinions or research findings they do not agree with. In 2018, social media can actually influence what we can do in our operating theatres. We end up doing medicine that’s increasingly ‘Facebook-based’ rather than evidence-based.

The end result of all these mutually reinforcing trends is a very substantial ongoing reduction in research productivity: less and less bang for more and more buck. And this comes at the very worst possible time for the continuing development of our civilization.

Tyler Cowen, an American economist, has examined technological progress as the main driver of economic growth and identified the increasing scarcity of true innovation as the main cause of ‘the Great Stagnation,’ the slowing of economic growth in developed countries since the early ’70s. He claims that we have picked the ‘low-hanging apples’ of the industrial revolution, and the catch-up growth of developing countries will slow soon enough as well, once they’ve picked all those low-hanging apples such as universal education, mass transportation, and gender equality. Cowen argues that we may have to get used to a prolonged period of slow growth; a time when the cake is simply not growing as much as it did in the past. That fundamental insight explains a lot about what’s been happening in Western societies over the last generation, including increasing conflict over the distribution of income. God only knows what will happen to our societies once people realize that the future may be rather poorer than the past trajectory of our civilization would lead us to expect. In fact, we’re probably seeing the effects of this shift already—identity politics may simply be a manifestation of increasingly aggressive conflict over resource distribution.

The societal factors affecting the research enterprise are becoming ever more damaging at the worst possible time, at a time when we have to reach higher and higher to pluck those apples still remaining on Tyler Cowen’s Tree of Technology. We need more, not less, research and development, because our societies have become so complex, so brittle, and increasingly unlikely to be able to weather sudden changes in their conditions. Collapse is becoming more and more likely, as Thomas Homer-Dixon has documented in The Upside of Down. Whatever it is: global warming, a pandemic, nuclear war, a computer virus or the next global financial crisis—we need less regulation, not more, and less complex, decentralized systems to weather the coming storm. We also need less discrimination, not more. We need more free speech, not activists and activist media telling us what is an acceptable point of view, and what’s sexist, racist, transphobic, imperialist, misogynistic, fatphobic, ableist, or whatever the latest linguistic abomination may be.

At the very worst possible time in the existence of this civilization, at a time when we’re running out of low-hanging apples to pick, we’re drowning ourselves in red tape and a new authoritarian orthodoxy. That’s very bad news, not just for the research enterprise, but for all of humanity.

1   HunterTits   2019 Feb 21, 9:37am

Technological advancement is happening at an accelerating pace. People tend to overestimate the pace of change in the short term and underestimate it in the long term, because of how humans perceive and experience time. That's all.

One reason why it seems change is 'slow' is because the pace of human adoption is slow.

Social and regulatory adoption of a tech can be much, much slower than actually rolling it out. This is why the eBike and eScooter folks are following Uber in just pushing into their markets without even trying to get permission first. And while that worked for Uber, local communities wised up to the tactic and clamped down fast on the eBikes. Likewise, there will be a lot more resistance to widespread self-driving car (SDC) adoption than tech utopians want to admit. It will hit in the form of a major public backlash. Coming to Anytown, USA near you!

And then there is just people in general adopting the tech at human pace of learning (yawn!). That takes time.

That's why the Singularity in my view really refers to things coming to a head, amazing tech 'busting out' and changing things so it is unrecognizable and totally chaotic to the existing way of doing things. With that definition, you will note that AGI is not required for that scenario to happen, btw.
2   MisdemeanorRebel   2019 Feb 21, 9:44am

cmdrdataleak says
A particularly depressing aspect of this downturn in research productivity was recently analysed by Patrick Collison and Michael Nielsen in the pages of The Atlantic. They show that, despite an enormous increase in research activity over the last 50 years, there has been a marked decrease in return on investment: we are getting less bang for our buck. Major innovation is becoming increasingly scarce in the natural sciences, and most Nobel Prizes in the sciences are handed out for research that was done decades ago.

Don't agree. The fundamentals for the internet were laid out in the late 50s-60s. But it took ARPANET, widespread adoption of the microcomputer, and finally academics developing a sharing tool for it to explode. It wasn't all that long: from vacuum-tube computers and rudimentary telecom to smartphones with wifi internet and a huge worldwide web happened in the lifetime of one generation.

The problem with space is that it went from a fixed mission race for prestige to a MIC permahandout with no end goal, other than dicking around in LEO and sending out the occasional probe/rover. We're seeing progress resume because private industry is bidding on commercial satellites, which were retarded by the boondoggle shuttle program, the successor to the Saturn series.

That's why the SLS, a modification of the late-70s shuttle booster design, has cost $10B+ over a decade without flying a single test mission, while SpaceX has developed a booster-reusable family of rockets for 1/10th that cost in about as much time.
3   Tenpoundbass   2019 Feb 21, 9:45am

When I was a kid technological advances were happening so fast, we were expected to have Hover Cars by now.

Elon Musk is bankrupting himself trying to keep the illusion that we are now 100 years into the future going.
4   MisdemeanorRebel   2019 Feb 21, 9:55am


Another hypothesis I heard for the gap between theory and practical application is Risk Aversion. In the early days of aviation, there were 100s of enterprises flying 100s of plane models involving 1000s of deaths. The high rate of failure and analysis enabled aviation to make big leaps, so that in just a few decades, regular and routine air travel was commercially viable. NASA insists on 99% survivability and hates all failures - another reason for the success of SpaceX (they're willing to continually test and improve, regardless of failures).

I'd guess:
* Risk Aversion
* Paying for R&D, rather than demanding a competitive test of demo'd prototypes (ie the F-35 vs. the previous F-Series since WW2)
* Demands for positive conclusions rather than negative conclusions (ie "X isn't a factor" is humdrum... "X is a factor" is rewarded with more money. But eliminating dead ends is as valuable as finding new leads).
5   rocketjoe79   2019 Feb 21, 10:25am

The Space Shuttle had a risk of catastrophic failure of about 10% per flight in its early years.
"A retrospective risk analysis by the Space Shuttle Safety and Mission Assurance Office finds that the first nine shuttle flights had a 1 in 9 chance of catastrophic failure — 10 times the risk of flights today."

NASA is asking for the new Commercial Crewed flights (SpaceX, ULA) to have a Loss of Crew Risk of 1 in 270.

But since no one can estimate the risk from space junk, they'll have to give everyone a waiver. March on, science.
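Those per-flight numbers compound fast over a programme. A quick sketch of the arithmetic, assuming independent flights with a constant per-flight risk (a simplification; real risk varied flight to flight):

```python
def cumulative_failure(per_flight_risk: float, flights: int) -> float:
    """Probability of at least one catastrophic failure across a series of
    flights, assuming each flight is independent with constant risk."""
    return 1 - (1 - per_flight_risk) ** flights

# Retrospective 1-in-9 risk applied across the first nine shuttle flights:
print(f"{cumulative_failure(1/9, 9):.0%}")    # ~65% chance of losing a vehicle

# NASA's 1-in-270 commercial crew target over the same nine flights:
print(f"{cumulative_failure(1/270, 9):.1%}")  # ~3.3%
```

The point: a 1-in-9 per-flight risk means a roughly two-in-three chance of disaster across just nine flights, which is why the 1-in-270 target is such a different regime.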
6   Heraclitusstudent   2019 Feb 21, 11:38am

People always mistake a clear view for a short distance.

We just went through 3 wrenching decades of changes on the information technology front, and people in the 60s who imagined bases on the moon and fusion power couldn't dream of the stuff we have in a smartphone.
And it's not over yet. Automation will not stop; it will reach farther and farther into all domains.

The next 10 yrs will see similar astounding changes coming from the biotechnology sector. It takes a critical mass before a sector explodes, and biotechnology is reaching that point: it has a much better understanding of the underlying mechanisms, much better and cheaper tools, and a huge mass of information in the form of sequenced DNA. DNA in biotech is much like software in computers, except DNA builds its own hardware and is thereby incredibly more scalable. We will see new industrial revolutions that we can't even imagine yet. Cheap biofuels produced directly from sunlight. Drugs produced by cultured cells. Artificial meat. Ultralight, resistant materials. Life-span improvements, and so on.

And space exploration is probably going to build on all this as well. A base on the moon or on Mars may not be as far as many imagine.
7   HeadSet   2019 Feb 22, 2:37pm

couldn't dream of the stuff we have in a smartphone.

Yep, we beat the soup out of the Star Trek "communicator" but nothing close to the "2001 Space Odyssey" space travel or even the Astro-Boy flying cars and android robots of the Year 2000. Not even a ray gun.
8   FortWayneAsNancyPelosiHaircut   2019 Feb 22, 2:49pm

HeadSet says
couldn't dream of the stuff we have in a smartphone.

Yep, we beat the soup out of the Star Trek "communicator" but nothing close to the "2001 Space Odyssey" space travel or even the Astro-Boy flying cars and android robots of the Year 2000. Not even a ray gun.

That kind of tech requires a scientific breakthrough. Something like when we went from rotational force to jet propulsion, but much bigger.

The cab ride in The Fifth Element was hella futuristic, but it's still far away.
9   Rb6d   2019 Feb 22, 3:00pm

APOCALYPSEFUCKisShostikovitch says
Amazon to datarape you

Luckily, Bezos got dataraped himself
10   Rb6d   2019 Feb 22, 3:02pm

Heraclitusstudent says
The next 10 yrs will see similar astounding changes coming from the biotechnology sector.

It's already happening in biotech and biochemistry, but we surely will see much, much more. Even some stage 4 cancers can now be treated with immune therapies. Perhaps in 20 yrs nearly all cancers could be cured.
11   FortWayneAsNancyPelosiHaircut   2019 Feb 22, 3:10pm

Rb6d says
Heraclitusstudent says
The next 10 yrs will see similar astounding changes coming from the biotechnology sector.

It's already happening in biotech and biochemistry, but we surely will see much, much more. Even some stage 4 cancers can now be treated with immune therapies. Perhaps in 20 yrs nearly all cancers could be cured.

hope so
12   MisdemeanorRebel   2019 Feb 22, 6:43pm

rocketjoe79 says
"A retrospective risk analysis by the Space Shuttle Safety and Mission Assurance Office finds that the first nine shuttle flights had a 1 in 9 chance of catastrophic failure — 10 times the risk of flights today."

In the original study, Oppenheimer himself wrote that one flight of the shuttle was fairly safe, but the devil was in future reuse. He was absolutely right. There were only 130-something shuttle trips, with two failures. However, the Shuttle Program was purchased with the guarantee of a 2- to 3-month turnaround per Shuttle (each Shuttle available for 4-6 flights per year), which never happened.
13   MisdemeanorRebel   2019 Feb 22, 6:45pm

Heraclitusstudent says
We just went through 3 wrenching decades of changes on the information technology front, and people in the 60s who imagined bases on the moon and fusion power couldn't dream of the stuff we have in a smartphone.

Exactly. It was the telecom stuff nobody thought about anywhere near as much as fusion or space travel or automated cars, yet it was telecom and microcomputing that transformed the world.

Similarly, it's the DNA revolution of the 70s that will, I agree, produce the most benefit over the next 20 years.

We literally have the Dick Tracy watches. Hell, we have wireless appliances, from cameras to fridges to HVAC.
15   anonymous   2019 Mar 1, 5:59am

In Tech Race With China, U.S. Universities May Lose a Vital Edge - They get the most patents - but not in key areas like AI, 5G

Researchers say Trump’s order on AI needs financial follow-up (not long winded self congratulating rhetoric from Potus)

The U.S. is still out in front of global rivals when it comes to innovation, but American universities, where new ideas often percolate, have reason to look over their shoulder.

That’s especially true for technologies like 5G phone networks and artificial intelligence. They’re exactly the fields where President Donald Trump recently insisted the U.S. has to lead, and also the ones where Asia, especially China, has caught up.

Universities from China, Korea and Taiwan get more patents than their U.S. peers in wireless communications, according to research firm GreyB Services. In AI, 17 of the top 20 universities and public research organizations are in China, with the Chinese Academy of Sciences topping the list, says the World Intellectual Property Organization in Geneva. Overall, American universities still dominate the patent rankings, led by the University of California and Massachusetts Institute of Technology.

There’s a special place for universities in the ecosystem of research.

Corporate labs tend to focus on what they’re fairly sure will be profitable, while their government equivalents put national security first. Universities groom future scientists and can be incubators for pie-in-the-sky ideas, some of which turn out to be game-changers. The list ranges from Google’s search engine to DNA technology that’s behind a whole industry of gene-manipulating treatments. Plus the Honeycrisp apple.

Government grants to universities have been stagnant for more than a decade, meaning they’ve declined in real terms and as a share of the economy.

“If you look at the federal dollars, they’ve not really changed substantially,” says Stephen Susalka, head of AUTM, a technology transfer association whose members include 800 universities. “Other countries are catching up. We can’t sit on our laurels.”

Federal funding of $40 billion for university research in fiscal 2017 was slightly below its peak six years earlier. The colleges spend about $75 billion a year altogether, with the balance largely coming from their own funds. The government’s share has slipped below 60 percent, from almost 70 percent in 2004.

Trump has proclaimed AI and 5G to be high priorities, but hasn’t pitched Congress for more money. In fact, last year the administration called for deep cuts in research funding, including an 11 percent hit to the National Science Foundation.

Congress balked and instead passed the biggest increase for a decade, bringing the total to more than $176 billion. How much will go to universities remains unclear, because the grant process was interrupted by a 35-day government shutdown.

More than half of federal cash for university research comes from the Department of Health and Human Services. That’s reflected in the types of patents U.S. universities are getting. Almost three quarters are in the life sciences, compared with less than half at Asian universities, according to intellectual property-management software firm Anaqua.

‘Should Be More’

Grants for IT research tend to originate at the Pentagon and the NSF, which each contribute about 13 percent of university funding.

“That’s not nothing,’’ says Doug Brake at the Information Technology & Innovation Foundation in Washington. “But I’d argue there should be more,’’ he says. “It pales in comparison to the type of support the Chinese engage in.’’

Comparisons are tricky, but by some measures China’s spending on research and development now rivals America’s.

The Chinese government also is investing in nearly 100 U.S. universities, which have Confucius Institutes to promote Chinese language and culture. None of those schools receive direct U.S. federal funding, according to the Government Accountability Office.

Another way to look at innovation in the world’s two biggest economies, and how they commercialize it, is to study payments made for the use of intellectual property – like patents, trademarks or copyrights. Here again, the U.S. still holds a lead but China is advancing.

The American way of bringing universities into this process has been widely emulated. A 1980 law allows them to keep patents that stem from government-funded research. Universities received more than $3 billion in gross licensing income in 2017, according to AUTM. They filed more than 15,000 patent applications, and helped create 1,080 start-ups.

Walter Copan, head of the National Institute of Standards and Technology, says the system is being updated so that research can be delivered more efficiently into the hands of business.

The government’s job “is to invest in these high-risk exploratory areas,’’ he says. “This is of critical importance to U.S. competitiveness.’’

‘Great to See’

That’s pretty much what Trump concluded about AI in his executive order last month. China isn’t identified by name in the 2,700-word document. But the references to maintaining America’s “economic and national security,’’ and protecting its tech from “attempted acquisition by strategic competitors,’’ point clearly in that direction.

Money doesn’t get much of a mention, though. At the University of California in Berkeley, there’s significantly more high-quality work going on than there is cash available, says Randy Katz, the vice chancellor for research. Only about one in five proposals deemed to have merit ends up getting funded.

That’s why Trump’s AI order needs some practical follow-up, says Katz. “It’s great to see it’s a national priority,” he says. “It’s up to Congress to see how much money is going to be spent.’’

Top 5 Universities Working in Wireless Communication:

Rank   University   Country   Patents Filed
1      Electronics and Telecommunications Research Institute (ETRI)   South Korea   2,692
2      Beijing University   China   891
3      Industrial Technology Research Institute   Taiwan   704
4      Chongqing University   China   694
5      Xidian University   China   635


16   anonymous   2019 Mar 10, 3:16pm

A new 2-D material uses light to quickly and safely purify water

In tests, it killed 99.9999 percent of the bacteria in contaminated water

Using light, a prototype “green” material can purify enough daily drinking water for four people in just one hour. In tests, it killed nearly 100 percent of bacteria in 10 liters of water, researchers report February 7 in Chem.

This new material, a 2-D sheet of graphitic carbon nitride, is a photocatalyst: it releases electrons when illuminated to create destructive oxygen-based chemicals that destroy microbes. The design avoids pitfalls of other similar technology. Today’s most effective photocatalysts contain metals that can leach into water as toxic pollutants. Others are non-metallic, like the new 2-D sheets, but are less efficient because they hold onto electrons more tightly.

Materials scientist Guoxiu Wang of the University of Technology Sydney and colleagues created ultrathin sheets of graphitic carbon nitride and added chemical groups like acids and ketones that lure electrons toward the sheets’ edges. There, the electrons jump onto oxygen atoms in water to form such microbe-dissolving oxygen chemicals as hydrogen peroxide.

The design killed 99.9999 percent of bacteria, including E. coli, in a 50-milliliter water sample. That’s as efficient as the best metal-based catalyst. And it killed microbes more quickly than previous best metal-free photocatalysts, which take over an hour to achieve what the new design did in 30 minutes.
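Disinfection figures like these are conventionally quoted as log reductions: a 99.9999 percent kill is a "6-log" reduction. A minimal conversion sketch of that arithmetic:

```python
import math

def log_reduction(kill_fraction: float) -> float:
    """Convert a fractional kill rate into the log10 reduction figure
    used in disinfection literature (0.999999 -> 6-log)."""
    return -math.log10(1 - kill_fraction)

print(round(log_reduction(0.999999), 3))  # 6.0  (the 99.9999% result above)
print(round(log_reduction(0.999), 3))     # 3.0  (a far weaker 99.9% kill)
```

Note how each extra "9" in the percentage is another full order of magnitude of disinfection, which is why 99.9999 percent is a much stronger claim than it may look.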

The team then attached the nanosheets to the inside surface of plastic bags, purifying 10 liters of water in an hour.

“Our motive was to develop an efficient way to use sunlight to produce water for undeveloped or remote regions without a central supply of clean water,” Wang says, noting that the carbon and nitrogen composition should make the material inexpensive. The researchers next aim to work with engineers to scale up the design for commercial use.


Note: Can't get the link to work but clicking on the highlighted section that is in the original link will take you to the next site with technical details.

Related Reading: Edge-Functionalized g-C3N4 Nanosheets as a Highly Efficient Metal-free Photocatalyst for Safe Drinking Water

17   anonymous   2019 Mar 20, 5:13am

Technology: From Copycats to Innovators

In its earliest years, the United States took ideas from Britain, the birthplace of the Industrial Revolution and the keeper of those manufacturing secrets that had fueled its powerhouse textile industry. By 1872, America’s economy had surpassed that of Britain, and it soon became the world’s most creative industrial power, with eager scouts from other shores trekking to its factories. By 1918, America’s economy was larger than that of Britain, France, and Germany combined.

China, in the small handful of decades since it embraced capitalism, has aggressively sought to acquire intellectual property (IP) from the rest of the world. Witness just one unhappy Wall Street Journal headline: “How China Systematically Pries Technology From U.S. Companies: Beijing leans on an array of levers to extract intellectual property—sometimes coercively.”

Some commentators have accused China of stealing intellectual property, and characterized theirs as a copycat culture not well suited for innovation. In 2011, the journalist Alexandra Harney wrote that:

The harder Beijing pushes its companies and scientists to come up with new ideas, the more they seem to copy the work of others. . . . In a nation with such breakneck economic growth and an overburdened judicial system the dishonest frequently win. The system to protect the honest simply isn’t robust enough. Dishonest copiers move quickly to secure an advantage in a rapidly growing market, and their success, in turn, perpetuates China’s copycat culture.

In 2016, Peter Guy, a financial writer and former international banker, argued that “copying and reverse engineering [in China] accelerated new product launches, but eroded China’s competitiveness. Stealing intellectual property has enormously benefited Chinese companies. But, it has crippled their ability to develop the next version or innovate. Just look at almost every important technological innovation in hardware, software, and the internet. Without sounding arrogant or imperialistic, they are with few exceptions made in America.”

China has assiduously and relentlessly copied others. In fact, in joint ventures that U.S. companies have entered into with Chinese companies, and in investments Chinese companies have made in U.S. and other foreign companies, the Chinese have often made the turnover of IP to China a requirement.

Yet China is doing far more than copy, steal, extract, and imitate. It is hell bent on taking over the world’s leadership in innovation, and is well on track to accomplish this. China’s President Xi Jinping has made indigenous innovation a national priority, and technology innovation and advancement are integral to China’s current five-year plan. In its 2006 National Medium-and Long-Term Program for Science and Technology Development, and in the speeches of its government leaders, China has made clear its goal of becoming an “‘innovative nation’ by 2020, an international leader in innovation by 2030, and a world powerhouse of scientific and technological innovation by 2050.” The business press is beginning to note this shift, as with CNN reporter Matt Rivers’s 2018 article, “Inside China’s Silicon Valley: From copycats to innovation.”

More: https://democracyjournal.org/magazine/52/technology-from-copycats-to-innovators/
