Tuesday, May 7, 2024

Counting time on the world's catastrophe clocks – the outlook is not good for now or for the future. Individual action is required while humans still can act

Time is the continued sequence of existence and events that occurs in an apparently irreversible succession from the past, through the present, and into the future. It is a component quantity of various measurements used to sequence events, to compare the duration of events or the intervals between them, and to quantify rates of change of quantities in material reality or in the conscious experience. Time is often referred to as a fourth dimension, along with three spatial dimensions.

Time in physics is operationally defined as "what a clock reads". This operational definition of time, wherein one says that observing a certain number of repetitions of one or another standard cyclical event constitutes one standard unit, such as the second, is useful in the conduct of both advanced experiments and everyday affairs of life. There are many systems for determining what time it is. Periodic events and periodic motion have long served as standards for units of time. Examples include the apparent motion of the sun across the sky, the phases of the moon, and the passage of a free-swinging pendulum. More modern systems include the Global Positioning System, other satellite systems, Coordinated Universal Time, and mean solar time. In general, the numbers obtained from different time systems differ from one another, but with careful measurements, they can be synchronized.

A clock or chronometer is a device that measures and displays time. The clock is one of the oldest human inventions, meeting the need to measure intervals of time shorter than the natural units such as the day, the lunar month, and the year. Devices operating on several physical processes have been used over the millennia. Clocks can be classified by the type of time display, as well as by the method of timekeeping.

Clocks have different ways of displaying the time. Analog clocks indicate time with a traditional clock face and moving hands. Digital clocks display a numeric representation of time. Two numbering systems are in use: 12-hour time notation and 24-hour notation. Most digital clocks use electronic mechanisms and LCD, LED, or VFD displays. For the blind and for use over telephones, speaking clocks state the time audibly in words. There are also clocks for the blind that have displays that can be read by touch.

Specific types of clocks

Doomsday Clock

The Doomsday Clock is a symbol that represents the likelihood of a human-made global catastrophe, in the opinion of the members of the Bulletin of the Atomic Scientists. Maintained since 1947, the clock is a metaphor, not a prediction, for threats to humanity from unchecked scientific and technological advances. That is, the time on the clock is not to be interpreted as actual time. A hypothetical global catastrophe is represented by midnight on the clock, with the Bulletin's opinion on how close the world is to one represented by a certain number of minutes or seconds to midnight, which is then assessed in January of each year. The main factors influencing the clock are nuclear warfare, climate change, and artificial intelligence. The Bulletin's Science and Security Board monitors new developments in the life sciences and technology that could inflict irrevocable harm to humanity.

The clock's original setting in 1947 was 7 minutes to midnight. It has since been set backward 8 times and forward 17 times. The farthest time from midnight was 17 minutes, in 1991, and the nearest has been 90 seconds, set in January 2023.

The clock was moved to 150 seconds (2 minutes, 30 seconds) in 2017, then forward to 2 minutes to midnight in January 2018, and left unchanged in 2019. In January 2020, it was moved forward to 100 seconds (1 minute, 40 seconds) before midnight. In January 2023, the Clock was moved forward to 90 seconds (1 minute, 30 seconds) before midnight, and it remained unchanged in January 2024.
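The conversions above (for example, 150 seconds expressed as 2 minutes, 30 seconds) follow from simple integer division. A minimal sketch in Python, using only the settings quoted in this post:

    # Convert "seconds to midnight" into minutes and seconds.
    # The (year, seconds) pairs below are the settings mentioned above.
    settings = {2017: 150, 2018: 120, 2020: 100, 2023: 90, 2024: 90}

    for year, seconds in settings.items():
        minutes, remainder = divmod(seconds, 60)
        print(f"{year}: {seconds} s to midnight = {minutes} min {remainder} s")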

Basis for settings

"Midnight" has a deeper meaning besides the constant threat of war. There are various elements taken into consideration when the scientists from The Bulletin of the Atomic Scientists decide what Midnight and "global catastrophe" really mean in a particular year. They might include "politics, energy, weapons, diplomacy, and climate science"; potential sources of threat include nuclear threats, climate change, bioterrorism, and artificial intelligence. Members of the board judge Midnight by discussing how close they think humanity is to the end of civilization. In 1947, at the beginning of the Cold War, the Clock was started at seven minutes to midnight.

Fluctuations and threats

Before January 2020, the two tied-for-lowest points for the Doomsday Clock were in 1953 (when the Clock was set to two minutes until midnight, after the U.S. and the Soviet Union began testing hydrogen bombs) and in 2018, following the failure of world leaders to address tensions relating to nuclear weapons and climate change issues. In other years, the Clock's time has fluctuated from 17 minutes in 1991 to 2 minutes 30 seconds in 2017. Discussing the change to 2 minutes 30 seconds in 2017, the first use of a fraction of a minute in the Clock's history, Lawrence Krauss, one of the scientists from the Bulletin, warned that political leaders must make decisions based on facts, and those facts "must be taken into account if the future of humanity is to be preserved". In an announcement about the status of the Clock, the Bulletin went as far as to call for action from "wise" public officials and "wise" citizens to steer human life away from catastrophe while humans still can.

On January 24, 2018, scientists moved the clock to two minutes to midnight, based on threats greatest in the nuclear realm. The scientists said, of recent moves by North Korea under Kim Jong-un and the administration of Donald Trump in the U.S.: "Hyperbolic rhetoric and provocative actions by both sides have increased the possibility of nuclear war by accident or miscalculation".


The clock was left unchanged in 2019 due to the twin threats of nuclear weapons and climate change, and the problem of those threats being "exacerbated this past year by the increased use of information warfare to undermine democracy around the world, amplifying risk from these and other threats and putting the future of civilization in extraordinary danger".

On January 23, 2020, the Clock was moved to 100 seconds (1 minute, 40 seconds) before midnight. The Bulletin's executive chairman, Jerry Brown, said "the dangerous rivalry and hostility among the superpowers increases the likelihood of nuclear blunder... Climate change just compounds the crisis". The "100 seconds to midnight" setting remained unchanged in 2021 and 2022.

On January 24, 2023, the Clock was moved to 90 seconds (1 minute, 30 seconds) before midnight, the closest it has ever been set to midnight since its inception in 1947. This adjustment was largely attributed to the risk of nuclear escalation that arose from the Russian invasion of Ukraine. Other reasons cited included climate change, biological threats such as COVID-19, and risks associated with disinformation and disruptive technologies.



Climate Clock

The Climate Clock is a graphic that shows how quickly the planet is approaching 1.5 °C of global warming, given current emissions trends. It also shows the amount of CO2 already emitted and the global warming to date.

The Climate Clock was launched in 2015 to provide a measuring stick against which viewers can track climate change mitigation progress. The date shown for when humanity reaches 1.5 °C moves closer as emissions rise and further away as emissions decrease. An alternative view projects the time remaining to 2.0 °C of warming. The clock is updated every year to reflect the latest global CO2 emissions trend and rate of warming. On September 20, 2021, the deadline was pushed back to July 28, 2028, a change attributed in part to the COP26 conference and to land protection by indigenous peoples. As of April 2, 2024, the clock counts down to July 21, 2029, at 12:00 PM.
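A countdown of this kind can be reproduced from two inputs: the carbon budget remaining before 1.5 °C is locked in and the current emissions rate. The sketch below uses illustrative, assumed round numbers (roughly 250 Gt CO2 remaining and 40 Gt CO2 per year), not the Climate Clock's own published inputs:

    from datetime import date, timedelta

    # Illustrative, assumed inputs -- not the Climate Clock's official figures.
    remaining_budget_gt = 250.0         # Gt CO2 still emittable before ~1.5 C
    emissions_rate_gt_per_year = 40.0   # current global emissions, Gt CO2 per year

    years_left = remaining_budget_gt / emissions_rate_gt_per_year
    deadline = date.today() + timedelta(days=years_left * 365.25)
    print(f"At current emissions, the budget runs out around {deadline.isoformat()}")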

The clock is hosted by the Human Impact Lab, part of Concordia University. Organisations supporting the Climate Clock include Concordia University, the David Suzuki Foundation, Future Earth, and the Climate Reality Project.

As of April 29, 2024, the clock showed warming to date of 1.297 °C.

Relevance

1.5 °C is an important threshold for many climate impacts, as shown by the Special Report on Global Warming of 1.5 °C. Every increment to global temperature is expected to increase weather extremes, such as heat waves and extreme precipitation events. There is also the risk of irreversible ice sheet loss. Consequent sea level rise also increases sharply around 1.75 °C, and virtually all corals could be wiped out at 2 °C warming.

The New York Climate Clock

In late September 2020, artists and activists Gan Golan, Katie Peyton Hofstadter, Adrian Carpenter, and Andrew Boyd repurposed the Metronome in Union Square in New York City to show the Climate Clock. The goal was to "remind the world every day just how perilously close we are to the brink." This is in juxtaposition to the Doomsday Clock, which measures a variety of factors that could lead to "destroying the world" using "dangerous technologies of our making," with climate change being one of the smaller factors. This specific installation is expected to be one of many in cities around the world. At the time of installation, the clock read 7 years and 102 days. Greta Thunberg, the Swedish environmental activist, was involved in the project early on and reportedly received a hand-held version of the climate clock.

Since its inception, the New York Climate Clock has added a second set of numbers for the percentage of the world's energy use that comes from renewable energy sources.

Monday, April 15, 2024

Radiocaesium fallout released into the environment is a real culprit behind rising ocean temperatures

Caesium (IUPAC spelling; cesium in American English) is a chemical element; it has symbol Cs and atomic number 55. It is a soft, silvery-golden alkali metal with a melting point of 28.5 °C (83.3 °F; 301.6 K), which makes it one of only five elemental metals that are liquid at or near room temperature. Caesium has physical and chemical properties similar to those of rubidium and potassium. It is pyrophoric and reacts with water even at −116 °C (−177 °F). It is the least electronegative element, with a value of 0.79 on the Pauling scale. It has only one stable isotope, caesium-133. Caesium is mined mostly from pollucite.

Caesium-137, a fission product, is extracted from waste produced by nuclear reactors. Caesium has the largest atomic radius of all elements whose radii have been measured or calculated, at about 260 picometers, and its melting point of 28.5 °C (83.3 °F) makes it one of the few elemental metals that are liquid near room temperature. Mercury is the only stable elemental metal with a known melting point lower than caesium's.

Caesium-137 (137Cs), cesium-137 (US), or radiocaesium, is a radioactive isotope of caesium that is formed as one of the more common fission products by the nuclear fission of uranium-235 and other fissionable isotopes in nuclear reactors and nuclear weapons. Trace quantities also originate from spontaneous fission of uranium-238. It is among the most problematic of the short-to-medium-lifetime fission products. Caesium-137 has a relatively low boiling point of 671 °C (1,240 °F) and easily becomes volatile when released suddenly at high temperature, as in the case of the Chernobyl nuclear accident and atomic explosions, and can travel very long distances in the air. After being deposited onto the soil as radioactive fallout, it moves and spreads easily in the environment because of the high water solubility of caesium's most common chemical compounds, which are salts.

Caesium-137 reacts with water, producing a water-soluble compound (caesium hydroxide). The biological behavior of caesium is similar to that of potassium and rubidium.
Caesium-137, along with the other radioactive isotopes caesium-134, iodine-131, xenon-133, and strontium-90, was released into the environment during nearly all nuclear weapon tests and some nuclear accidents, most notably the Chernobyl disaster and the Fukushima Daiichi disaster.

Caesium-137 in the environment is substantially anthropogenic (human-made); as a bellwether isotope, it is produced solely from anthropogenic sources. Caesium-137 is produced by the nuclear fission of plutonium and uranium, and it decays into barium-137.

Nuclear isotope and safety hazards
Caesium-137 is a radioisotope commonly used as a gamma-emitter in industrial applications. Its advantages include a half-life of roughly 30 years, its availability from the nuclear fuel cycle, and having 137Ba as a stable end product. It has been used in agriculture, cancer treatment, and the sterilization of food, sewage sludge, and surgical equipment. Radioactive isotopes of caesium in radiation devices were used in the medical field to treat certain types of cancer, but the emergence of better alternatives and the use of water-soluble caesium chloride in the sources, which could create wide-ranging contamination, gradually put some of these caesium sources out of use. Caesium-137 has been employed in a variety of industrial measurement gauges, including moisture, density, leveling, and thickness gauges. It has also been used in well-logging devices for measuring the electron density of rock formations, which is analogous to the bulk density of the formations.
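Because the half-life is roughly 30 years, the fraction of a caesium-137 source that remains after a given time follows the usual exponential-decay law, N(t) = N0 · 0.5^(t / 30). A minimal sketch:

    HALF_LIFE_YEARS = 30.17  # approximate half-life of caesium-137

    def fraction_remaining(years):
        """Fraction of the original Cs-137 activity left after the given number of years."""
        return 0.5 ** (years / HALF_LIFE_YEARS)

    # Example: activity remaining 38 years after the 1986 Chernobyl release.
    print(f"{fraction_remaining(2024 - 1986):.1%} of the 1986 activity remains")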

The isotopes 134 and 137 are present in the biosphere in small amounts from human activities, differing by location. Radiocaesium does not accumulate in the body as readily as other fission products (such as radioiodine and radiostrontium). About 10% of absorbed radiocaesium washes out of the body relatively quickly in sweat and urine. The remaining 90% has a biological half-life between 50 and 150 days. Radiocaesium follows potassium and tends to accumulate in plant tissues, including fruits and vegetables. Plants vary widely in the absorption of caesium, sometimes displaying great resistance to it. It is also well-documented that mushrooms from contaminated forests accumulate radiocaesium (caesium-137) in the fungal sporocarps. Accumulation of caesium-137 in lakes has been a great concern after the Chernobyl disaster. Experiments with dogs showed that a single dose of 3.8 millicuries (140 MBq, 4.1 μg of caesium-137) per kilogram is lethal within three weeks; smaller amounts may cause infertility and cancer. The International Atomic Energy Agency and other sources have warned that radioactive materials, such as caesium-137, could be used in radiological dispersion devices, or "dirty bombs".
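Using the figures above (about 10% excreted quickly, the rest cleared with a biological half-life of roughly 50 to 150 days), the retained body burden over time can be sketched as follows; the 110-day value is only an assumed midpoint of that range:

    def body_burden_fraction(days, bio_half_life_days=110.0):
        """Approximate fraction of an ingested Cs-137 dose still in the body.

        Assumes ~10% is excreted quickly and the remaining ~90% is cleared
        exponentially with the given biological half-life (50-150 days per the
        text; 110 days is an assumed midpoint).
        """
        return 0.9 * 0.5 ** (days / bio_half_life_days)

    for d in (30, 110, 365):
        print(f"after {d:3d} days: ~{body_burden_fraction(d):.0%} of the dose retained")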

Fukushima Daiichi disaster
In April 2011, elevated levels of caesium-137 were also found in the environment after the Fukushima Daiichi nuclear disaster in Japan. In July 2011, meat from 11 cows shipped to Tokyo from Fukushima Prefecture was found to have 1,530 to 3,200 becquerels per kilogram of 137Cs, considerably exceeding the Japanese legal limit of 500 becquerels per kilogram at that time. In March 2013, a fish caught near the plant had a record 740,000 becquerels per kilogram of radioactive caesium, far above the 100 becquerels per kilogram government limit. A 2013 paper in Scientific Reports found that for a forest site 50 km from the stricken plant, 137Cs concentrations were high in leaf litter, fungi, and detritivores, but low in herbivores. By the end of 2014, "Fukushima-derived radiocaesium had spread into the whole western North Pacific Ocean", transported by the North Pacific current from Japan to the Gulf of Alaska. It has been measured in the surface layer down to 200 meters and, south of the current area, down to 400 meters.

Radioactive materials were dispersed into the atmosphere immediately after the disaster and account for most of the radioactive material released into the environment. According to a UNSCEAR report in 2020, 80% of the initial atmospheric release eventually deposited over rivers and the Pacific Ocean. Specifically, "the total releases to the atmosphere of Iodine-131 and Caesium-137 ranged generally between about 100 to about 500 PBq [petabecquerel, 10^15 Bq] and 6 to 20 PBq, respectively."

Once released into the atmosphere, radionuclides that remain in a gaseous phase are simply diluted by the atmosphere, but some precipitate and eventually settle on land or in the ocean. The majority (90–99%) of the deposited radionuclides are isotopes of iodine and caesium, with a small portion of tellurium; these elements are almost fully vaporized out of the core because of their volatility. The remaining fraction of deposited radionuclides consists of less volatile elements such as barium, antimony, and niobium, of which less than one percent evaporates from the fuel.

Approximately 40–80% of the atmospheric releases were deposited over the ocean.

In addition to atmospheric deposition, there was also a significant quantity of direct releases into groundwater (and eventually the ocean) through leaks of coolant that had been in direct contact with the fuel. Estimates for this release vary from 1 to 5.5 PBq. Although the majority had entered the ocean shortly following the accident, a significant fraction remains in the groundwater and continues to mix with coastal waters.

According to the French Institute for Radiological Protection and Nuclear Safety, the release from the accident represents the largest single oceanic emission of artificial radioactivity ever observed. The Fukushima coast has one of the world's strongest currents, the Kuroshio Current, which transported the contaminated waters far into the Pacific Ocean, dispersing the radioactivity. As of late 2011, measurements of both the seawater and the coastal sediments suggested that the consequences for marine life would be minor.

Significant pollution along the coast near the plant might persist because of the continuing arrival of radioactive material transported to the sea by surface water crossing contaminated soil. The possible presence of other radioactive substances, such as strontium-90 or plutonium, has not been sufficiently studied. Recent measurements show persistent contamination of some marine species (mostly fish) caught along the Fukushima coast.

Initial discharge
A large amount of caesium entered the sea from the initial atmospheric release. By 2013, the concentrations of caesium-137 in the Fukushima coastal waters were around the level before the accident. However, concentrations in coastal sediments declined more slowly than in coastal waters, and the amount of caesium-137 stored in sediments most likely exceeded that in the water column by 2020. The sediments may provide a long-term source of caesium-137 in the seawater.

Data on marine foods indicate that their radioactive concentrations are falling towards pre-accident levels. 41% of samples caught off the Fukushima coast in 2011 had caesium-137 concentrations above the legal limit (100 becquerels per kilogram); this had declined to 0.05% by 2015. The United States Food and Drug Administration stated in 2021 that the "FDA has no evidence that radionuclides from the Fukushima incident are present in the U.S. food supply at levels that are unsafe". Yet presenting the science alone has not helped consumers regain their trust in eating Fukushima fishery products.

2023 discharge
The most prevalent radionuclide in the wastewater is tritium. A total of 780 terabecquerels (TBq) will be released into the ocean at a rate of 22 TBq per year. Tritium is routinely released into the ocean from operating nuclear power plants, sometimes in much greater quantities; for comparison, the La Hague nuclear processing site in France released 11,400 TBq of tritium in 2018. In addition, about 60,000 TBq of tritium is produced naturally in the atmosphere each year by cosmic rays.
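The figures above imply a release schedule of roughly 35 years, and they can be compared directly with the other tritium sources mentioned. A small sketch using only the numbers from the text:

    # Numbers taken from the paragraph above (TBq = terabecquerel).
    total_release_tbq = 780.0          # planned total tritium release
    release_rate_tbq_per_year = 22.0   # planned annual release rate
    la_hague_2018_tbq = 11_400.0       # La Hague tritium release in 2018
    natural_production_tbq = 60_000.0  # natural tritium production per year

    print(f"Release duration: ~{total_release_tbq / release_rate_tbq_per_year:.0f} years")
    print(f"Annual rate vs La Hague 2018: {release_rate_tbq_per_year / la_hague_2018_tbq:.2%}")
    print(f"Annual rate vs natural production: {release_rate_tbq_per_year / natural_production_tbq:.3%}")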

Other radionuclides present in the wastewater, like caesium-137, are not normally released by nuclear power plants. However, the concentrations in the treated water are minuscule relative to regulation limits.

"There is consensus among scientists that the impact on health is minuscule, still, it can't be said the risk is zero, which is what causes controversy", Michiaki Kai, a Japanese nuclear expert, told AFP. David Bailey, a physicist whose lab measures radioactivity, said that with tritium at diluted concentrations, "there is no issue with marine species unless we see a severe decline in fish population".

Ferenc Dalnoki-Veress, a scientist-in-residence at the Middlebury Institute of International Studies at Monterey, said regarding dilution that bringing in living creatures makes the situation more complex. Robert Richmond, a biologist from the University of Hawaiʻi, told the BBC that the inadequate radiological and ecological assessment raises the concern that Japan would be unable to detect what enters the environment and "get the genie back in the bottle". Dalnoki-Veress, Richmond, and three other panelists consulting for the Pacific Islands Forum wrote that dilution may fail to account for bioaccumulation and exposure pathways that involve organically bound tritium (OBT).

Tuesday, March 5, 2024

Solar activity is not the cause of global warming; human activity is. Continued growth of greenhouse gases can significantly increase global temperatures

Patterns of solar irradiance and solar variation have been the main drivers of climate change over the millions to billions of years of the geologic time scale.

Evidence that this is the case comes from analysis on many timescales and from many sources, including direct observations, composites from baskets of different proxy observations, and numerical climate models. On millennial timescales, paleoclimate indicators have been compared to cosmogenic isotope abundances, as the latter are a proxy for solar activity. These have also been used on century timescales, but instrumental data (mainly telescopic observations of sunspots and thermometer measurements of air temperature) are increasingly available as well and show, for example, that temperature fluctuations do not match solar activity variations and that the commonly invoked association of the Little Ice Age with the Maunder minimum is far too simplistic: although solar variations may have played a minor role, a much bigger factor is known to be Little Ice Age volcanism. In recent decades, observations of unprecedented accuracy, sensitivity, and scope (of both solar activity and terrestrial climate) have become available from spacecraft and show unequivocally that recent global warming is not caused by changes in the Sun.

Since 1978, solar irradiance has been directly measured by satellites with very good accuracy. These measurements indicate that the Sun's total solar irradiance fluctuates by ±0.1% over the ~11 years of the solar cycle, but that its average value has been stable since the measurements started in 1978. Solar irradiance before the 1970s is estimated using proxy variables, such as tree rings, the number of sunspots, and the abundances of cosmogenic isotopes such as 10Be, all of which are calibrated to the post-1978 direct measurements.
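To put the ±0.1% figure in perspective, a change in total solar irradiance (TSI) translates into a globally averaged radiative forcing of roughly ΔF ≈ ΔTSI × (1 − α) / 4, where α ≈ 0.3 is Earth's albedo and the factor 4 accounts for the ratio of Earth's cross-section to its surface area. A quick sketch with typical textbook values (TSI ≈ 1361 W/m²):

    # Rough conversion of a TSI fluctuation to a global-mean radiative forcing.
    TSI = 1361.0   # total solar irradiance, W/m^2 (typical satellite-era value)
    ALBEDO = 0.3   # Earth's planetary albedo (approximate)

    delta_tsi = 0.001 * TSI                   # the ~0.1% solar-cycle fluctuation
    forcing = delta_tsi * (1 - ALBEDO) / 4.0  # spread over the sphere, minus reflection
    print(f"~0.1% TSI change -> ~{forcing:.2f} W/m^2 of forcing")
    # For comparison, anthropogenic forcing since pre-industrial times is a few W/m^2.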

Solar activity has been on a declining trend since the 1960s, as indicated by solar cycles 19–24 (the current cycle is solar cycle 25), in which the maximum numbers of sunspots were 201, 111, 165, 159, 121, and 82, respectively. In the three decades following 1978, the combination of solar and volcanic activity is estimated to have had a slight cooling influence. A 2010 study found that the composition of solar radiation might have changed slightly, with an increase in ultraviolet radiation and a decrease in other wavelengths.

Solar variation theory

The link between recent solar activity and climate has been quantified: solar activity is not a major driver of the warming that has occurred since early in the twentieth century. Human-induced forcings are needed to reproduce the late-20th-century warming.

A 1994 study by the US National Research Council concluded that TSI variations were the most likely cause of significant climate change in the pre-industrial era before significant human-generated carbon dioxide entered the atmosphere.

Scafetta and West correlated solar proxy data and lower tropospheric temperature for the preindustrial era, before significant anthropogenic greenhouse forcing, suggesting that TSI variations may have contributed 50% of the warming observed between 1900 and 2000 (although they conclude "our estimates about the solar effect on climate might be overestimated and should be considered as an upper limit.") If interpreted as a detection rather than an upper limit, this would contrast with global climate models predicting that solar forcing of climate through direct radiative forcing makes an insignificant contribution.

In 2000, Stott and others reported on the most comprehensive model simulations of 20th-century climate to that date. Their study looked at both "natural forcing agents" (solar variations and volcanic emissions) as well as "anthropogenic forcing" (greenhouse gases and sulphate aerosols). They found that "solar effects may have contributed significantly to the warming in the first half of the century although this result is dependent on the reconstruction of total solar irradiance that is used. In the latter half of the century, we find that anthropogenic increases in greenhouse gases are largely responsible for the observed warming, balanced by some cooling due to anthropogenic sulphate aerosols, with no evidence for significant solar effects." Stott's group found that combining these factors enabled them to closely simulate global temperature changes throughout the 20th century. They predicted that continued greenhouse gas emissions would cause additional future temperature increases "at a rate similar to that observed in recent decades". In addition, the study notes "uncertainties in historical forcing" — in other words, past natural forcing may still be having a delayed warming effect, most likely due to the oceans.

Stott's 2003 work largely revised his assessment, and found a significant solar contribution to recent warming, although still smaller (between 16 and 36%) than that of greenhouse gases.

A study in 2004 concluded that solar activity, as indicated by sunspot activity, affects the climate but plays only a small role in the current global warming.

Human activities drive anthropogenic global warming and temperature change

Greenhouse gases, such as CO2, methane, and nitrous oxide, heat the climate system by trapping infrared light. Volcanoes are also part of the extended carbon cycle. Since the industrial revolution, humanity has been adding to greenhouse gases by emitting CO2 from fossil fuel combustion, changing land use through deforestation, and further altering the climate with aerosols (particulate matter in the atmosphere) and the release of trace gases (e.g. nitrogen oxides, carbon monoxide, or methane). Other factors, including ozone depletion and animal husbandry (ruminant animals such as cattle produce methane), also play a role.

The US Geological Survey estimates that volcanic emissions are at a much lower level than the effects of current human activities, which generate 100–300 times the amount of carbon dioxide emitted by volcanoes. The annual amount put out by human activities may be greater than the amount released by supereruptions.
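The 100–300 times ratio can be checked with round numbers; the figures below are approximate, commonly cited estimates used here only as illustrative assumptions, not values stated in this post:

    # Approximate, commonly cited annual CO2 emissions (illustrative assumptions).
    human_gt_per_year = 37.0      # fossil fuels and industry, Gt CO2 per year (approx.)
    volcanic_gt_per_year = 0.26   # central estimate for volcanoes, Gt CO2 per year (approx.)

    ratio = human_gt_per_year / volcanic_gt_per_year
    print(f"Human emissions are roughly {ratio:.0f} times volcanic emissions")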

As a consequence of humans emitting greenhouse gases, global surface temperatures have started rising. Global warming is an aspect of modern climate change, a term that also includes the observed changes in precipitation, storm tracks, and cloudiness. As a consequence, glaciers worldwide have been found to be shrinking significantly. Land ice sheets in both Antarctica and Greenland have been losing mass since 2002 and have seen an acceleration of ice mass loss since 2009. Global sea levels have been rising as a consequence of thermal expansion and ice melt. The decline in Arctic sea ice, both in extent and thickness, over the last several decades is further evidence of rapid climate change.

Changes in global temperatures over the past century provide evidence for the effects of increasing greenhouse gases. When the climate system reacts to such changes, climate change follows. Measurement of the GST (global surface temperature) is one of the many lines of evidence supporting the scientific consensus on climate change, which is that humans are causing warming of Earth's climate system.

The climate system receives nearly all of its energy from the sun and radiates energy to outer space. The balance of incoming and outgoing energy and the passage of energy through the climate system is Earth's energy budget. When the incoming energy is greater than the outgoing energy, Earth's energy budget is positive and the climate system is warming. If more energy goes out, the energy budget is negative and Earth experiences cooling.
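The balance described above can be written as absorbed solar power equals emitted thermal power, S(1 − α)/4 = σT⁴, which gives Earth's effective radiating temperature. A minimal sketch with standard textbook values (the constants are assumptions, not figures from this post):

    # Effective radiating temperature from a zero-dimensional energy balance.
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    S = 1361.0         # solar constant, W/m^2
    ALBEDO = 0.3       # planetary albedo

    absorbed = S * (1 - ALBEDO) / 4.0   # global-mean absorbed solar flux
    T_eff = (absorbed / SIGMA) ** 0.25  # temperature at which emission balances it
    print(f"Absorbed flux: {absorbed:.0f} W/m^2, effective temperature: {T_eff:.0f} K")
    # ~255 K; the ~33 K gap to the observed ~288 K surface mean is the greenhouse
    # effect, which added greenhouse gases strengthen.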

Global warming affects all parts of Earth's climate system. Global surface temperatures have risen by 1.1 °C (2.0 °F). Scientists say they will rise further in the future. The changes in climate are not uniform across the Earth. In particular, most land areas have warmed faster than most ocean areas. The Arctic is warming faster than most other regions. Night-time temperatures have increased faster than daytime temperatures. The impact on nature and people depends on how much more the Earth warms.

Scientists use several methods to predict the effects of human-caused climate change. One is to investigate past natural changes in climate. To assess changes in Earth's past climate, scientists have studied tree rings, ice cores, corals, and ocean and lake sediments. These show that recent temperatures have surpassed anything in the last 2,000 years. By the end of the 21st century, temperatures may increase to a level last seen in the mid-Pliocene, around 3 million years ago, when mean global temperatures were about 2–4 °C (3.6–7.2 °F) warmer than pre-industrial temperatures. The modern rise in temperature and CO2 concentrations has been so rapid that even abrupt geophysical events in Earth's history do not approach current rates.

Land and oceans are rapidly affected by climate change

Climate change affects the physical environment, ecosystems, and human societies. Changes in the climate system include an overall warming trend, more extreme weather, and rising sea levels. These in turn impact nature and wildlife, as well as human settlements and societies. The effects of human-caused climate change are broad and far-reaching. This is especially so if there is no significant climate action. Experts sometimes describe the projected and observed negative impacts of climate change as the climate crisis.

There are many effects of climate change on oceans. These include an increase in ocean temperatures, a rise in sea level from ocean warming and ice sheet melting, increased ocean stratification, and changes to ocean currents, including a weakening of the Atlantic meridional overturning circulation. Carbon dioxide from the atmosphere is also acidifying the ocean.

Recent warming has had a big effect on natural biological systems. It has degraded land by raising temperatures, drying soils, and increasing wildfire risk. Species all over the world are migrating towards the poles to colder areas. On land, many species move to higher ground, whereas marine species seek colder water at greater depths. At 2 °C (3.6 °F) of warming, around 10% of species on land would become critically endangered.

Saturday, February 3, 2024

Rugged, lower-power, and more shock-resistant NAND flash-based SSDs are set to replace rotating-platter HDDs within a few years

A hard disk drive (HDD) is an electro-mechanical data storage device that stores and retrieves digital data using magnetic storage, with one or more rigid, rapidly rotating platters coated with magnetic material. The platters are made from a non-magnetic material, usually aluminum alloy, glass, or ceramic. The platters are paired with magnetic heads, usually arranged on a moving actuator arm, which read and write data to the platter surfaces. Data is accessed in a random-access manner, meaning that individual blocks of data can be stored and retrieved in any order. HDDs are a type of non-volatile storage, retaining stored data when powered off. Modern HDDs are typically in the form of a small rectangular box.

The two most common form factors for modern HDDs are 3.5-inch, for desktop computers, and 2.5-inch, primarily for laptops. HDDs are connected to systems by standard interface cables such as PATA (Parallel ATA), SATA (Serial ATA), USB or SAS (Serial Attached SCSI) cables, and Fibre Channel.

Introduced by IBM in 1956, HDDs were the dominant secondary storage device for general-purpose computers beginning in the early 1960s. HDDs maintained this position in the modern era of servers and personal computers, though personal computing devices produced in large volumes, like mobile phones and tablets, rely on flash memory storage devices.

In the 2000s and 2010s, NAND flash-based SSDs began supplanting HDDs in applications requiring portability or high performance. NAND performance is improving faster than HDD performance, and the range of applications for HDDs is eroding. The highest-capacity HDDs shipping commercially in 2022 were 26 TB, while the largest SSDs had a capacity of 100 TB. HDD unit shipments peaked at 651 million units in 2010 and have since declined to 166 million units in 2022.

Advantages of SSDs over traditional spinning platters


A solid-state drive (SSD) is a solid-state storage device that uses integrated circuit assemblies to store data persistently, typically using flash memory, and functions as secondary storage in the hierarchy of computer storage. It is also sometimes called a semiconductor storage device, a solid-state device, or a solid-state disk, even though SSDs lack the physical spinning disks and movable read-write heads used in hard disk drives (HDDs) and floppy disks. SSDs also have rich internal parallelism for data processing. Compared with HDDs, solid-state drives have higher data-transfer rates, higher areal storage density, somewhat better reliability, and much lower latency and access times.

Flash-based SSDs store data in metal–oxide–semiconductor (MOS) integrated circuit chips which contain non-volatile floating-gate memory cells. Flash memory-based solutions are typically packaged in standard disk drive form factors (1.8-, 2.5-, and 3.5-inch), but also in smaller more compact form factors, such as the M.2 form factor, made possible by the small size of flash memory.

The key components of an SSD are the controller and the memory to store the data. The primary memory component in an SSD was traditionally DRAM volatile memory, but since 2009, it has been more commonly NAND flash non-volatile memory. Every SSD includes a controller that incorporates the electronics that bridge the NAND memory components to the host computer. The controller is an embedded processor that executes firmware-level code and is one of the most important factors of SSD performance.

In comparison to hard disk drives and similar electromechanical media which use moving parts, SSDs are typically more resistant to physical shock, run silently, and have higher input/output rates and lower latency. SSDs based on NAND flash will slowly leak charge over time if left for long periods without power. This causes worn-out drives (that have exceeded their endurance rating) to start losing data typically after one year (if stored at 30 °C) to two years (at 25 °C) in storage; for new drives, it takes longer. Therefore, SSDs are not suitable for archival storage. SSDs have a limited lifetime number of writes and also slow down as they reach their full storage capacity.

Due to the extremely close spacing between the heads and the disk surface, HDDs are vulnerable to being damaged by a head crash – a failure of the disk in which the head scrapes across the platter surface, often grinding away the thin magnetic film and causing data loss. Head crashes can be caused by electronic failure, a sudden power failure, physical shock, contamination of the drive's internal enclosure, wear and tear, corrosion, or poorly manufactured platters and heads.

Most of the advantages of solid-state drives over traditional hard drives are due to their ability to access data completely electronically instead of electromechanically, resulting in superior transfer speeds and mechanical ruggedness.
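One practical way to see this distinction on a running system: Linux exposes whether a block device uses rotating media through sysfs. A small sketch, assuming a Linux host (device names such as sda are examples):

    from pathlib import Path

    # On Linux, /sys/block/<dev>/queue/rotational is "1" for spinning disks
    # and "0" for non-rotating devices such as SSDs and NVMe drives.
    for dev in sorted(Path("/sys/block").iterdir()):
        flag_file = dev / "queue" / "rotational"
        if flag_file.exists():
            rotational = flag_file.read_text().strip() == "1"
            kind = "HDD (rotational)" if rotational else "SSD / non-rotational"
            print(f"{dev.name}: {kind}")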

Flash memory as a replacement for hard drives

The size and shape of any device are largely driven by the size and shape of the components used to make that device. Traditional HDDs and optical drives are designed around the rotating platter(s) or optical disc along with the spindle motor inside. Since an SSD is made up of various interconnected integrated circuits (ICs) and an interface connector, its shape is no longer limited to the shape of rotating media drives. Some solid-state storage solutions come in a larger chassis that may even be a rack-mount form factor with numerous SSDs inside. They would all connect to a common bus inside the chassis and connect outside the box with a single connector. As of 2014, mSATA and M.2 form factors also gained popularity, primarily in laptops.

The M.2 form factor, formerly known as the Next Generation Form Factor (NGFF), is a natural transition from mSATA and the physical layout it used to a more usable and more advanced form factor. While mSATA took advantage of an existing form factor and connector, M.2 was designed to maximize usage of the card space while minimizing the footprint. The M.2 standard allows both SATA and PCI Express SSDs to be fitted onto M.2 modules. Such SSDs are designed to be installed permanently inside a computer.
 
Until about 2009, due to their generally prohibitive cost relative to HDDs, SSDs were mainly used in mission-critical applications where the speed of the storage system needed to be as high as possible. Since flash memory became a common component of SSDs, falling prices and increased densities have made SSDs cost-effective for many other applications. For instance, in distributed computing environments, SSDs can be used as the building block for a distributed cache layer that temporarily absorbs the large volume of user requests to the slower HDD-based backend storage system. This layer provides much higher bandwidth and lower latency than the backend storage system and can be managed in a number of forms, such as distributed key-value databases and distributed file systems. On supercomputers, this layer is typically referred to as a burst buffer. With this fast layer, users often experience shorter system response times. Organizations that can benefit from faster access to system data include equity trading companies, telecommunication corporations, and streaming media and video editing firms. The list of applications that could benefit from faster storage is vast.
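The caching idea described above reduces to a simple read-through pattern: look in the fast (SSD-backed) tier first and fall back to the slow (HDD-backed) store on a miss. A minimal, generic sketch; the class and function names are illustrative, not any particular system's API:

    from collections import OrderedDict

    class ReadThroughCache:
        """Tiny LRU read-through cache standing in for an SSD tier in front of HDDs."""

        def __init__(self, backend_read, capacity=1024):
            self.backend_read = backend_read   # slow path, e.g. a read from HDD-backed storage
            self.capacity = capacity
            self.entries = OrderedDict()       # key -> cached value, kept in LRU order

        def read(self, key):
            if key in self.entries:            # hit: serve from the fast tier
                self.entries.move_to_end(key)
                return self.entries[key]
            value = self.backend_read(key)     # miss: fall back to the slow tier
            self.entries[key] = value
            if len(self.entries) > self.capacity:
                self.entries.popitem(last=False)   # evict the least recently used entry
            return value

    # Usage (illustrative): cache = ReadThroughCache(lambda k: hdd_store[k], capacity=4096)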

Flash-based solid-state drives can be used to create network appliances from general-purpose personal computer hardware. A write-protected flash drive containing the operating system and application software can substitute for larger, less reliable disk drives or CD-ROMs. Appliances built this way can provide an inexpensive alternative to expensive router and firewall hardware.

SSDs based on an SD card with a live SD operating system are easily write-locked. Combined with a cloud computing environment or other writable medium to maintain persistence, an OS booted from a write-locked SD card is robust, rugged, reliable, and impervious to permanent corruption. If the running OS degrades, simply turning the machine off and then on returns it to its initial uncorrupted state, making such a setup particularly solid. Because the SD-card-installed OS is write-locked, it does not require removal of corrupted components, though any writable media used alongside it may need to be restored.

One source states that, in 2008, the flash memory industry included about US$9.1 billion in production and sales. Other sources put the flash memory market at a size of more than US$20 billion in 2006, accounting for more than eight percent of the overall semiconductor market and more than 34 percent of the total semiconductor memory market. In 2012, the market was estimated at $26.8 billion. It can take up to 10 weeks to produce a flash memory chip. Samsung remains the largest NAND flash memory manufacturer as of the first quarter of 2022.

Technology assessment (TA, German: Technikfolgenabschätzung, French: Évaluation des choix scientifiques et technologiques) is a practical process of determining the value of a new or emerging technology in and of itself or against existing technologies. This is a means of assessing and rating the new technology from the time when it was first developed to the time when it is potentially accepted by the public and authorities for further use. In essence, TA could be defined as "a form of policy research that examines short- and long-term consequences (for example, societal, economic, ethical, legal) of the application of technology."

TA is the study and evaluation of new technologies. It is a way of trying to forecast and prepare for upcoming technological advancements and their repercussions on society, and then to make decisions based on those judgments. It is based on the conviction that new developments within, and discoveries by, the scientific community are relevant for the world at large rather than just for the scientific experts themselves, and that technological progress can never be free of ethical implications. Also, technology assessment recognizes the fact that scientists normally are not trained ethicists themselves and accordingly ought to be very careful when passing ethical judgment on their own, or their colleagues', new findings, projects, or work in progress. TA is a very broad phenomenon that also includes aspects such as "diffusion of technology (and technology transfer), factors leading to rapid acceptance of new technology, and the role of technology and society."

Technology assessment assumes a global perspective and is future-oriented, not anti-technological. TA considers its task as an interdisciplinary approach to solving already existing problems and preventing potential damage caused by the uncritical application and commercialization of new technologies.

Therefore, any results of technology assessment studies must be published, and particular consideration must be given to communication with political decision-makers.
 