Monday, April 15, 2024

Radiocaesium fallout released into the environment is a real culprit behind rising ocean temperatures

Caesium (IUPAC spelling; cesium in American English) is a chemical element; it has symbol Cs and atomic number 55. It is a soft, silvery-golden alkali metal with a melting point of 28.5 °C (83.3 °F; 301.6 K), which makes it one of only five elemental metals that are liquid at or near room temperature. Caesium has physical and chemical properties similar to those of rubidium and potassium. It is pyrophoric and reacts with water even at −116 °C (−177 °F). It is the least electronegative element, with a value of 0.79 on the Pauling scale. It has only one stable isotope, caesium-133. Caesium is mined mostly from pollucite.

Caesium-137, a fission product, is extracted from waste produced by nuclear reactors. The caesium atom has the largest atomic radius of all elements whose radii have been measured or calculated, at about 260 picometers. As noted above, caesium melts at 28.5 °C (83.3 °F), making it one of the few elemental metals that are liquid near room temperature; mercury is the only stable elemental metal with a known melting point lower than caesium's.

Caesium-137 (137Cs), cesium-137 (US), or radiocaesium, is a radioactive isotope of caesium that is formed as one of the more common fission products by the nuclear fission of uranium-235 and other fissionable isotopes in nuclear reactors and nuclear weapons. Trace quantities also originate from spontaneous fission of uranium-238. It is among the most problematic of the short-to-medium-lifetime fission products. Caesium-137 has a relatively low boiling point of 671 °C (1,240 °F) and easily becomes volatile when released suddenly at high temperature, as in the case of the Chernobyl nuclear accident and with atomic explosions, and can travel very long distances in the air. After being deposited onto the soil as radioactive fallout, it moves and spreads easily in the environment because of the high water solubility of caesium's most common chemical compounds, which are salts.

Caesium-137 reacts with water, producing a water-soluble compound (caesium hydroxide). The biological behavior of caesium is similar to that of potassium and rubidium.
Caesium-137, along with the other radioactive isotopes caesium-134, iodine-131, xenon-133, and strontium-90, was released into the environment during nearly all nuclear weapon tests and some nuclear accidents, most notably the Chernobyl disaster and the Fukushima Daiichi disaster.

Caesium-137 in the environment is substantially anthropogenic (human-made); because it is produced almost exclusively from anthropogenic sources, it serves as a bellwether of human nuclear activity. Caesium-137 is produced from the nuclear fission of plutonium and uranium, and decays into barium-137.

Nuclear isotope uses and safety hazards
Caesium-137 is a radioisotope commonly used as a gamma-emitter in industrial applications. Its advantages include a half-life of roughly 30 years, its availability from the nuclear fuel cycle, and having 137Ba as a stable end product. It has been used in agriculture, cancer treatment, and the sterilization of food, sewage sludge, and surgical equipment. Radioactive isotopes of caesium in radiation devices were used in the medical field to treat certain types of cancer, but the emergence of better alternatives and the use of water-soluble caesium chloride in the sources, which could create wide-ranging contamination, gradually put some of these caesium sources out of use. Caesium-137 has been employed in a variety of industrial measurement gauges, including moisture, density, leveling, and thickness gauges. It has also been used in well-logging devices for measuring the electron density of the rock formations, which is analogous to the bulk density of the formations.

The isotopes 134 and 137 are present in the biosphere in small amounts from human activities, differing by location. Radiocaesium does not accumulate in the body as readily as other fission products (such as radioiodine and radiostrontium). About 10% of absorbed radiocaesium washes out of the body relatively quickly in sweat and urine. The remaining 90% has a biological half-life between 50 and 150 days. Radiocaesium follows potassium and tends to accumulate in plant tissues, including fruits and vegetables. Plants vary widely in the absorption of caesium, sometimes displaying great resistance to it. It is also well-documented that mushrooms from contaminated forests accumulate radiocaesium (caesium-137) in the fungal sporocarps. Accumulation of caesium-137 in lakes has been a great concern after the Chernobyl disaster. Experiments with dogs showed that a single dose of 3.8 millicuries (140 MBq, 4.1 μg of caesium-137) per kilogram is lethal within three weeks; smaller amounts may cause infertility and cancer. The International Atomic Energy Agency and other sources have warned that radioactive materials, such as caesium-137, could be used in radiological dispersion devices, or "dirty bombs".
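
To see how the roughly 30-year physical half-life and the 50–150-day biological half-life combine, here is a minimal sketch using the standard exponential-decay relationship; the 110-day biological half-life below is an assumed value inside the quoted range, chosen only for illustration.

    def remaining_fraction(t_days, half_life_days):
        # Fraction of an initial quantity remaining after t_days, given its half-life.
        return 0.5 ** (t_days / half_life_days)

    physical_half_life = 30.17 * 365.25   # caesium-137 physical half-life, in days
    biological_half_life = 110.0          # assumed value within the quoted 50-150 day range

    # When radioactive decay and biological elimination act together:
    # 1/T_effective = 1/T_physical + 1/T_biological
    effective_half_life = 1.0 / (1.0 / physical_half_life + 1.0 / biological_half_life)

    print(f"Effective half-life: {effective_half_life:.0f} days")                               # ~109 days
    print(f"Fraction left after one year: {remaining_fraction(365, effective_half_life):.1%}")  # ~10%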

Fukushima Daiichi disaster
In April 2011, elevated levels of caesium-137 were found in the environment after the Fukushima Daiichi nuclear disaster in Japan. In July 2011, meat from 11 cows shipped to Tokyo from Fukushima Prefecture was found to have 1,530 to 3,200 becquerels per kilogram of 137Cs, considerably exceeding the Japanese legal limit of 500 becquerels per kilogram at that time. In March 2013, a fish caught near the plant had a record 740,000 becquerels per kilogram of radioactive caesium, above the 100 becquerels per kilogram government limit. A 2013 paper in Scientific Reports found that for a forest site 50 km from the stricken plant, 137Cs concentrations were high in leaf litter, fungi, and detritivores, but low in herbivores. By the end of 2014, "Fukushima-derived radiocaesium had spread into the whole western North Pacific Ocean", transported by the North Pacific Current from Japan to the Gulf of Alaska. It has been measured in the surface layer down to 200 meters, and south of the current area down to 400 meters.

Radioactive materials were dispersed into the atmosphere immediately after the disaster and account for the majority of all such material released into the environment. 80% of the initial atmospheric release eventually deposited over rivers and the Pacific Ocean, according to a UNSCEAR report in 2020. Specifically, "the total releases to the atmosphere of Iodine-131 and Caesium-137 ranged generally between about 100 to about 500 PBq [petabecquerel, 10^15 Bq] and 6 to 20 PBq, respectively."

Once released into the atmosphere, radionuclides that remain in a gaseous phase are simply diluted by the atmosphere, but some precipitate and eventually settle on land or in the ocean. Thus, the majority (90–99%) of the deposited radionuclides are isotopes of iodine and caesium, along with a small portion of tellurium, elements which are almost fully vaporized out of the core because of their volatility. The remaining fraction of deposited radionuclides consists of less volatile elements such as barium, antimony, and niobium, of which less than one percent evaporates from the fuel.

Approximately 40–80% of the atmospheric releases were deposited over the ocean.

In addition to atmospheric deposition, there was also a significant quantity of direct releases into groundwater (and eventually the ocean) through leaks of coolant that had been in direct contact with the fuel. Estimates for this release vary from 1 to 5.5 PBq. Although the majority had entered the ocean shortly following the accident, a significant fraction remains in the groundwater and continues to mix with coastal waters.

According to the French Institute for Radiological Protection and Nuclear Safety, the release from the accident represents the most significant individual oceanic emission of artificial radioactivity ever observed. The Fukushima coast has one of the world's strongest currents (the Kuroshio Current), which transported the contaminated waters far into the Pacific Ocean, dispersing the radioactivity. As of late 2011, measurements of both the seawater and the coastal sediments suggested that the consequences for marine life would be minor.

Significant pollution along the coast near the plant might persist, because of the continuing arrival of radioactive material transported to the sea by surface water crossing contaminated soil. The possible presence of other radioactive substances, such as strontium-90 or plutonium, has not been sufficiently studied. Recent measurements show persistent contamination of some marine species (mostly fish) caught along the Fukushima coast.

Initial discharge
A large amount of caesium entered the sea from the initial atmospheric release. By 2013, concentrations of caesium-137 in Fukushima coastal waters had returned to around pre-accident levels. However, concentrations in coastal sediments declined more slowly than in coastal waters, and the amount of caesium-137 stored in sediments most likely exceeded that in the water column by 2020. The sediments may provide a long-term source of caesium-137 in the seawater.

Data on marine foods indicate that their radioactive concentrations are falling towards pre-accident levels. 41% of samples caught off the Fukushima coast in 2011 had caesium-137 concentrations above the legal limit (100 becquerels per kilogram), and this had declined to 0.05% by 2015. The United States Food and Drug Administration stated in 2021 that "FDA has no evidence that radionuclides from the Fukushima incident are present in the U.S. food supply at levels that are unsafe". Yet presenting the science alone has not helped consumers regain their trust in eating Fukushima fishery products.

2023 discharge
The most prevalent radionuclide in the wastewater is tritium. A total of 780 terabecquerels (TBq) will be released into the ocean at a rate of 22 TBq per year. Tritium is routinely released into the ocean from operating nuclear power plants, sometimes in much greater quantities. For comparison, the La Hague nuclear processing site in France released 11,400 TBq of tritium in 2018. In addition, about 60,000 TBq of tritium is produced naturally in the atmosphere each year by cosmic rays.
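
For a rough sense of scale, the figures quoted above can be checked with simple arithmetic; this is a back-of-the-envelope sketch, and the variable names are only illustrative.

    total_release_tbq = 780.0      # planned total tritium release, TBq
    annual_rate_tbq = 22.0         # planned annual release rate, TBq per year
    la_hague_2018_tbq = 11_400.0   # La Hague tritium release in 2018, TBq

    print(f"Duration of discharge: about {total_release_tbq / annual_rate_tbq:.0f} years")            # ~35 years
    print(f"La Hague's 2018 release was ~{la_hague_2018_tbq / annual_rate_tbq:.0f}x the annual rate")  # ~518x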

Other radionuclides present in the wastewater, like caesium-137, are not normally released by nuclear power plants. However, the concentrations in the treated water are minuscule relative to regulation limits.

"There is consensus among scientists that the impact on health is minuscule, still, it can't be said the risk is zero, which is what causes controversy", Michiaki Kai, a Japanese nuclear expert, told AFP. David Bailey, a physicist whose lab measures radioactivity, said that with tritium at diluted concentrations, "there is no issue with marine species unless we see a severe decline in fish population".

Ferenc Dalnoki-Veress, a scientist-in-residence at the Middlebury Institute of International Studies at Monterey, said regarding dilution that bringing in living creatures makes the situation more complex. Robert Richmond, a biologist from the University of Hawaiʻi, told the BBC that the inadequate radiological and ecological assessment raises the concern that Japan would be unable to detect what enters the environment and "get the genie back in the bottle". Dalnoki-Veress, Richmond, and three other panelists consulting for the Pacific Islands Forum wrote that dilution may fail to account for bioaccumulation and exposure pathways that involve organically bound tritium (OBT).

Tuesday, March 5, 2024

Solar activity is not causing global warming, but human activity is. Continued growth in greenhouse gas emissions can significantly increase global temperatures

Patterns of solar irradiance and solar variation have been the main drivers of climate change over the millions to billions of years of the geologic time scale.

Evidence that this is the case comes from analysis on many timescales and from many sources, including direct observations; composites from baskets of different proxy observations; and numerical climate models. On millennial timescales, paleoclimate indicators have been compared to cosmogenic isotope abundances, as the latter are a proxy for solar activity. These have also been used on century timescales, but, in addition, instrumental data are increasingly available (mainly telescopic observations of sunspots and thermometer measurements of air temperature). These data show, for example, that the temperature fluctuations do not match the solar activity variations, and that the commonly invoked association of the Little Ice Age with the Maunder minimum is far too simplistic: although solar variations may have played a minor role, a much bigger factor is known to be Little Ice Age volcanism. In recent decades, observations of unprecedented accuracy, sensitivity, and scope (of both solar activity and terrestrial climate) have become available from spacecraft and show unequivocally that recent global warming is not caused by changes in the Sun.

Since 1978, solar irradiance has been directly measured by satellites with very good accuracy. These measurements indicate that the Sun's total solar irradiance fluctuates by ±0.1% over the ~11 years of the solar cycle, but that its average value has been stable since the measurements started in 1978. Solar irradiance before the 1970s is estimated using proxy variables, such as tree rings, the number of sunspots, and the abundances of cosmogenic isotopes such as 10Be, all of which are calibrated to the post-1978 direct measurements.

Solar activity has been on a declining trend since the 1960s, as indicated by solar cycles 19–24 (the current cycle is 25), in which the maximum numbers of sunspots were 201, 111, 165, 159, 121, and 82, respectively. In the three decades following 1978, the combination of solar and volcanic activity is estimated to have had a slight cooling influence. A 2010 study found that the composition of solar radiation might have changed slightly, with an increase in ultraviolet radiation and a decrease in other wavelengths.
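
To illustrate the declining trend implied by the sunspot maxima listed above, here is a minimal sketch that fits a straight line through those six values; it is purely illustrative, not a published analysis.

    import numpy as np

    cycles = np.array([19, 20, 21, 22, 23, 24])
    maxima = np.array([201, 111, 165, 159, 121, 82])   # sunspot maxima quoted in the text

    slope, intercept = np.polyfit(cycles, maxima, 1)    # least-squares straight-line fit
    print(f"Trend: {slope:.1f} sunspots per cycle")     # roughly -16 per cycle, i.e. declining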

Solar variation theory

The link between recent solar activity and climate has been quantified; solar activity is not a major driver of the warming that has occurred since early in the twentieth century. Human-induced forcings are needed to reproduce the late-20th-century warming.

A 1994 study by the US National Research Council concluded that TSI variations were the most likely cause of significant climate change in the pre-industrial era before significant human-generated carbon dioxide entered the atmosphere.

Scafetta and West correlated solar proxy data and lower tropospheric temperature for the preindustrial era, before significant anthropogenic greenhouse forcing, suggesting that TSI variations may have contributed 50% of the warming observed between 1900 and 2000 (although they conclude "our estimates about the solar effect on climate might be overestimated and should be considered as an upper limit.") If interpreted as a detection rather than an upper limit, this would contrast with global climate models predicting that solar forcing of climate through direct radiative forcing makes an insignificant contribution.

In 2000, Stott and others reported on the most comprehensive model simulations of 20th-century climate to that date. Their study looked at both "natural forcing agents" (solar variations and volcanic emissions) as well as "anthropogenic forcing" (greenhouse gases and sulphate aerosols). They found that "solar effects may have contributed significantly to the warming in the first half of the century although this result is dependent on the reconstruction of total solar irradiance that is used. In the latter half of the century, we find that anthropogenic increases in greenhouse gases are largely responsible for the observed warming, balanced by some cooling due to anthropogenic sulphate aerosols, with no evidence for significant solar effects." Stott's group found that combining these factors enabled them to closely simulate global temperature changes throughout the 20th century. They predicted that continued greenhouse gas emissions would cause additional future temperature increases "at a rate similar to that observed in recent decades". In addition, the study notes "uncertainties in historical forcing" — in other words, past natural forcing may still be having a delayed warming effect, most likely due to the oceans.

Stott's 2003 work largely revised his assessment, and found a significant solar contribution to recent warming, although still smaller (between 16 and 36%) than that of greenhouse gases.

A study in 2004 concluded that solar activity, as gauged by sunspot activity, affects the climate but plays only a small role in the current global warming.

Human activities drive anthropogenic global warming and temperature change

Greenhouse gases, such as CO2, methane, and nitrous oxide, heat the climate system by trapping infrared light. Volcanoes are also part of the extended carbon cycle. Since the Industrial Revolution, humanity has been adding to greenhouse gases by emitting CO2 from fossil fuel combustion and by changing land use through deforestation, and has further altered the climate with aerosols (particulate matter in the atmosphere) and the release of trace gases (e.g. nitrogen oxides, carbon monoxide, or methane). Other factors, including ozone depletion and animal husbandry (ruminant animals such as cattle produce methane), also play a role.

The US Geological Survey estimates that volcanic emissions are at a much lower level than the effects of current human activities, which generate 100–300 times the amount of carbon dioxide emitted by volcanoes. The annual amount put out by human activities may be greater than the amount released by supereruptions.

As a consequence of humans emitting greenhouse gases, global surface temperatures have started rising. Global warming is an aspect of modern climate change, a term that also includes the observed changes in precipitation, storm tracks, and cloudiness. As a consequence, glaciers worldwide have been found to be shrinking significantly. Land ice sheets in both Antarctica and Greenland have been losing mass since 2002 and have seen an acceleration of ice mass loss since 2009. Global sea levels have been rising as a consequence of thermal expansion and ice melt. The decline in Arctic sea ice, both in extent and thickness, over the last several decades is further evidence of rapid climate change.

Changes in global temperatures over the past century provide evidence for the effects of increasing greenhouse gasses. When the climate system reacts to such changes, climate change follows. Measurement of the GST (global surface temperature) is one of the many lines of evidence supporting the scientific consensus on climate change, which is that humans are causing warming of Earth's climate system.

The climate system receives nearly all of its energy from the sun and radiates energy to outer space. The balance of incoming and outgoing energy and the passage of energy through the climate system is Earth's energy budget. When the incoming energy is greater than the outgoing energy, Earth's energy budget is positive and the climate system is warming. If more energy goes out, the energy budget is negative and Earth experiences cooling.
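
A toy illustration of that sign convention, with made-up numbers chosen only for demonstration:

    def energy_budget(incoming_w_m2, outgoing_w_m2):
        # Positive imbalance means the climate system is gaining energy (warming).
        imbalance = incoming_w_m2 - outgoing_w_m2
        if imbalance > 0:
            return f"+{imbalance:.1f} W/m^2 -> warming"
        if imbalance < 0:
            return f"{imbalance:.1f} W/m^2 -> cooling"
        return "balanced"

    print(energy_budget(240.9, 240.0))   # illustrative values only: "+0.9 W/m^2 -> warming"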

Global warming affects all parts of Earth's climate system. Global surface temperatures have risen by 1.1 °C (2.0 °F). Scientists say they will rise further in the future. The changes in climate are not uniform across the Earth. In particular, most land areas have warmed faster than most ocean areas. The Arctic is warming faster than most other regions. Night-time temperatures have increased faster than daytime temperatures. The impact on nature and people depends on how much more the Earth warms.

Scientists use several methods to predict the effects of human-caused climate change. One is to investigate past natural changes in climate. To assess changes in Earth's past climate, scientists have studied tree rings, ice cores, corals, and ocean and lake sediments. These show that recent temperatures have surpassed anything in the last 2,000 years. By the end of the 21st century, temperatures may rise to a level last seen millions of years ago, when mean global temperatures were about 2–4 °C (3.6–7.2 °F) warmer than pre-industrial temperatures. The modern observed rise in temperature and CO2 concentrations has been so rapid that even abrupt geophysical events in Earth's history do not approach current rates.

Land and oceans are rapidly affected by climate change

Climate change affects the physical environment, ecosystems, and human societies. Changes in the climate system include an overall warming trend, more extreme weather, and rising sea levels. These in turn impact nature and wildlife, as well as human settlements and societies. The effects of human-caused climate change are broad and far-reaching. This is especially so if there is no significant climate action. Experts sometimes describe the projected and observed negative impacts of climate change as the climate crisis.

There are many effects of climate change on oceans. These include an increase in ocean temperatures and a rise in sea level from ocean warming and ice sheet melting. They also include increased ocean stratification and changes to ocean currents, including a weakening of the Atlantic meridional overturning circulation. Carbon dioxide from the atmosphere is acidifying the ocean.

Recent warming has had a big effect on natural biological systems. It has degraded land by raising temperatures, drying soils, and increasing wildfire risk. Species all over the world are migrating towards the poles to colder areas. On land, many species move to higher ground, whereas marine species seek colder water at greater depths. At 2 °C (3.6 °F) of warming, around 10% of species on land would become critically endangered.

Saturday, February 3, 2024

Rugged, lower-power, and more shock-resistant NAND flash-based SSDs are set to replace rotating-platter HDDs within a few years

A hard disk drive (HDD) is an electro-mechanical data storage device that stores and retrieves digital data using magnetic storage, with one or more rigid, rapidly rotating platters coated with magnetic material. The platters are made from a non-magnetic material, usually aluminum alloy, glass, or ceramic. The platters are paired with magnetic heads, usually arranged on a moving actuator arm, which read and write data to the platter surfaces. Data is accessed in a random-access manner, meaning that individual blocks of data can be stored and retrieved in any order. HDDs are a type of non-volatile storage, retaining stored data even when powered off. Modern HDDs are typically in the form of a small rectangular box.

The two most common form factors for modern HDDs are 3.5-inch, for desktop computers, and 2.5-inch, primarily for laptops. HDDs are connected to systems by standard interface cables such as PATA (Parallel ATA), SATA (Serial ATA), USB or SAS (Serial Attached SCSI) cables, and Fibre Channel.

Introduced by IBM in 1956, HDDs were the dominant secondary storage device for general-purpose computers beginning in the early 1960s. HDDs maintained this position in the modern era of servers and personal computers, though personal computing devices produced in large volumes, like mobile phones and tablets, rely on flash memory storage devices.

In the 2000s and 2010s, NAND flash-based SSDs began supplanting HDDs in applications requiring portability or high performance. NAND performance is improving faster than HDD performance, and applications for HDDs are eroding. The highest-capacity HDDs shipping commercially in 2022 were 26 TB, while the largest SSDs had a capacity of 100 TB. HDD unit shipments peaked at 651 million units in 2010 and have been declining since then, to 166 million units in 2022.

Advantages of SSDs over traditional spinning platters


A solid-state drive (SSD) is a solid-state storage device that uses integrated circuit assemblies to store data persistently, typically using flash memory, and functions as secondary storage in the hierarchy of computer storage. It is also sometimes called a semiconductor storage device, a solid-state device, or a solid-state disk, even though SSDs lack the physical spinning disks and movable read-write heads used in hard disk drives (HDDs) and floppy disks. SSDs also have rich internal parallelism for data processing. Compared with HDDs, SSDs have higher data-transfer rates, higher areal storage density, somewhat better reliability, and much lower latency and access times.

Flash-based SSDs store data in metal–oxide–semiconductor (MOS) integrated circuit chips which contain non-volatile floating-gate memory cells. Flash memory-based solutions are typically packaged in standard disk drive form factors (1.8-, 2.5-, and 3.5-inch), but also in smaller more compact form factors, such as the M.2 form factor, made possible by the small size of flash memory.

The key components of an SSD are the controller and the memory to store the data. The primary memory component in an SSD was traditionally DRAM volatile memory, but since 2009, it has been more commonly NAND flash non-volatile memory. Every SSD includes a controller that incorporates the electronics that bridge the NAND memory components to the host computer. The controller is an embedded processor that executes firmware-level code and is one of the most important factors of SSD performance.
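
As a highly simplified illustration of the kind of bookkeeping a controller performs, the sketch below maps logical block addresses to physical flash pages and remaps a block on rewrite instead of overwriting in place. This is a toy model of a flash translation layer, not any vendor's actual firmware; all names are invented.

    class ToyFlashTranslationLayer:
        # Toy model: logical block address -> physical page, remapped on every write.
        def __init__(self, total_pages):
            self.free_pages = list(range(total_pages))   # pool of erased pages
            self.mapping = {}                            # logical block -> physical page
            self.stale_pages = []                        # old pages awaiting garbage collection

        def write(self, logical_block):
            if not self.free_pages:
                raise RuntimeError("no free pages; garbage collection would run here")
            new_page = self.free_pages.pop(0)
            old_page = self.mapping.get(logical_block)
            if old_page is not None:
                self.stale_pages.append(old_page)        # NAND cannot overwrite a page in place
            self.mapping[logical_block] = new_page
            return new_page                              # data would be programmed to this page

    ftl = ToyFlashTranslationLayer(total_pages=8)
    ftl.write(0)
    ftl.write(0)                                         # same logical block lands on a new page
    print(ftl.mapping, ftl.stale_pages)                  # {0: 1} [0]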

In comparison to hard disk drives and similar electromechanical media which use moving parts, SSDs are typically more resistant to physical shock, run silently, and have higher input/output rates and lower latency. SSDs based on NAND flash will slowly leak charge over time if left for long periods without power. This causes worn-out drives (that have exceeded their endurance rating) to start losing data typically after one year (if stored at 30 °C) to two years (at 25 °C) in storage; for new drives, it takes longer. Therefore, SSDs are not suitable for archival storage. SSDs have a limited lifetime number of writes and also slow down as they reach their full storage capacity.

Due to the extremely close spacing between the heads and the disk surface, HDDs are vulnerable to being damaged by a head crash – a failure of the disk in which the head scrapes across the platter surface, often grinding away the thin magnetic film and causing data loss. Head crashes can be caused by electronic failure, a sudden power failure, physical shock, contamination of the drive's internal enclosure, wear and tear, corrosion, or poorly manufactured platters and heads.

Most of the advantages of solid-state drives over traditional hard drives are due to their ability to access data completely electronically instead of electromechanically, resulting in superior transfer speeds and mechanical ruggedness.

Flash memory as a replacement for hard drives

The size and shape of any device are largely driven by the size and shape of the components used to make that device. Traditional HDDs and optical drives are designed around the rotating platter(s) or optical disc along with the spindle motor inside. Since an SSD is made up of various interconnected integrated circuits (ICs) and an interface connector, its shape is no longer limited to the shape of rotating media drives. Some solid-state storage solutions come in a larger chassis that may even be a rack-mount form factor with numerous SSDs inside. They would all connect to a common bus inside the chassis and connect outside the box with a single connector. As of 2014, mSATA and M.2 form factors also gained popularity, primarily in laptops.

The M.2 form factor, formerly known as the Next Generation Form Factor (NGFF), is a natural transition from mSATA and the physical layout it used to a more usable and more advanced form factor. While mSATA took advantage of an existing form factor and connector, M.2 has been designed to maximize usage of the card space while minimizing the footprint. The M.2 standard allows both SATA and PCI Express SSDs to be fitted onto M.2 modules. Such SSDs are designed to be installed permanently inside a computer.
 
Until about 2009, due to their generally prohibitive cost versus HDDs, SSDs were mainly used in mission-critical applications where the speed of the storage system needed to be as high as possible. Since flash memory has become a common component of SSDs, falling prices and increased densities have made it more cost-effective for many other applications. For instance, in a distributed computing environment, SSDs can be used as the building block for a distributed cache layer that temporarily absorbs the large volume of user requests to the slower HDD-based backend storage system. This layer provides much higher bandwidth and lower latency than the storage system and can be managed in a number of forms, such as distributed key-value databases and distributed file systems. On supercomputers, this layer is typically referred to as a burst buffer. With this fast layer, users often experience shorter system response times. Organizations that can benefit from faster access to system data include equity trading companies, telecommunication corporations, and streaming media and video editing firms. The list of applications that could benefit from faster storage is vast.
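
To make the cache-layer idea concrete, here is a minimal sketch of a look-aside cache sitting in front of a slower backing store; the class and variable names are invented for illustration and do not refer to any particular system.

    from collections import OrderedDict

    class FastCacheLayer:
        # Toy look-aside cache: serve hot keys from a fast tier, fall back to slow storage.
        def __init__(self, capacity, slow_backend):
            self.capacity = capacity
            self.slow_backend = slow_backend      # stand-in for HDD-based backend storage
            self.cache = OrderedDict()            # insertion order used for LRU eviction

        def get(self, key):
            if key in self.cache:                 # cache hit: fast path
                self.cache.move_to_end(key)
                return self.cache[key]
            value = self.slow_backend[key]        # cache miss: read from the slow backend
            self.cache[key] = value
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)    # evict the least recently used entry
            return value

    backend = {"user:1": "alice", "user:2": "bob"}
    cache = FastCacheLayer(capacity=1, slow_backend=backend)
    print(cache.get("user:1"))                    # miss: fetched from the backend, then cached
    print(cache.get("user:1"))                    # hit: served from the cache tier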

Flash-based solid-state drives can be used to create network appliances from general-purpose personal computer hardware. A write-protected flash drive containing the operating system and application software can substitute for larger, less reliable disk drives or CD-ROMs. Appliances built this way can provide an inexpensive alternative to expensive router and firewall hardware.

SSDs based on an SD card with a live SD operating system are easily write-locked. Combined with a cloud computing environment or other writable medium to maintain persistence, an OS booted from a write-locked SD card is robust, rugged, reliable, and impervious to permanent corruption. If the running OS degrades, simply turning the machine off and then on returns it to its initial uncorrupted state. Because the card is write-locked, the installed OS does not require removal of corrupted components, though any writable media used alongside it may need to be restored.

One source states that, in 2008, the flash memory industry included about US$9.1 billion in production and sales. Other sources put the flash memory market at a size of more than US$20 billion in 2006, accounting for more than eight percent of the overall semiconductor market and more than 34 percent of the total semiconductor memory market. In 2012, the market was estimated at $26.8 billion. It can take up to 10 weeks to produce a flash memory chip. Samsung remains the largest NAND flash memory manufacturer as of the first quarter of 2022.

Technology assessment (TA, German: Technikfolgenabschätzung, French: Évaluation des choix scientifiques et technologiques) is a practical process of determining the value of a new or emerging technology in and of itself or against existing technologies. This is a means of assessing and rating the new technology from the time when it was first developed to the time when it is potentially accepted by the public and authorities for further use. In essence, TA could be defined as "a form of policy research that examines short- and long-term consequences (for example, societal, economic, ethical, legal) of the application of technology."

TA is the study and evaluation of new technologies. It is a way of trying to forecast and prepare for upcoming technological advancements and their repercussions on society, and then make decisions based on those judgments. It is based on the conviction that new developments within, and discoveries by, the scientific community are relevant for the world at large rather than just for the scientific experts themselves, and that technological progress can never be free of ethical implications. Also, technology assessment recognizes the fact that scientists normally are not trained ethicists themselves and accordingly ought to be very careful when passing ethical judgment on their own, or their colleagues', new findings, projects, or work in progress. TA is a very broad phenomenon that also includes aspects such as "diffusion of technology (and technology transfer), factors leading to rapid acceptance of new technology, and the role of technology and society."

Technology assessment assumes a global perspective and is future-oriented, not anti-technological. TA considers its task as an interdisciplinary approach to solving already existing problems and preventing potential damage caused by the uncritical application and commercialization of new technologies.

Therefore, any results of technology assessment studies must be published, and particular consideration must be given to communication with political decision-makers.

Tuesday, January 2, 2024

Synthetic formaldehyde, “known to be a human carcinogen”, is a common indoor pollutant and a contaminant in foods

Formaldehyde (systematic name methanal) is an organic compound with the formula CH2O and structure H−CHO. The compound is a pungent, colorless gas that polymerizes spontaneously into paraformaldehyde. It is stored as an aqueous solution (formalin), which consists mainly of the hydrate CH2(OH)2. It is produced commercially as a precursor to many other materials and chemical compounds. In 2006, the global production rate of formaldehyde was estimated at 12 million tons per year. It is mainly used in the production of industrial resins, e.g., for particle boards and coatings. Small amounts also occur naturally.

Formaldehyde is produced industrially by the catalytic oxidation of methanol. The most common catalysts are silver metal, iron(III) oxide, iron molybdenum oxides [e.g. iron(III) molybdate] with a molybdenum-enriched surface or vanadium oxides. In the commonly used formox process, methanol and oxygen react at c. 250–400 °C in the presence of iron oxide in combination with molybdenum and/or vanadium to produce formaldehyde according to the chemical equation:

2 CH3OH + O2 → 2 CH2O + 2 H2O

The silver-based catalyst usually operates at a higher temperature, about 650 °C. Two chemical reactions on it simultaneously produce formaldehyde: the oxidation shown above and the dehydrogenation reaction:

CH3OH → CH2O + H2

In principle, formaldehyde could be generated by oxidation of methane, but this route is not industrially viable because methanol is more easily oxidized than methane.
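
As a worked example of the stoichiometry in the formox equation above (one mole of formaldehyde per mole of methanol), here is a minimal sketch; the 90% conversion figure is an arbitrary assumption for illustration, not a process specification.

    M_METHANOL = 32.04        # g/mol
    M_FORMALDEHYDE = 30.03    # g/mol

    def formaldehyde_mass_kg(methanol_kg, conversion=0.90):
        # 2 CH3OH + O2 -> 2 CH2O + 2 H2O gives a 1:1 molar ratio of CH3OH to CH2O.
        moles_methanol = methanol_kg * 1000.0 / M_METHANOL
        return moles_methanol * conversion * M_FORMALDEHYDE / 1000.0

    print(f"{formaldehyde_mass_kg(100.0):.1f} kg of CH2O from 100 kg of methanol")   # ~84.4 kg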

In biochemistry, formaldehyde is produced via several enzyme-catalyzed routes. Living beings, including humans, produce formaldehyde as part of their metabolism. Formaldehyde is key to several bodily functions (e.g. epigenetics), but its amount must also be tightly controlled to avoid self-poisoning.

Occurrence
Processes in the upper atmosphere contribute up to 90% of the total formaldehyde in the environment. Formaldehyde is an intermediate in the oxidation (or combustion) of methane, as well as of other carbon compounds, e.g. in forest fires, automobile exhaust, and tobacco smoke. When produced in the atmosphere by the action of sunlight and oxygen on atmospheric methane and other hydrocarbons, it becomes part of smog. Formaldehyde has also been detected in outer space.

Formaldehyde and its adducts are ubiquitous in nature. Food may contain formaldehyde at levels of 1–100 mg/kg. Formaldehyde, formed in the metabolism of the amino acids serine and threonine, is found in the bloodstream of humans and other primates at concentrations of approximately 50 micromolar.
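
For a sense of what 50 micromolar means in mass terms, here is a one-line conversion using the molar mass of formaldehyde (about 30.03 g/mol); this is simple unit arithmetic, not a measurement.

    MOLAR_MASS_CH2O = 30.03                      # g/mol
    concentration_umol_per_l = 50.0              # value quoted above

    mg_per_litre = concentration_umol_per_l * 1e-6 * MOLAR_MASS_CH2O * 1000.0
    print(f"~{mg_per_litre:.1f} mg of formaldehyde per litre of blood")   # ~1.5 mg/L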

Formaldehyde does not accumulate in the environment, because it is broken down within a few hours by sunlight or by bacteria present in soil or water. Humans metabolize formaldehyde quickly, converting it to formic acid, so it does not accumulate. It nonetheless presents significant health concerns, as a contaminant.

Formaldehyde is classified as a carcinogen (an agent in the environment capable of contributing to cancer growth). Additionally, it can cause respiratory and skin irritation upon exposure.

Industrial applications
Formaldehyde is a common precursor to more complex compounds and materials. In approximate order of decreasing consumption, products generated from formaldehyde include urea formaldehyde resin, melamine resin, phenol formaldehyde resin, polyoxymethylene plastics, 1,4-butanediol, and methylene diphenyl diisocyanate. The textile industry uses formaldehyde-based resins as finishers to make fabrics crease-resistant.

When treated with phenol, urea, or melamine, formaldehyde produces, respectively, hard thermoset phenol formaldehyde resin, urea formaldehyde resin, and melamine resin. These polymers are permanent adhesives used in plywood and carpeting. They are also foamed to make insulation, or cast into molded products. Production of formaldehyde resins accounts for more than half of formaldehyde consumption.

Disinfectant and biocide
An aqueous solution of formaldehyde can be useful as a disinfectant as it kills most bacteria and fungi (including their spores). It is used as an additive in vaccine manufacturing to inactivate toxins and pathogens. Formaldehyde releasers are used as biocides in personal care products such as cosmetics. Although present at levels not normally considered harmful, they are known to cause allergic contact dermatitis in certain sensitized individuals.

Aquarists use formaldehyde as a treatment for the parasites Ichthyophthirius multifiliis and Cryptocaryon irritans. Formaldehyde is one of the main disinfectants recommended for destroying anthrax.

Formaldehyde is also approved for use in the manufacture of animal feeds in the US. It is an antimicrobial agent used to maintain complete animal feeds or feed ingredients Salmonella-negative for up to 21 days.

Formaldehyde is commonly used to disinfect (via fumigating, sprinklers, and spray sleds) poultry and swine confinement buildings, egg hatcheries, rooms, railway cars, mushroom houses, tools, and equipment. Formaldehyde is a valuable packaged preservative in the food and beverage industry.

Safety
Because of its widespread use, toxicity, and volatility, formaldehyde poses a significant danger to human health. In 2011, the US National Toxicology Program described formaldehyde as "known to be a human carcinogen".

Chronic inhalation
The main concerns are associated with chronic (long-term) exposure by inhalation, as may happen from thermal or chemical decomposition of formaldehyde-based resins and the production of formaldehyde resulting from the combustion of a variety of organic compounds (for example, exhaust gases). As formaldehyde resins are used in many construction materials, it is one of the more common indoor air pollutants. At concentrations above 0.1 ppm in air, formaldehyde can irritate the eyes and mucous membranes. Formaldehyde inhaled at this concentration may cause headaches, a burning sensation in the throat, and difficulty breathing, and can trigger or aggravate asthma symptoms.
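
For reference, the 0.1 ppm figure above can be expressed as a mass concentration using the molar volume of an ideal gas at 25 °C and 1 atm (about 24.45 L/mol); this is a generic unit conversion, not a regulatory value.

    MOLAR_MASS_CH2O = 30.03     # g/mol
    MOLAR_VOLUME_25C = 24.45    # L/mol, ideal gas at 25 C and 1 atm

    def ppm_to_mg_per_m3(ppm, molar_mass):
        # mg/m^3 = ppm x molar mass / molar volume (for a gas-phase concentration by volume)
        return ppm * molar_mass / MOLAR_VOLUME_25C

    print(f"0.1 ppm of formaldehyde is about {ppm_to_mg_per_m3(0.1, MOLAR_MASS_CH2O):.2f} mg/m^3")   # ~0.12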

In the residential environment, formaldehyde exposure comes from several routes; formaldehyde can be emitted by treated wood products, such as plywood or particle board, but it is produced by paints, varnishes, floor finishes, and cigarette smoking as well. In July 2016, the U.S. EPA released a prepublication version of its final rule on Formaldehyde Emission Standards for Composite Wood Products. These new rules impact manufacturers, importers, distributors, and retailers of products containing composite wood, including fiberboard, particleboard, and various laminated products, who must comply with more stringent record-keeping and labeling requirements.

In building environments, formaldehyde levels are affected by several factors. These include the potency of formaldehyde-emitting products present, the ratio of the surface area of emitting materials to the volume of the space, environmental factors, product age, interactions with other materials, and ventilation conditions. Formaldehyde is emitted from a variety of construction materials, furnishings, and consumer products.

Other routes
Formaldehyde occurs naturally, and is "an essential intermediate in cellular metabolism in mammals and humans." According to the American Chemistry Council, "Formaldehyde is found in every living system—from plants to animals to humans. It metabolizes quickly in the body, breaks down rapidly, is not persistent, and does not accumulate in the body."

In humans, ingestion of as little as 30 milliliters (1.0 US fl oz) of 37% formaldehyde solution can cause death. Other symptoms associated with ingesting such a solution include gastrointestinal damage (vomiting, abdominal pain) and systemic damage (dizziness). Testing for formaldehyde is done on blood and/or urine by gas chromatography–mass spectrometry. Other methods include infrared detection, gas detector tubes, etc., of which high-performance liquid chromatography is the most sensitive.
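
For a rough sense of the quantity involved in that 30 mL figure, here is a minimal calculation assuming commercial formalin at roughly 37% formaldehyde by mass with a density near 1.09 g/mL (both assumed values, used only for illustration).

    volume_ml = 30.0
    density_g_per_ml = 1.09      # assumed density of 37% formalin
    mass_fraction = 0.37         # assumed formaldehyde mass fraction

    formaldehyde_grams = volume_ml * density_g_per_ml * mass_fraction
    print(f"~{formaldehyde_grams:.0f} g of formaldehyde in a 30 mL dose")   # ~12 g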

Contaminant in food
Scandals broke out in both the 2005 Indonesia food scare and the 2007 Vietnam food scare regarding the addition of formaldehyde to foods to extend shelf life. In 2011, after a four-year absence, Indonesian authorities found foods with formaldehyde being sold in markets in a number of regions across the country. In August 2011, the Central Jakarta Livestock and Fishery Sub-Department found cendol containing 10 parts per million of formaldehyde at at least two Carrefour supermarkets. In 2014, the owner of two noodle factories in Bogor, Indonesia, was arrested for using formaldehyde in noodles; 50 kg of formaldehyde was confiscated. Foods known to be contaminated included noodles, salted fish, and tofu. Chicken and beer were also rumored to be contaminated. In some places, such as China, manufacturers still use formaldehyde illegally as a preservative in foods, which exposes people to formaldehyde ingestion. In the early 1900s, it was frequently added by US milk plants to milk bottles as a method of pasteurization, owing to the lack of knowledge and concern regarding formaldehyde's toxicity.

In 2011 in Nakhon Ratchasima, Thailand, truckloads of rotten chicken were treated with formaldehyde for sale; "a large network", including 11 slaughterhouses run by a criminal gang, was implicated. In 2012, 1 billion rupiah (almost US$100,000) worth of fish imported from Pakistan to Batam, Indonesia, was found laced with formaldehyde.

Formalin contamination of foods has been reported in Bangladesh, with stores and supermarkets selling fruit, fish, and vegetables that have been treated with formalin to keep them fresh. However, in 2015, a Formalin Control Bill was passed in the Parliament of Bangladesh with a provision of life-term imprisonment as the maximum punishment, as well as a maximum fine of 2,000,000 BDT but not less than 500,000 BDT, for importing, producing, or hoarding formalin without a license.
 