Saturday, May 28, 2016

Lodestone

A lodestone is a naturally magnetized piece of the mineral magnetite. Lodestones are naturally occurring magnets that can attract iron. The property of magnetism was first discovered in antiquity through lodestones. Pieces of lodestone, suspended so they could turn, were the first magnetic compasses, and their importance to early navigation is indicated by the name lodestone, which in Middle English means 'course stone' or 'leading stone', from the now-obsolete meaning of lode as 'journey, way'.
Lodestone is one of the few minerals that is found naturally magnetized. Magnetite is black or brownish-black, with a metallic luster, a Mohs hardness of 5.5–6.5 and a black streak.

Origin

The process by which lodestone is created has long been an open question in geology. Only a small amount of the magnetite on Earth is found magnetized as lodestone. Ordinary magnetite is attracted to a magnetic field, as iron and steel are, but does not tend to become magnetized itself; it has too low a magnetic coercivity (resistance to demagnetization) to stay magnetized for long.
Microscopic examination of lodestones has found them to be made of magnetite (Fe3O4) with inclusions of maghemite (cubic Fe2O3), often with impurity metal ions of titanium, aluminium, and manganese. This inhomogeneous crystalline structure gives this variety of magnetite sufficient coercivity to remain magnetized and thus be a permanent magnet.
The other question is how lodestones get magnetized. The Earth's magnetic field at 0.5 gauss is too weak to magnetize a lodestone by itself. The leading theory is that lodestones are magnetized by the strong magnetic fields surrounding lightning bolts. This is supported by the observation that they are mostly found near the surface of the Earth, rather than buried at great depth.
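As a rough, illustrative check of the lightning theory (the 30 kA stroke current and 1 m distance below are assumed for the example, not taken from the text), the field around a long straight current follows B = μ0·I/(2πr), which gives a field far stronger than Earth's 0.5 gauss:

import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def field_from_current(current_amps, distance_m):
    # Magnetic flux density (tesla) at a given distance from a long straight current
    return MU0 * current_amps / (2 * math.pi * distance_m)

earth_field = 0.5e-4                       # 0.5 gauss expressed in tesla (50 microtesla)
lightning = field_from_current(30e3, 1.0)  # assumed 30 kA stroke viewed from 1 m away
print(f"~{lightning*1e3:.0f} mT, about {lightning/earth_field:.0f}x Earth's field")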

Properties
Lodestones were used as an early form of magnetic compass. Magnetite typically carries the
dominant magnetic signature in rocks, and so it has been a critical tool in paleomagnetism, a science important in understanding plate tectonics and as historic data for magnetohydrodynamics and other scientific fields. The relationships between magnetite and other iron-rich oxide minerals such as ilmenite, hematite, and ulvospinel have been much studied; the reactions between these minerals and oxygen influence how and when magnetite preserves a record of the Earth's magnetic field.
Magnetite has been very important in understanding the conditions under which rocks form.
Magnetite reacts with oxygen to produce hematite, and the mineral pair forms a buffer that can control oxygen fugacity. Commonly, igneous rocks contain grains of two solid solutions, one of magnetite and ulvospinel and the other of ilmenite and hematite. Compositions of the mineral pairs are used to calculate how oxidizing the magma was (i.e., its oxygen fugacity): a range of oxidizing conditions is found in magmas, and the oxidation state helps to determine how the magmas might evolve by fractional crystallization.
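For reference, this hematite-magnetite buffer can be written as the reaction 4 Fe3O4 + O2 → 6 Fe2O3: when both minerals coexist at equilibrium, the oxygen fugacity is pinned at a value that depends on temperature.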
Magnetite also occurs in many sedimentary rocks, including banded iron formations. In many igneous rocks, magnetite-rich and ilmenite-rich grains occur that precipitated together in magma. Magnetite also is produced from peridotites and dunites by serpentinization.
The Curie temperature of magnetite is 858 K (585 °C; 1,085 °F).

Applications
1) Magnetic recording
Audio recording using
magnetic acetate tape was developed in the 1930s. The German magnetophon utilized magnetite powder as the recording medium. Following World War II the 3M company continued work on the German design. In 1946 the 3M researchers found they could improve the magnetite based tape, which utilized powders of cubic crystals, by replacing the magnetite with needle shaped particles of gamma ferric oxide (γ-Fe2O3).


2)  Catalysis
Magnetite is the catalyst for the industrial synthesis of ammonia.
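The reaction catalysed is the Haber-Bosch synthesis, N2 + 3 H2 → 2 NH3, carried out at high temperature and pressure over an iron-based catalyst prepared from magnetite.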

3)  Arsenic sorbent
Magnetite powder efficiently removes arsenic(III) and arsenic(V) from water; the removal efficiency increases roughly 200-fold when the magnetite particle size decreases from 300 nm to 12 nm. Arsenic-contaminated drinking water is a major problem around the world, one that can be addressed using magnetite as a sorbent.
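One simple way to see why smaller particles help (a geometric sketch, not the original study's analysis): for roughly spherical particles the surface area available per unit mass scales as 1/diameter, so shrinking from 300 nm to 12 nm increases the available surface about 25-fold, which suggests the reported ~200-fold gain involves more than surface area alone.

def surface_gain(d_large_nm, d_small_nm):
    # Specific surface area of a sphere scales as 6/d, so the gain is d_large/d_small
    return d_large_nm / d_small_nm

print(surface_gain(300, 12))  # 25.0, compared with the ~200x efficiency gain reported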

4)  Other
Because of its stability at high temperatures, it is used for coating industrial watertube steam boilers. The magnetite layer is formed after a chemical treatment (e.g. by using hydrazine).
Iron-metabolizing bacteria can trigger redox reactions in microscopic magnetite particles. Using light, magnetite can reduce chromium (VI) (its toxic form), converting it to less toxic chromium (III), which can then be incorporated into a harmless magnetite crystal. Phototrophic Rhodopseudomonas palustris oxidized the magnetite, while Geobacter sulfurreducens reduced it, readying it for another cycle.

Monday, April 11, 2016

Pyrogen (fever)

Fever, also known as pyrexia and febrile response, is defined as having a temperature above the normal range due to an increase in the body's temperature set-point. There is not a single agreed-upon upper limit for normal temperature with sources using values between 37.5 and 38.3 °C (99.5 and 100.9 °F). The increase in set-point triggers increased muscle contraction and causes a feeling of cold. This results in greater heat production and efforts to conserve heat. When the set-point temperature returns to normal a person feels hot, becomes flushed, and may begin to sweat. Rarely a fever may trigger a febrile seizure. This is more common in young children. Fevers do not typically go higher than 41 to 42 °C (105.8 to 107.6 °F).
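Because sources use different upper limits for normal temperature, whether a given reading counts as a fever depends on the cut-off chosen; a minimal sketch using the range quoted above (the example reading is invented):

def is_fever(temp_c, cutoff_c):
    # True if the temperature exceeds the chosen upper limit of normal (deg C)
    return temp_c > cutoff_c

reading = 37.9  # invented example reading, deg C
for cutoff in (37.5, 38.3):  # the range of cut-offs cited above
    print(f"cutoff {cutoff} degC: fever = {is_fever(reading, cutoff)}")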
A fever can be caused by many medical conditions ranging from the not serious to potentially serious. This includes viral, bacterial and parasitic infections such as the common cold, urinary tract infections, meningitis, malaria and appendicitis among others. Non-infectious causes include vasculitis, deep vein thrombosis, side effects of medication, and cancer among others. It differs from hyperthermia, in that hyperthermia is an increase in body temperature over the temperature set-point, due to either too much heat production or not enough heat loss.
Treatment to reduce fever is generally not required. Treatment of associated pain and inflammation, however, may be useful and help a person rest. Medications such as ibuprofen or paracetamol may help with this as well as lower temperature. Measures such as putting a cool damp cloth on the forehead and having a slightly warm bath are not useful and may simply make a person more uncomfortable. Children younger than three months, people with serious medical problems such as a compromised immune system, and people with other symptoms may require medical attention. Hyperthermia does require treatment.
Fever is one of the most common medical signs. It is part of about 30% of healthcare visits by children and occurs in up to 75% of adults who are seriously sick. While fever is a useful defense mechanism, treating fever does not appear to worsen outcomes. Fever is viewed with greater concern by parents and healthcare professionals than it usually deserves, a phenomenon known as fever phobia.

Pyrogens
A pyrogen is a substance that induces fever. These can be either internal (endogenous) or external (exogenous) to the body. The bacterial substance lipopolysaccharide (LPS), present in the cell wall of some bacteria, is an example of an exogenous pyrogen. Pyrogenicity can vary: In extreme examples, some bacterial pyrogens known as superantigens can cause rapid and dangerous fevers. Depyrogenation may be achieved through filtration, distillation, chromatography, or inactivation.

Hypothalamus
The brain ultimately orchestrates heat effector mechanisms via the autonomic nervous system. These may include increased heat production (for example, through increased muscle tone and shivering) and efforts to prevent heat loss (such as vasoconstriction).
In infants, the autonomic nervous system may also activate brown adipose tissue to produce heat (non-exercise-associated thermogenesis, also known as non-shivering thermogenesis). Increased heart rate and vasoconstriction contribute to increased blood pressure in fever.

Usefulness
There are arguments for and against the usefulness of fever, and the issue is controversial. There are studies using warm-blooded vertebrates and humans in vivo, with some suggesting that they recover more rapidly from infections or critical illness due to fever. Studies suggest reduced mortality in bacterial infections when fever was present.
In theory, fever can aid in host defence. There are certainly some important immunological reactions that are sped up by temperature, and some pathogens with strict temperature preferences could be hindered.
Research has demonstrated that fever assists the healing process in several important ways.

Management
Fever should not necessarily be treated. Most people recover without specific medical attention. Although it is unpleasant, fever rarely rises to a dangerous level even if untreated. Damage to the brain generally does not occur until temperatures reach 42 °C (107.6 °F), and it is rare for an untreated fever to exceed 40.6 °C (105 °F).


Conservative measures
Some limited evidence supports sponging or bathing feverish children with tepid water. The use of a fan or air conditioning may somewhat reduce the temperature and increase comfort. If the temperature reaches the extremely high level of hyperpyrexia, aggressive cooling is required. In general, people are advised to keep adequately hydrated. Whether increased fluid intake improves symptoms or shortens respiratory illnesses such as the common cold is not known.

Medications
Medications that lower fevers are called antipyretics. The antipyretic ibuprofen is effective in reducing fevers in children. It is more effective than acetaminophen (paracetamol) in children.
Ibuprofen and acetaminophen may be safely used together in children with fevers. The efficacy of acetaminophen by itself in children with fevers has been questioned. Ibuprofen is also superior to aspirin in children with fevers. Additionally, aspirin is not recommended in children and young adults (those under the age of 16 or 19 depending on the country) due to the risk of Reye's syndrome.
Using both paracetamol and ibuprofen at the same time or alternating between the two is more effective at decreasing fever than using only paracetamol or ibuprofen. It is not clear if it increases child comfort. Response or nonresponse to medications does not predict whether or not a child has a serious illness.
 
 

Friday, February 26, 2016

Automatic fire suppression

Automatic fire suppression systems control and extinguish fires without human intervention. Examples of automatic systems include fire sprinkler systems, gaseous fire suppression, and condensed aerosol fire suppression.

The first fire extinguisher patent was issued to Alanson Crane of Virginia on Feb. 10, 1863. The first fire sprinkler system was patented by H.W. Pratt in 1872. But the first practical automatic sprinkler system was invented in 1874 by Henry S. Parmalee of New Haven, CT. He installed the system in a piano factory he owned.

Types of automatic systems
Today there are numerous types of automatic fire suppression systems, as diverse as their applications. In general, however, they fall into two categories: engineered and pre-engineered systems.

Engineered fire suppression systems are design-specific. They are usually for larger installations where the system is designed for the particular application. Examples include marine and land vehicle applications, computer clean rooms, public and private buildings, industrial paint lines, dip tanks and electrical switch rooms. Engineered systems use a number of gaseous or solid agents. Many are specifically formulated. Some, such as 3M Novec 1230 Fire Protection Fluid, are stored as a liquid and discharged as a gas.

Pre-engineered fire suppression systems use pre-designed elements to eliminate the need for engineering work beyond the original product design. Typical industrial solutions use a simple wet or dry chemical agent, such as potassium carbonate or monoammonium phosphate (MAP), to protect spaces such as paint rooms and booths, storage areas and commercial kitchens. A small number of residential designs have also emerged that typically employ water mist, with or without a surfactant additive, and target retrofit applications where the risk of fire or fire injury is high but where a conventional fire sprinkler system would be unacceptably expensive. In addition, residential range hood fire suppression systems are becoming more common in shared-use cooking spaces, such as those found in assisted living facilities, hospice homes, and group homes.

Components
By definition, an automatic fire suppression system can operate without human intervention. To do so it must possess a means of detection, actuation and delivery.

In many systems, detection is accomplished by mechanical or electrical means. Mechanical detection uses fusible-link or thermo-bulb detectors. These detectors are designed to separate at a specific temperature and release tension on a release mechanism. Electrical detection uses heat detectors equipped with self-restoring, normally-open contacts which close when a predetermined temperature is reached. Remote and local manual operation is also possible.
Actuation usually involves either a pressurised fluid and a release valve, or in some cases an electric pump.
Delivery is accomplished by means of piping and nozzles. Nozzle design is specific to the agent used and coverage desired.
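A toy sketch of the detection, actuation and delivery chain described above (the 68 degC trip point, class names and polling loop are invented for illustration, not taken from any particular product): a normally-open heat detector closes its contact at a set temperature, which latches the actuation stage and releases the agent through the piping and nozzles.

class HeatDetector:
    # Normally-open contact that closes when a predetermined temperature is reached
    def __init__(self, trip_temp_c=68.0):  # assumed trip point
        self.trip_temp_c = trip_temp_c
    def contact_closed(self, ambient_c):
        return ambient_c >= self.trip_temp_c

class SuppressionSystem:
    def __init__(self, detector):
        self.detector = detector
        self.released = False  # actuation latches once triggered
    def poll(self, ambient_c, manual_release=False):
        if manual_release or self.detector.contact_closed(ambient_c):
            self.released = True  # actuation: release valve opens
        return self.released      # delivery then proceeds via piping and nozzles

system = SuppressionSystem(HeatDetector())
for temp_c in (25, 40, 70, 30):
    print(temp_c, system.poll(temp_c))  # stays released after the 70 degC reading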


Extinguishing agents
In the early days, water was the exclusive fire suppression agent. Although still used today, water has limitations. Most notably, its liquid and conductive properties can cause as much property damage as fire itself.

Agent: HFC 227ea (e.g. FM-200)
Primary ingredient: Heptafluoropropane
Applications: Electronics, medical equipment, production equipment, libraries, data centers, medical record rooms, server rooms, oil pumping stations, engine compartments, telecommunications rooms, switch rooms, engine and machinery spaces, pump rooms, control rooms

Agent: FK-5-1-12 (3M Novec 1230 Fire Protection Fluid)
Primary ingredient: Fluorinated ketone
Applications: Electronics, medical equipment, production equipment, libraries, data centers, medical record rooms, server rooms, oil pumping stations, engine compartments, telecommunications rooms, switch rooms, engine and machinery spaces, pump rooms, control rooms

Health and environmental concerns
Despite their effectiveness, chemical fire extinguishing agents are not without disadvantages. In the early 20th century, carbon tetrachloride was extensively used as a dry cleaning solvent, a refrigerant and as a fire extinguishing agent. In time, it was found carbon tetrachloride could lead to severe health effects.
From the mid-1960s, Halon 1301 was the industry standard for protecting high-value assets from the threat of fire. Halon 1301 had many benefits as a fire suppression agent: it was fast-acting, safe for assets, and required minimal storage space. Its major drawbacks are that it depletes atmospheric ozone and is potentially harmful to humans.
Since 1987, some 191 nations have signed The Montreal Protocol on Substances That Deplete the Ozone Layer. The Protocol is an international treaty designed to protect the ozone layer by phasing out the production of a number of substances believed to be responsible for ozone depletion. Among these were halogenated hydrocarbons often used in fire suppression. As a result, manufacturers have focused on alternatives to Halon 1301 and Halon 1211 (halogenated hydrocarbons).
A number of countries have also taken steps to mandate the removal of installed Halon systems. Most notably these include Germany and Australia, the first two countries in the world to require this action. In both of these countries the removal of installed Halon systems is complete except for a very few essential-use applications. The European Union is currently undergoing a similar mandated removal of installed Halon systems.


Modern systems
Since the early 1990s manufacturers have successfully developed safe and effective Halon alternatives. These include DuPont FM-200, American Pacific’s Halotron and 3M Novec 1230 Fire Protection Fluid. Generally, the Halon replacement agents available today fall into two broad categories, in-kind (gaseous extinguishing agents) or not in-kind (alternative technologies). In-kind gaseous agents generally fall into two further categories, halocarbons and inert gases. Not in-kind alternatives include such options as water mist or the use of early warning smoke detection systems.
 


Tuesday, February 16, 2016

El Niño

El Niño /ɛl ˈniːnjoʊ/ (Spanish pronunciation: [el ˈniɲo]) is the warm phase of the El Niño Southern Oscillation (commonly called ENSO) and is associated with a band of warm ocean water that develops in the central and east-central equatorial Pacific (between approximately the International Date Line and 120°W), including off the Pacific coast of South America. El Niño Southern Oscillation refers to the cycle of warm and cold temperatures, as measured by sea surface temperature (SST), of the tropical central and eastern Pacific Ocean. El Niño is accompanied by high air pressure in the western Pacific and low air pressure in the eastern Pacific. The cool phase of ENSO is called "La Niña", with SST in the eastern Pacific below average and air pressures high in the eastern and low in the western Pacific. The ENSO cycle, both El Niño and La Niña, causes global changes in both temperature and rainfall. Mechanisms that cause the oscillation remain under study.


Definition
El Niño is defined by prolonged warming of Pacific Ocean sea surface temperatures compared with the average value.
The U.S. NOAA definition is a 3-month average warming of at least 0.5 °C (0.9 °F) in a specific area of the east-central tropical Pacific Ocean; other organizations define the term slightly differently. Typically, this anomaly happens at irregular intervals of two to seven years, and lasts nine months to two years. The average period length is five years. When this warming occurs for seven to nine months, it is classified as El Niño "conditions"; when its duration is longer, it is classified as an El Niño "episode".
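A minimal sketch of how the NOAA-style criterion could be checked against a series of monthly sea surface temperature anomalies (the function names and sample anomalies are invented for illustration):

def three_month_means(monthly_anomalies_c):
    # Overlapping 3-month averages of SST anomalies (deg C)
    return [sum(monthly_anomalies_c[i:i + 3]) / 3
            for i in range(len(monthly_anomalies_c) - 2)]

def warm_windows(monthly_anomalies_c, threshold_c=0.5):
    # Flag each 3-month window whose average anomaly meets the +0.5 deg C criterion
    # (rounded to avoid floating-point edge cases exactly at the boundary)
    return [round(m, 2) >= threshold_c for m in three_month_means(monthly_anomalies_c)]

anoms = [0.2, 0.3, 0.5, 0.7, 0.9, 1.1, 1.2, 1.0, 0.8, 0.6, 0.4, 0.3]  # invented example, deg C
print(warm_windows(anoms))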
The first signs of an El Niño are a weakening of the Walker circulation or trade winds and strengthening of the Hadley circulation and may include:

  1. Rise in surface pressure over the Indian Ocean, Indonesia, and Australia
  2. Fall in air pressure over Tahiti and the rest of the central and eastern Pacific Ocean
  3. Trade winds in the south Pacific weaken or head east
  4. Warm air rises near Peru, causing rain in the northern Peruvian deserts
El Niño's warm rush of nutrient-poor water, heated by its eastward passage in the Equatorial Current, replaces the cold, nutrient-rich surface water of the Humboldt Current.
A recent study applied network theory to the analysis of El Niño events; it presented evidence that the dynamics of a described "climate network" are very sensitive to such events, with many links in the network failing during them.

Effects of ENSO warm phase (El Niño)

Economic impact
When El Niño conditions last for many months, extensive ocean warming and the reduction in easterly trade winds limit the upwelling of cold, nutrient-rich deep water, and its economic impact on local fishing for an international market can be serious.

More generally, El Niño can affect commodity prices and the macroeconomy of different countries. It can constrain the supply of rain-driven agricultural commodities; reduce agricultural output, construction, and services activities; create food-price and generalised inflation; and may trigger social unrest in commodity-dependent poor countries that primarily rely on imported food. A University of Cambridge working paper shows that while Australia, Chile, Indonesia, India, Japan, New Zealand and South Africa face a short-lived fall in economic activity in response to an El Niño shock, other countries may actually benefit from an El Niño weather shock (either directly or indirectly through positive spillovers from major trading partners), for instance Argentina, Canada, Mexico and the United States. Furthermore, most countries experience short-run inflationary pressures following an El Niño shock, while global energy and non-fuel commodity prices increase. The IMF estimates a significant El Niño can boost the GDP of the United States by about 0.5% (due largely to lower heating bills) and reduce the GDP of Indonesia by about 1.0%.

Health and social impacts
Extreme weather conditions related to the El Niño cycle correlate with changes in the incidence of epidemic diseases.
For example, the El Niño cycle is associated with increased risks of some of the diseases transmitted by mosquitoes, such as malaria, dengue, and Rift Valley fever. Cycles of malaria in India, Venezuela, Brazil, and Colombia have now been linked to El Niño. Outbreaks of another mosquito-transmitted disease, Australian encephalitis (Murray Valley encephalitis—MVE), occur in temperate south-east Australia after heavy rainfall and flooding, which are associated with La Niña events. A severe outbreak of Rift Valley fever occurred after extreme rainfall in north-eastern Kenya and southern Somalia during the 1997–98 El Niño.
ENSO conditions have also been related to Kawasaki disease incidence in Japan and the west coast of the United States, via the linkage to tropospheric winds across the north Pacific Ocean.
ENSO may be linked to civil conflicts. Scientists at The Earth Institute of Columbia University, having analyzed data from 1950 to 2004, suggest ENSO may have had a role in 21% of all civil conflicts since 1950, with the risk of annual civil conflict doubling from 3% to 6% in countries affected by ENSO during El Niño years relative to La Niña years.

Recent occurrences
Since 2000, El Niño events have been observed in 2002–03, 2004–05, 2006–07, 2009–10 and 2015–16.
In December 2014, the Japan Meteorological Agency declared the onset of El Niño conditions, as warmer than normal sea surface temperatures were measured over the Pacific, albeit citing the lack of atmospheric conditions related to the event. In March and May 2015, NOAA's Climate Prediction Center (CPC) and the Australian Bureau of Meteorology, respectively, confirmed the arrival of weak El Niño conditions. In July, El Niño conditions were forecast to intensify into strong conditions by the fall and winter of 2015, and the NOAA CPC expected a greater than 90% chance that El Niño would continue through the 2015-2016 winter and a more than 80% chance that it would last into the 2016 spring. In addition to the warmer than normal waters generated by the El Niño conditions, the Pacific Decadal Oscillation was also creating persistently higher than normal sea surface temperatures in the northeastern Pacific. In August, the NOAA CPC predicted that the 2015 El Niño "could be among the strongest in the historical record dating back to 1950." In mid-November, NOAA reported that the temperature anomaly in the Niño 3.4 region for the 3-month average from August to October 2015 was the second warmest on record, with only 1997 warmer.

Relation to climate change
During the last several decades the number of El Niño events increased, although a much longer period of observation is needed to detect robust changes.
The question is, or was, whether this is a random fluctuation, a normal instance of variation for the phenomenon, or a result of climate change due to global warming. A 2014 study reported a robust tendency towards more frequent extreme El Niños, in agreement with a separate recent model prediction for the future.
Several studies of historical data suggest the recent El Niño variation is linked to anthropogenic climate change, in accordance with the larger consensus on climate change. For example, even after subtracting the positive influence of decade-to-decade variation (which is shown to be present in the ENSO trend), the amplitude of the ENSO variability in the observed data still increases, by as much as 60% in the last 50 years.
It may be that the observed phenomenon of more frequent and stronger El Niño events occurs only in the initial phase of the climate change, and then (e.g., after the lower layers of the ocean get warmer, as well), El Niño will become weaker than it was. It may also be that the stabilizing and destabilizing forces influencing the phenomenon will eventually compensate for each other. More research is needed to provide a better answer to that question. However, a new 2014 model appearing in a research report indicated unmitigated climate change would particularly affect the surface waters of the eastern equatorial Pacific and possibly double extreme El Niño occurrences.
 

Thursday, January 7, 2016

Gamma ray

Gamma radiation, also known as gamma rays and denoted by the Greek letter γ, refers to electromagnetic radiation of an extremely high frequency and therefore consists of high-energy photons. Gamma rays are ionizing radiation and are thus biologically hazardous. They are classically produced by the decay of atomic nuclei as they transition from a high-energy state to a lower one, a process known as gamma decay, but may also be produced by other processes. Paul Villard, a French chemist and physicist, discovered gamma radiation in 1900 while studying radiation emitted from radium. Villard's radiation was named "gamma rays" by Ernest Rutherford in 1903.

Natural sources of gamma rays on Earth include gamma decay from naturally occurring radioisotopes, and secondary radiation from atmospheric interactions with cosmic ray particles. Rare terrestrial natural sources produce gamma rays that are not of a nuclear origin, such as lightning strikes and terrestrial gamma-ray flashes. Additionally, gamma rays are produced by a number of astronomical processes in which very high-energy electrons are produced, that in turn cause secondary gamma rays via bremsstrahlung, inverse Compton scattering, and synchrotron radiation. However, a large fraction of such astronomical gamma rays are screened by Earth's atmosphere and can only be detected by spacecraft. Gamma rays are produced by nuclear fusion in the core of stars including the Sun (such as the CNO cycle), but are absorbed or inelastically scattered by the stellar material before escaping and are not observable from Earth.
Gamma rays typically have frequencies above 10 exahertz (>10^19 Hz), and therefore have energies above 100 keV and wavelengths less than 10 picometers (10^−11 meter), which is less than the diameter of an atom. However, this is not a strict definition, but rather only a rule-of-thumb description for natural processes. Electromagnetic radiation from radioactive decay of atomic nuclei is referred to as "gamma rays" no matter its energy, so that there is no lower limit to gamma energy derived from radioactive decay. This radiation commonly has energy of a few hundred keV, and almost always less than 10 MeV. In astronomy, gamma rays are defined by their energy, and no production process needs to be specified. The energies of gamma rays from astronomical sources range to over 10 TeV, an energy far too large to result from radioactive decay. A notable example is extremely powerful bursts of high-energy radiation referred to as long duration gamma-ray bursts, of energies higher than can be produced by radioactive decay. These bursts of gamma rays, thought to be due to the collapse of stars called hypernovae, are the most powerful events so far discovered in the cosmos.
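The quoted frequency, energy and wavelength figures are tied together by E = hν and λ = c/ν; a quick check at the 100 keV rule-of-thumb boundary (standard constants only, nothing beyond the text's numbers):

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

energy_j = 100e3 * EV       # the 100 keV boundary quoted above
freq_hz = energy_j / H      # ~2.4e19 Hz, i.e. a few times 10 exahertz
wavelength_m = C / freq_hz  # ~1.2e-11 m, on the order of 10 picometers
print(f"{freq_hz:.2e} Hz, {wavelength_m * 1e12:.1f} pm")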

General characteristics
The distinction between X-rays and gamma rays has changed in recent decades. Originally, the electromagnetic radiation emitted by X-ray tubes almost invariably had a longer wavelength than the radiation (gamma rays) emitted by radioactive nuclei. Older literature distinguished between X- and gamma radiation on the basis of wavelength, with radiation shorter than some arbitrary wavelength, such as 10^−11 m, defined as gamma rays. However, with artificial sources now able to duplicate any electromagnetic radiation that originates in the nucleus, as well as far higher energies, the wavelengths characteristic of radioactive gamma ray sources vs. other types now completely overlap. Thus, gamma rays are now usually distinguished by their origin: X-rays are emitted by definition by electrons outside the nucleus, while gamma rays are emitted by the nucleus. Exceptions to this convention occur in astronomy, where gamma decay is seen in the afterglow of certain supernovas, but other high energy processes known to involve other than radioactive decay are still classed as sources of gamma radiation.

Health effects
Gamma rays cause damage at a cellular level and are penetrating, causing diffuse damage throughout the body. However, they are less ionising than alpha or beta particles, which are less penetrating.
Low levels of gamma rays cause a stochastic health risk, which for radiation dose assessment is defined as the probability of cancer induction and genetic damage. High doses produce deterministic effects, that is, acute tissue damage that is certain to occur, with a severity that depends on the dose. These effects are compared against the physical quantity absorbed dose, measured by the unit gray (Gy).


Uses
Gamma rays provide information about some of the most energetic phenomena in the universe; however, they are largely absorbed by the Earth's atmosphere. Instruments aboard high-altitude balloons and satellite missions, such as the Fermi Gamma-ray Space Telescope, provide our only view of the universe in gamma rays.
Gamma-induced molecular changes can also be used to alter the properties of semi-precious stones; gamma irradiation is often used to change white topaz into blue topaz.
Non-contact industrial sensors commonly use sources of gamma radiation in the refining, mining, chemical, food, soaps and detergents, and pulp and paper industries, for the measurement of levels, density, and thicknesses. Typically, these use Co-60 or Cs-137 isotopes as the radiation source.
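These gauges rely on the exponential attenuation of gamma rays in matter, I = I0·exp(−μx); a minimal sketch of recovering a thickness from a measured intensity ratio (the attenuation coefficient and readings below are assumed example values, not from the text):

import math

def thickness_from_attenuation(i_measured, i_source, mu_per_cm):
    # Invert I = I0 * exp(-mu * x) to get the material thickness x in centimetres
    return math.log(i_source / i_measured) / mu_per_cm

# Assumed values: mu = 0.4 per cm, detector sees 30% of the unattenuated intensity
print(f"{thickness_from_attenuation(0.3, 1.0, 0.4):.1f} cm of material in the beam")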
In the US, gamma ray detectors are beginning to be used as part of the Container Security Initiative (CSI). These machines are advertised to be able to scan 30 containers per hour.
Gamma radiation is often used to kill living organisms, in a process called irradiation. Applications of this include the sterilization of medical equipment (as an alternative to autoclaves or chemical means), the removal of decay-causing bacteria from many foods and the prevention of the sprouting of fruit and vegetables to maintain freshness and flavor. Despite their cancer-causing properties, gamma rays are also used to treat some types of cancer, since the rays also kill cancer cells. In the procedure called gamma-knife surgery, multiple concentrated beams of gamma rays are directed to the growth in order to kill the cancerous cells. The beams are aimed from different angles to concentrate the radiation on the growth while minimizing damage to surrounding tissues.
Gamma rays are also used for diagnostic purposes in nuclear medicine in imaging techniques. A number of different gamma-emitting radioisotopes are used. For example, in a PET scan a radiolabeled sugar called fludeoxyglucose emits positrons that are annihilated by electrons, producing pairs of gamma rays that highlight cancer as the cancer often has a higher metabolic rate than the surrounding tissues. The most common gamma emitter used in medical applications is the nuclear isomer technetium-99m which emits gamma rays in the same energy range as diagnostic X-rays. When this radionuclide tracer is administered to a patient, a gamma camera can be used to form an image of the radioisotope's distribution by detecting the gamma radiation emitted (see also SPECT). Depending on which molecule has been labeled with the tracer, such techniques can be employed to diagnose a wide range of conditions (for example, the spread of cancer to the bones via bone scan).
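Each photon in the annihilation pair produced in a PET scan carries the electron rest-mass energy, E = m_e·c^2; a quick check using standard constants (not figures from the text):

M_E = 9.109e-31  # electron mass, kg
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

energy_kev = M_E * C**2 / EV / 1e3
print(f"~{energy_kev:.0f} keV per annihilation photon")  # ~511 keV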

Body response
When gamma radiation breaks DNA molecules, a cell may be able to repair the damaged genetic material, within limits. However, a study by Rothkamm and Lobrich has shown that this repair process works well after high-dose exposure but is much slower in the case of a low-dose exposure.

Risk assessment
The natural outdoor exposure in Great Britain ranges from 0.1 to 0.5 µSv/h with significant increase around known nuclear and contaminated sites. Natural exposure to gamma rays is about 1 to 2 mSv per year, and the average total amount of radiation received in one year per inhabitant in the USA is 3.6 mSv. There is a small increase in the dose, due to naturally occurring gamma radiation, around small particles of high atomic number materials in the human body caused by the photoelectric effect.
By comparison, the radiation dose from chest radiography (about 0.06 mSv) is a fraction of the annual naturally occurring background radiation dose. A chest CT delivers 5 to 8 mSv. A whole-body PET/CT scan can deliver 14 to 32 mSv depending on the protocol. The dose from fluoroscopy of the stomach is much higher, approximately 50 mSv (14 times the annual background).
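As a quick arithmetic cross-check of these figures (nothing beyond the numbers quoted above), the hourly outdoor rate integrates to the same order as the stated 1 to 2 mSv annual figure, and a chest X-ray is a small fraction of it:

HOURS_PER_YEAR = 24 * 365

low_msv = 0.1e-3 * HOURS_PER_YEAR   # 0.1 microsievert/hour over a year, in mSv
high_msv = 0.5e-3 * HOURS_PER_YEAR  # 0.5 microsievert/hour over a year, in mSv
chest_xray_msv = 0.06               # from the comparison above

print(f"{low_msv:.1f} to {high_msv:.1f} mSv/year outdoors")           # ~0.9 to ~4.4 mSv
print(f"chest X-ray = {chest_xray_msv / low_msv:.0%} of the low end")  # ~7%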
An acute full-body equivalent single exposure dose of 1 Sv (1000 mSv) causes slight blood changes, but 2.0–3.5 Sv (2.0–3.5 Gy) causes a very severe syndrome of nausea, hair loss, and hemorrhaging, and will cause death in a sizable number of cases, about 10% to 35% without medical treatment. A dose of 5 Sv (5 Gy) is considered approximately the LD50 (lethal dose for 50% of the exposed population) for an acute exposure to radiation, even with standard medical treatment. A dose higher than 5 Sv (5 Gy) brings an increasing chance of death above 50%. Above 7.5–10 Sv (7.5–10 Gy) to the entire body, even extraordinary treatment, such as bone-marrow transplants, will not prevent the death of the individual exposed (see Radiation poisoning). (Doses much larger than this may, however, be delivered to selected parts of the body in the course of radiation therapy.)
For low-dose exposure, for example among nuclear workers, who receive an average yearly radiation dose of 19 mSv, the risk of dying from cancer (excluding leukemia) increases by 2 percent. For a dose of 100 mSv, the risk increase is 10 percent. By comparison, the risk of dying from cancer was increased by 32 percent for the survivors of the atomic bombing of Hiroshima and Nagasaki.
 
 
 


 