{{Other uses}}
 
'''Risk''' is the potential of losing something of value, weighed against the potential to gain something of value. Values (such as [[physical health]], [[social status]], emotional well being or financial wealth) can be gained or lost when taking risk resulting from a given action, activity and/or inaction, foreseen or unforeseen. Risk can also be defined as the intentional interaction with [[uncertainty]]. [[Risk perception]] is the subjective judgment people make about the severity of a risk, and may vary person to person. Any human endeavor carries some risk, but some are much riskier than others.<ref>Hansson, Sven Ove, "Risk", The Stanford Encyclopedia of Philosophy (Winter 2012 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/win2012/entries/risk/>.</ref>
 
== Definitions ==
[[File:3 Alarm Building Fire.jpg|thumb|Firefighters at work]]
 
'''Risk''' can be defined in a variety of ways.
 
=== Basic definitions ===
 
# The probability of something happening multiplied by the resulting cost or benefit if it does. (This concept is more properly known as the 'expected value' and is used to compare levels of risk.)
# The probability or threat of quantifiable damage, injury, liability, loss, or any other negative occurrence that is caused by external or internal vulnerabilities, and that may be avoided through preemptive action.
# [[Finance]]: The probability that an actual return on an investment will be lower than the expected return. [[Financial risk]] can be divided into the following categories: Basic risk, [[Capital risk]], [[Country risk]], [[Default risk]], [[Settlement risk|Delivery risk]], Economic risk, [[Exchange rate risk]], [[Interest rate risk]], [[Liquidity risk]], [[Operational risk|Operations risk]], Payment system risk, [[Political risk]], [[Refinancing risk]], [[Reinvestment risk]], [[Settlement risk]], Sovereign risk, and [[Underwriting#Risk, exclusivity, and reward|Underwriting risk]].
# [[Food industry]]: The possibility that, due to a certain hazard in food, there will be a negative effect of a certain magnitude.
# [[Insurance]]: A situation where the probability of a variable (such as burning down of a building) is known but when a mode of occurrence or the actual value of the occurrence (whether the fire will occur at a particular property) is not. A risk is not an uncertainty (where neither the probability nor the mode of occurrence is known), a peril (cause of loss), or a hazard (something that makes the occurrence of a peril more likely or more severe).
# Securities trading: The probability of a loss or drop in value. Trading risk is divided into two general categories: (1) [[Systematic risk]] affects all securities in the same class and is linked to the overall capital-market system and therefore cannot be eliminated by diversification. Also called [[market risk]]. (2) Nonsystematic risk is any risk that isn't market-related or is not systemic. Also called nonmarket risk, extra-market risk, diversifiable risk, or unsystemic risk.
# [[Workplace]]: Product of the consequence and probability of a hazardous event or phenomenon. For example, the risk of developing cancer is estimated as the incremental probability of developing cancer over a lifetime as a result of exposure to potential carcinogens (cancer-causing substances).
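
The first definition above, risk as an expected value, can be sketched in a few lines of code. This is an illustrative sketch only; the probabilities and dollar amounts are invented, not drawn from any source.

```python
# Risk as expected value: probability of an event times its cost if it occurs
# (definition 1 above). All figures are hypothetical.

def expected_value(probability, impact):
    """Probability of an event multiplied by its cost (or benefit) if it occurs."""
    return probability * impact

# Two hypothetical hazards: a frequent small loss vs. a rare large loss.
frequent_small = expected_value(0.10, 1_000)   # 10% chance of a $1,000 loss
rare_large = expected_value(0.001, 50_000)     # 0.1% chance of a $50,000 loss

print(frequent_small)  # 100.0
print(rare_large)      # 50.0
```

On this measure the frequent small loss is the larger risk, which is exactly the kind of comparison the expected-value definition is used for.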
 
=== International Organization for Standardization ===
 
The [[ISO 31000]] (2009) / ISO Guide 73:2002 definition of risk is the 'effect of uncertainty on objectives'. In this definition, uncertainties include events (which may or may not happen) and uncertainties caused by ambiguity or a lack of information. It also includes both negative and positive impacts on objectives. Many definitions of risk exist in common usage; however, this definition was developed by an international committee representing over 30 countries and is based on the input of several thousand subject matter experts.
 
=== Other ===
 
The many inconsistent and ambiguous meanings attached to "risk" lead to widespread confusion and also mean that very different approaches to risk management are taken in different fields.<ref>Douglas Hubbard ''The Failure of Risk Management: Why It's Broken and How to Fix It'', John Wiley & Sons, 2009.</ref> For example:<ref>E.g. "Risk is the unwanted subset of a set of uncertain outcomes." (Cornelius Keating).</ref>
 
:Risk can be seen as relating to the [[probability]] of uncertain future events.<ref name=FAIRW/> For example, according to [[Factor Analysis of Information Risk]], risk is:<ref name=FAIRW>[http://www.riskmanagementinsight.com/media/docs/FAIR_introduction.pdf "An Introduction to Factor Analysis of Information Risk (FAIR)", Risk Management Insight LLC, November 2006].</ref> the probable frequency and probable magnitude of future loss. In computer science this definition is used by [[The Open Group]].<ref name=OGC081>Technical Standard Risk Taxonomy, ISBN 1-931624-77-1, Document Number: C081, published by The Open Group, January 2009.</ref>
 
:OHSAS (Occupational Health & Safety Advisory Services) defines risk as the product of the probability of a hazard resulting in an adverse event, times the severity of the event.<ref>"Risk is a combination of the likelihood of an occurrence of a hazardous event or exposure(s) and the severity of injury or ill health that can be caused by the event or exposure(s)" (OHSAS 18001:2007).</ref>
 
:In [[information security]] risk is defined as "the potential that a given [[threat (computer)|threat]] will exploit [[vulnerability (computing)|vulnerabilities]] of an [[asset]] or group of assets and thereby cause harm to the organization".<ref>ISO/IEC 27005:2008.</ref>
 
:[[Financial risk]] is often defined as the unexpected variability or [[Volatility (finance)|volatility]] of returns, and this would include both potential better than expected as well as worse than expected returns. References to negative risk below should be read as applying to positive impacts or opportunity (e.g. for "loss" read "loss or gain") unless the context precludes this interpretation.
 
:The related terms "[[threat]]" and "[[hazard]]" are often used to mean something that could cause harm.
 
== History ==
 
The [[Oxford English Dictionary]] cites the earliest use of the word in English (in the spelling of ''risque'') as from 1621, and the spelling as ''risk'' from 1655. It defines ''risk'' as:
 
<blockquote>(Exposure to) the possibility of loss, injury, or other adverse or unwelcome circumstance; a chance or situation involving such a possibility.<ref>[[Oxford English Dictionary]]</ref></blockquote>
 
For the sociologist [[Niklas Luhmann]] the term 'risk' is a neologism that appeared with the transition from traditional to modern society.<ref name="Luhm96-3">Luhmann 1996:3.</ref> "In the [[Middle Ages]] the term ''risicum'' was used in highly specific contexts, above all sea trade and its ensuing legal problems of loss and damage."<ref name = "Luhm96-3"/><ref name = "Fran01-274">James Franklin, 2001: ''The Science of Conjecture: Evidence and Probability Before Pascal'', Baltimore: Johns Hopkins University Press, 274.</ref> In the [[Vernacular|vernacular languages]] of the 16th century the words ''rischio'' and ''riezgo'' were used.<ref name = "Luhm96-3"/> These terms were introduced to continental Europe through interaction with Middle Eastern and North African Arab traders. In the [[English language]] the term ''risk'' appeared only in the 17th century, and "seems to be imported from continental Europe."<ref name = "Luhm96-3"/> When the terminology of ''risk'' took hold, it replaced an older way of thinking "in terms of good and bad [[Luck|fortune]]."<ref name = "Luhm96-3"/> Niklas Luhmann (1996) seeks to explain this transition: "Perhaps, this was simply a loss of plausibility of the old rhetorics of ''[[Fortuna]]'' as an allegorical figure of religious content and of ''prudentia'' as a (noble) virtue in the emerging commercial society."<ref name = "Luhm96-4">Luhmann 1996:4.</ref>
In other words, risk involves taking a chance on an action that may turn out better or worse than expected.
 
[[Scenario analysis]] matured during [[Cold War]] confrontations between [[major power]]s, notably the [[United States]] and the [[Soviet Union]]. It became widespread in insurance circles in the 1970s when major [[oil spill|oil tanker disasters]] forced a more comprehensive foresight.{{Citation needed|date=February 2007}} The scientific approach to risk entered finance in the 1960s with the advent of the [[capital asset pricing model]] and became increasingly important in the 1980s when [[derivative (finance)|financial derivatives]] proliferated. It reached general professions in the 1990s when the power of personal computing allowed for widespread data collection and numbers crunching.
 
Governments use risk assessment, for example, to set standards for [[environmental regulation]], e.g. the "[[pathway analysis]]" practiced by the [[United States Environmental Protection Agency]].
 
==Practice areas==
Risk is ubiquitous in all areas of life and risk management is something that we all must do, whether we are managing a major organization or simply crossing the road. When describing risk, however, it is convenient to consider that risk practitioners operate in some specific practice areas.
 
===Economic risk===
Economic risks can be manifested in lower incomes or higher expenditures than expected. The causes can be many, for instance, the hike in the price for [[raw material]]s, the lapsing of deadlines for construction of a new operating facility, disruptions in a production process, emergence of a serious competitor on the market, the loss of key personnel, the change of a political regime, or natural disasters.<ref>[http://ssrn.com/abstract=1012812].</ref>
 
Additionally, it is worth noting that, from a societal standpoint, losses loom larger than gains: according to recent research, governmental bodies will do whatever it takes to avoid losing ground or being relegated to an inferior position.<ref>{{Cite book|author=Nichols R |chapter=Risk |title=Working Hypotheses |edition=40 |location=New York |year=2000 |page=4}}</ref>
 
===Health===
Risks in personal health may be reduced by [[primary prevention]] actions that decrease early causes of illness or by [[secondary prevention]] actions after a person has clearly measured clinical signs or symptoms recognized as risk factors. Tertiary [[prevention (medical)|prevention]] reduces the negative impact of an already established disease by restoring function and reducing disease-related complications. Ethical medical practice requires careful discussion of [[risk factors]] with individual patients to obtain [[informed consent]] for secondary and tertiary prevention efforts, whereas public health efforts in primary prevention require education of the entire population at risk. In each case, careful communication about risk factors, likely outcomes and [[certainty]] must distinguish between causal events that must be decreased and associated events that may be merely consequences rather than causes.
 
In epidemiology, the '''lifetime risk''' of an effect is the ''[[cumulative incidence]]'', also called ''incidence proportion'' over an entire lifetime.<ref>{{cite journal |journal=J Epidemiol Community Health |author=Rychetnik L, Hawe P, Waters E, Barratt A, Frommer M |date=2004 July |volume=58 |pages=538–45 |doi=10.1136/jech.2003.011585 |pmid=15194712 |pmc=1732833 |title=A glossary for evidence based public health |issue=7 }}</ref>
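
As a minimal sketch of cumulative incidence, the proportion of an initially disease-free cohort that develops the condition over the follow-up period can be computed directly; the cohort figures below are invented for illustration.

```python
# Cumulative incidence (incidence proportion) as a simple ratio.
# Cohort numbers are hypothetical.

def cumulative_incidence(new_cases, population_at_risk):
    """Proportion of an initially disease-free population that develops the
    condition over the follow-up period (here, an entire lifetime)."""
    return new_cases / population_at_risk

# Hypothetical cohort: 125 diagnoses among 1,000 people followed for life.
lifetime_risk = cumulative_incidence(125, 1000)
print(lifetime_risk)  # 0.125
```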
 
===Health, safety, and environment===
Health, safety, and environment (HSE) are separate practice areas; however, they are often linked. The reason for this is typically to do with organizational management structures; however, there are strong links among these disciplines. One of the strongest links between these is that a single risk event may have impacts in all three areas, albeit over differing timescales. For example, the uncontrolled release of radiation or a toxic chemical may have immediate short-term safety consequences, more protracted health impacts, and much longer-term environmental impacts. Events such as Chernobyl, for example, caused immediate deaths, and in the longer term, deaths from cancers, and left a lasting environmental impact leading to birth defects, impacts on wildlife, etc. Over time, a form of risk analysis called environmental risk analysis has developed. Environmental risk analysis is a field of study that attempts to understand events and activities that bring risk to human health or the environment.<ref name="Environmental Risk Analysis: Problems and Perspectives in Different Countries">{{cite journal|last=Gurjar|first=Bhola Ram|coauthors=Mohan, Manju|title=Environmental Risk Analysis: Problems and Perspectives in Different Countries|journal=Risk: Health, Safety & Environment|year=2002|volume=13|page=3|url=http://heinonline.org.myaccess.library.utoronto.ca/HOL/Page?handle=hein.journals/risk13&id=5&collection=journals&index=journals/risk|accessdate=23 March 2013}}</ref>
 
===Information technology and information security===
{{Main|IT risk}}
'''Information technology risk''', '''IT risk''', or '''IT-related risk''' is any risk related to [[information technology]]. The term is relatively new and reflects an increasing awareness that information security is simply one facet of a multitude of risks that are relevant to IT and the real-world processes it supports.
 
The increasing dependence of modern society on information and computer networks (in both the private and public sectors, including the military)<ref>
{{cite book |last1= Cortada|first1= James W.|authorlink1= |last2= |first2= |authorlink2= |editor1-first= |editor1-last= |editor1-link= |others= |title= The Digital Hand: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries |trans_title= |url= |archiveurl= |archivedate= |format= |accessdate= |type= |edition= |series= |volume= |date=2003-12-04 |year= |month= |origyear= |publisher= Oxford University Press|location=USA|isbn= 0-19-516588-8 |oclc= |doi= |id= |pages= 512|at= |trans_chapter= |chapter= |chapterurl= |quote= |ref= |bibcode= |laysummary= |laydate= |separator= |postscript= |lastauthoramp=}}</ref><ref>{{cite book |last1= Cortada|first1= James W.|authorlink1= |last2= |first2= |authorlink2= |editor1-first= |editor1-last= |editor1-link= |others= |title= The Digital Hand: Volume II: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries |trans_title= |url= |archiveurl= |archivedate= |format= |accessdate= |type= |edition= |series= |volume= |date=2005-11-03 |year= |month= |origyear= |publisher= Oxford University Press|location=USA|isbn= 978-0-19-516587-6 |oclc= |doi= |id= |page= |pages= |at= |trans_chapter= |chapter= |chapterurl= |quote= |ref= |bibcode= |laysummary= |laydate= |separator= |postscript= |lastauthoramp=}}</ref><ref>{{cite book |last1= Cortada|first1= James W.|authorlink1= |last2= |first2= |authorlink2= |editor1-first= |editor1-last= |editor1-link= |others= |title= The Digital Hand, Vol 3: How Computers Changed the Work of American Public Sector Industries  |trans_title= |url= |archiveurl= |archivedate= |format= |accessdate= |type= |edition= |series= |volume= |date=2007-11-06 |year= |month= |origyear= |publisher= Oxford University Press|location=USA|isbn= 978-0-19-516586-9 |oclc= |doi= |id= |pages= 496|at= |trans_chapter= |chapter= |chapterurl= |quote= |ref= |bibcode= |laysummary= |laydate= |separator= |postscript= |lastauthoramp=}}</ref> has led to a new 
set of terms, such as [[IT risk]] and [[Cyberwarfare]].
 
{{Main|Information assurance|Information security}}
'''Information security''' means protecting information and [[information system]]s from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction.<ref>{{usc|44|3542}}(b)(1).</ref> Information security grew out of practices and procedures of [[computer security]].<br />
Information security has grown into '''information assurance (IA)''': the practice of managing risks related to the use, processing, storage, and transmission of information or data, and to the systems and processes used for those purposes.<br />
While focused dominantly on information in digital form, the full range of IA encompasses not only digital but also analog or physical form. <br />
Information assurance is interdisciplinary and draws from multiple fields, including [[accounting]], [[fraud]] examination, [[forensic science]], [[management science]], [[systems engineering]], [[security engineering]], and [[criminology]], in addition to computer science.
 
So, ''[[#Information technology risk|IT risk]]'' is narrowly focused on computer security, while ''information security'' extends to risks related to other forms of information (paper, microfilm). ''Information assurance'' risks include those related to the consistency of the business information stored in IT systems with that stored by other means, and the relevant business consequences.
 
===Insurance===
[[Insurance]] is a risk treatment option which involves risk sharing. It can be considered as a form of contingent capital and is akin to purchasing an [[Option (finance)]] in which the buyer pays a small premium to be protected from a potential large loss.
 
Insurance risk is often taken on by insurance companies, who then bear a pool of risks including market risk, credit risk, operational risk, interest rate risk, mortality risk, longevity risk, etc.<ref>James M. Carson, Elyas Elyasiani and Iqbal Mansur (December 2008), "Market Risk, Interest Rate Risk, and Interdependencies in Insurer Stock Returns: A System-GARCH Model", ''The Journal of Risk and Insurance'', {{ISSN|0022-4367}}, Volume 75, Issue 4, pp. 873–891, doi: 10.1111/j.1539-6975.2008.00289.x</ref>
 
===Business and management===
Means of assessing risk vary widely between professions. Indeed, they may define these professions; for example, a doctor manages medical risk, while a civil engineer manages risk of structural failure. A [[professional]] [[code of ethics]] is usually focused on risk assessment and mitigation (by the professional on behalf of client, public, society or life in general).
 
In the workplace, incidental and inherent risks exist. Incidental risks are those that occur naturally in the business but are not part of the core of the business. Inherent risks have a negative effect on the operating profit of the business.
 
===In human services===
The experience of many people who rely on human services for support is that 'risk' is often used as a reason to prevent them from gaining further independence or fully accessing the community, and that these services are often unnecessarily risk averse.<ref>A Positive Approach To Risk Requires Person Centred Thinking, Neill et al, Tizard Learning Disability Review http://pierprofessional.metapress.com/content/vr700311x66j0125/</ref> "People's autonomy used to be compromised by institution walls, now it's too often our risk management practices" [[John O'Brien (human services thinker)|John O'Brien]].<ref>John O'Brien cited in Sanderson, H. Lewis, J. A Practical Guide to Delivering Personalisation; Person Centred Practice in Health and Social Care p211</ref> [http://oxford.academia.edu/MichaelFischer Fischer] and [http://kcl.academia.edu/EwanFerlie Ferlie] (2013) find that contradictions between formal risk controls and the role of subjective factors in human services (such as the role of emotions and ideology) can undermine service values, so producing tensions and even intractable and 'heated' conflict.<ref>{{cite journal|last=Fischer|first=Michael Daniel|coauthors=Ferlie, Ewan|title=Resisting hybridisation between modes of clinical risk management: Contradiction, contest, and the production of intractable conflict|journal=Accounting, Organizations and Society|date=1 January 2013|volume=38|issue=1|pages=30–49|doi=10.1016/j.aos.2012.11.002}}</ref>
 
===High reliability organizations (HROs)===
A [[High reliability organization]] (HRO) is an organization that has succeeded in avoiding catastrophes in an environment where [[normal accident]]s can be expected due to risk factors and [[Complex system|complexity]].  Most studies of HROs involve areas such as nuclear aircraft carriers, air traffic control, aerospace and nuclear power stations. Organizations such as these share in common the ability to consistently operate safely in complex, interconnected environments where a single failure in one component could lead to catastrophe. Essentially, they are organizations which appear to operate 'in spite' of an enormous range of risks.
 
Some of these industries manage risk in a highly quantified and enumerated way. These include the [[nuclear power]] and [[Aerospace manufacturer|aircraft industries]], where the possible failure of a complex series of engineered systems could result in highly undesirable outcomes. The usual measure of risk for a class of events is then: ''R'' = probability of the event × the severity of the consequence.
 
The total risk is then the sum of the individual class-risks;  see below.{{Citation needed|reason=A previous version of this said, "product of ... class-risks".  This conflicted with, e.g., the formula below|date=August 2013}}
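
The measure above can be sketched in code: risk per event class is probability times severity, and the total is the sum over classes. The event classes and numbers below are invented for illustration.

```python
# R = probability × severity for each event class, summed over classes.
# Probabilities (per year) and severities are hypothetical placeholders.

def class_risk(probability, severity):
    """Risk contribution of one class of events."""
    return probability * severity

event_classes = [
    (1e-3, 10.0),     # relatively frequent, minor consequence
    (1e-5, 500.0),    # rare, major consequence
    (1e-7, 10000.0),  # very rare, severe consequence
]

total_risk = sum(class_risk(p, s) for p, s in event_classes)
print(total_risk)  # approximately 0.016
```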
 
In the nuclear industry, consequence is often measured in terms of off-site radiological release, which is often banded into five or six bands, each a decade (a factor of ten in release magnitude) wide.
 
The risks are evaluated using fault tree/event tree techniques (see [[safety engineering]]). Where these risks are low, they are normally considered to be "Broadly Acceptable". A higher level of risk (typically up to 10 to 100 times what is considered Broadly Acceptable) has to be justified against the costs of reducing it further and the possible benefits that make it tolerable—these risks are described as "Tolerable if [[ALARP]]". Risks beyond this level are classified as "Intolerable".
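
The three-tier classification described above can be sketched as a simple threshold check. The numeric thresholds below are placeholder assumptions; in practice the limits are set by the relevant regulator, and the "Tolerable if ALARP" band must additionally be justified against the cost of further risk reduction.

```python
# Hedged sketch of the Broadly Acceptable / Tolerable if ALARP / Intolerable
# banding. Thresholds are assumed placeholders, not regulatory values.

BROADLY_ACCEPTABLE_LIMIT = 1e-6  # assumed threshold (per year)
INTOLERABLE_LIMIT = 1e-4         # assumed: 100x the broadly acceptable level

def classify(risk_per_year):
    """Classify an annual risk level into the three bands described above."""
    if risk_per_year < BROADLY_ACCEPTABLE_LIMIT:
        return "Broadly Acceptable"
    if risk_per_year < INTOLERABLE_LIMIT:
        return "Tolerable if ALARP"  # must be justified against reduction costs
    return "Intolerable"

print(classify(1e-7))  # Broadly Acceptable
print(classify(1e-5))  # Tolerable if ALARP
print(classify(1e-3))  # Intolerable
```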
 
The level of risk deemed Broadly Acceptable has been considered by regulatory bodies in various countries—an early attempt by UK government regulator and academic [[F. R. Farmer]] used the example of hill-walking and similar activities, which have definable risks that people appear to find acceptable. This resulted in the so-called Farmer Curve of acceptable probability of an event versus its consequence.
 
The technique as a whole is usually referred to as Probabilistic Risk Assessment (PRA) (or Probabilistic Safety Assessment, PSA). See [[WASH-1400]] for an example of this approach.
 
===Finance===
{{Main|Financial risk}}
In [[finance]], risk is the chance that the return achieved on an investment will be different from that expected, and also takes into account the size of the difference. This includes the possibility of losing some or all of the original investment. In a view advocated by Damodaran, risk includes not only "[[downside risk]]" but also "upside risk" (returns that exceed expectations).<ref>{{cite book |title=Investment Philosophies: Successful Investment Philosophies and the Greatest Investors Who Made Them Work |first=Aswath |last=Damodaran |page=15 |publisher=Wiley |year=2003 |isbn=0-471-34503-2}}</ref> Some regard the [[standard deviation]] of the historical returns or average returns of a specific investment as providing some historical measure of risk; see [[modern portfolio theory]]. Financial risk may be market-dependent, determined by numerous market factors, or operational, resulting from fraudulent behavior (e.g. [[Bernard Madoff]]). Recent studies suggest that testosterone level plays a major role in risk-taking in financial decision-making.<ref>Sapienza P., Zingales L. and Maestripieri D. 2009. Gender differences in financial risk aversion and career choices are affected by testosterone. Proceedings of the National Academy of Sciences.</ref><ref>Apicella C. L. and all.  Testosterone and financial risk preferences. Evolution and Human Behavior. Vol 29. Issue 6. 384–390.[http://www.ehbonline.org/article/S1090-5138%2808%2900067-6/abstract abstract].</ref>
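
The standard-deviation measure mentioned above can be sketched with the standard library; the monthly return series below are invented illustrations, not real market data.

```python
# Standard deviation of historical returns as a crude risk measure
# (cf. modern portfolio theory). Return series are hypothetical.
import statistics

steady = [0.01, 0.012, 0.009, 0.011, 0.010]    # low-volatility asset
volatile = [0.08, -0.05, 0.12, -0.07, 0.02]    # high-volatility asset

# The volatile asset is "riskier" on this measure even if its mean is similar.
print(statistics.stdev(steady) < statistics.stdev(volatile))  # True
```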
 
In finance, ''risk'' has no single definition, but some theorists, notably [[Ron Dembo]], have defined quite general methods to assess risk as an expected after-the-fact level of regret. Such methods have been uniquely successful in limiting [[rate risk|interest rate risk]] in [[financial markets]]{{citation needed|date=January 2013}}. Financial markets are considered to be a proving ground for general methods of risk assessment{{citation needed|date=January 2013}}. However, these methods are also hard to understand. The mathematical difficulties interfere with other social goods such as disclosure, [[Valuation (finance)|valuation]] and [[transparency (humanities)|transparency]]{{citation needed|date=January 2013}}. In particular, it is not always obvious if such [[financial instruments]] are "[[Hedge (finance)|hedging]]" (purchasing/selling a financial instrument specifically to reduce or cancel out the risk in another investment) or "[[speculation]]" (increasing measurable risk and exposing the investor to catastrophic loss in pursuit of very high windfalls that increase expected value).
 
As [[regret (emotion)|regret]] measures rarely reflect actual human risk aversion, it is difficult to determine if the outcomes of such transactions will be satisfactory. Some people may be "risk seeking", i.e. their utility function's second derivative is positive. Such an individual willingly pays a premium to assume risk (e.g. buys a lottery ticket).
 
In financial markets, one may need to measure [[credit risk]], information timing and source risk, probability model risk, and [[legal risk]] if there are regulatory or civil actions taken as a result of some "[[investor's regret]]". Knowing one's risk appetite in conjunction with one's financial well-being is important.
 
A fundamental idea in finance is the relationship between risk and return (see [[modern portfolio theory]]). The greater the potential return one might seek, the greater the risk that one generally assumes. A free market reflects this principle in the pricing of an instrument: strong demand for a safer instrument drives its price higher (and its return correspondingly lower) while weak demand for a riskier instrument drives its price lower (and its potential return thereby higher).
 
{{quote|For example, a US Treasury bond is considered to be one of the safest investments and, when compared to a corporate bond, provides a lower rate of return. The reason for this is that a corporation is much more likely to go bankrupt than the U.S. government. Because the risk of investing in a corporate bond is higher, investors are offered a higher rate of return.}}
 
The most popular, and lately also the most vilified, [[risk measure]] is [[Value-at-Risk]] (VaR). There are different types of VaR: long-term VaR, marginal VaR, factor VaR and shock VaR. The last is used to measure risk under extreme market stress conditions.
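
A minimal historical-simulation sketch of VaR: the loss threshold that past returns fell below only (1 − confidence) of the time, read directly off the empirical distribution. The return series is invented, and real implementations use interpolation and far larger samples.

```python
# Hedged sketch of historical-simulation Value-at-Risk. Returns are hypothetical
# daily figures; index selection is deliberately simplistic (no interpolation).

def historical_var(returns, confidence=0.95):
    """Loss (as a positive number) not exceeded with the given confidence,
    estimated from the empirical distribution of past returns."""
    ordered = sorted(returns)                     # worst returns first
    index = int((1 - confidence) * len(ordered))  # e.g. the 5th-percentile slot
    return -ordered[index]

returns = [-0.08, -0.03, -0.01, 0.0, 0.005, 0.01, 0.012, 0.02, 0.03, 0.04,
           -0.02, 0.015, 0.007, -0.005, 0.025, 0.018, -0.015, 0.009, 0.011, 0.03]

print(historical_var(returns, 0.95))  # 0.03, i.e. a 3% one-day loss at 95%
```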
 
===Security===
[[File:AT YOUR OWN RISK.svg|thumb|166px|'''AT YOUR OWN RISK''': a common risk-warning label.]]
[[Security risk]] management involves protection of assets from harm caused by deliberate acts. A more detailed definition is: "A security risk is any event that could result in the compromise of organizational assets i.e. the unauthorized use, loss, damage, disclosure or modification of organizational assets for the profit, personal interest or political interests of individuals, groups or other entities constitutes a compromise of the asset, and includes the risk of harm to people. Compromise of organizational assets may adversely affect the enterprise, its business units and their clients. As such, consideration of security risk is a vital component of risk management." <ref>Julian Talbot and Miles Jakeman ''Security Risk Management Body of Knowledge'', John Wiley & Sons, 2009.</ref>
 
The following sections of ISO/IEC Guide 73:2002 relate to risk:
: 3.9 Residual risk
: 3.10 Risk acceptance
: 3.11 Risk analysis
: 3.12 Risk assessment
: 3.13 Risk evaluation
: 3.14 Risk management
: 3.15 Risk treatment<ref>ISO/IEC 27001 ''Information technology – Security techniques – Information security management systems – Requirements''.</ref>
 
===Human factors===
{{Main|Decision theory|Prospect theory}}
One of the growing areas of focus in risk management is the field of [[human factors]], where behavioral and organizational psychology underpin our understanding of risk-based decision making. This field considers questions such as "How do we make risk-based decisions?" and "Why are we irrationally more scared of sharks and terrorists than of motor vehicles and medications?"
 
In [[decision theory]], regret (and anticipation of regret) can play a significant part in decision-making, distinct from [[risk aversion]] (preferring the status quo in case one becomes worse off).
 
[[Framing (social sciences)|Framing]]<ref>Amos Tversky / Daniel Kahneman, 1981. "The Framing of Decisions and the Psychology of Choice."{{Verify source|date=October 2008}}</ref> is a fundamental problem with all forms of risk assessment. In particular, because of [[bounded rationality]] (our brains get overloaded, so we take mental shortcuts), the risk of extreme events is discounted because the probability is too low to evaluate intuitively. As an example, one of the leading causes of death is [[road accident]]s caused by [[Driving under the influence|drunk driving]] – partly because any given driver frames the problem by largely or totally ignoring the risk of a serious or fatal accident.
 
For instance, an extremely disturbing event (an attack by hijacking, or [[moral hazard]]s) may be ignored in analysis despite the fact it has occurred and has a nonzero probability. Or, an event that everyone agrees is inevitable may be ruled out of analysis due to greed or an unwillingness to admit that it is believed to be inevitable. These human tendencies for error and [[wishful thinking]] often affect even the most rigorous applications of the [[scientific method]] and are a major concern of the [[philosophy of science]].
 
All [[Decision theory#Choice under uncertainty|decision-making under uncertainty]] must consider [[cognitive bias]], [[cultural bias]], and notational bias: no group of people assessing risk is immune to "[[groupthink]]", the acceptance of obviously wrong answers simply because it is socially painful to disagree or because there are [[conflicts of interest]].
 
Framing involves other information that affects the outcome of a risky decision. The right prefrontal cortex has been shown to take a more global perspective,<ref>Schatz, J., Craft, S., Koby, M., & DeBaun, M. R. (2004). Asymmetries in visual-spatial processing following childhood stroke. Neuropsychology, 18, 340–352.</ref> while greater left prefrontal activity relates to local or focal processing.<ref>Volberg, G., & Hubner, R. (2004). On the role of response conflicts and stimulus position for hemispheric differences in global/local processing: An ERP study. Neuropsychologia, 42, 1805–1813.</ref>
 
Drawing on the Theory of Leaky Modules,<ref>Drake, R. A. (2004). Selective potentiation of proximal processes: Neurobiological mechanisms for spread of activation. Medical Science Monitor, 10, 231–234.</ref> McElroy and Seta proposed that they could predictably alter the framing effect by selectively manipulating regional prefrontal activity with finger tapping or monaural listening.<ref>McElroy, T., & Seta, J. J. (2004). On the other hand am I rational? Hemisphere activation and the framing effect. Brain and Cognition, 55, 572–580.</ref> The result was as expected: rightward tapping or listening narrowed attention such that the frame was ignored. This is a practical way of manipulating regional cortical activation to affect risky decisions, especially because directed tapping or listening is easily done.
 
==Risk assessment and analysis==
{{Main|Risk assessment|Operational risk management}}
Since risk assessment and risk management are essential in security management, the two are tightly related. Security assessment methodologies like [[CRAMM]] contain risk assessment modules as an important part of the first steps of the methodology. On the other hand, risk assessment methodologies like [[Mehari]] evolved to become security assessment methodologies.
An [[International Organization for Standardization|ISO]] standard on risk management (Principles and guidelines on implementation) was published under code [[ISO 31000]] on 13 November 2009.
 
===Quantitative analysis===
As risk carries so many different meanings, there are many formal methods used to assess or to "measure" risk. Some of the quantitative definitions of risk are well grounded in statistical theory and lead naturally to statistical estimates, but some are more subjective. For example, in many cases a critical factor is human [[decision making]].
 
Even when statistical estimates are available, in many cases risk is associated with rare failures of some kind, and data may be sparse. Often, the probability of a negative event is estimated by using the frequency of past similar events or by [[event tree]] methods, but probabilities for rare failures may be difficult to estimate if an event tree cannot be formulated. This makes risk assessment difficult in hazardous industries such as nuclear energy, where failures are rare but their harmful consequences are numerous and severe.
 
Statistical methods may also require the use of a [[Loss function|cost function]], which in turn may require the calculation of the cost of loss of a human life. This is a difficult problem. One approach is to ask what people are willing to pay to insure against death<ref>{{Cite news|first = Steven|last = Landsburg|title = Is your life worth $10 million?|url = http://www.slate.com/id/2079475/|work = Everyday Economics|publisher = Slate|date = 2003-03-03|accessdate = 2008-03-17}}</ref> or radiological release (e.g. GBq of radio-iodine) {{Citation needed|date=March 2012}}, but as the answers depend very strongly on the circumstances it is not clear that this approach is effective.
 
In statistics, the notion of risk is often modeled as the [[expected value]] of an undesirable outcome. This combines the probabilities of various possible events and some assessment of the corresponding harm into a single value. See also [[Expected utility]].  The simplest case is a binary possibility of ''Accident'' or ''No accident''. The associated formula for calculating risk is then:
 
:<math> \text{Risk} = (\text{probability of the accident occurring}) \times  (\text{expected loss in case of the accident})</math>
 
For example, if performing activity ''X'' has a probability of 0.01 of suffering an accident of ''A'', with a loss of 1000, then total risk is a loss of 10, the product of 0.01 and 1000.
 
Situations are sometimes more complex than the simple binary possibility case. In a situation with several possible accidents, total risk is the sum of the risks for each different accident, provided that the outcomes are comparable:
 
:<math> \text{Risk} =  \sum_\text{For all accidents} (\text{probability of the accident occurring}) \times  (\text{expected loss in case of the accident})</math>
 
For example, if performing activity ''X'' has a probability of 0.01 of suffering an accident of ''A'', with a loss of 1000, and a probability of 0.000001 of suffering an accident of type ''B'', with a loss of 2,000,000, then total loss expectancy is 12, which is equal to a loss of 10 from an accident of type ''A'' and 2 from an accident of type ''B''.
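The summation above can be sketched in a short script; the probabilities and losses are the ones from the worked example (a minimal illustration, not a standard risk-analysis tool):

```python
# Total risk as the sum of expected losses over all possible accidents,
# using the probabilities and losses from the example above.
accidents = {
    "A": (0.01, 1_000),         # likelier accident, smaller loss
    "B": (0.000001, 2_000_000), # rare accident, catastrophic loss
}

def total_risk(accidents):
    """Sum of probability x loss over all accidents (total loss expectancy)."""
    return sum(p * loss for p, loss in accidents.values())

print(total_risk(accidents))  # approximately 12 (10 from A, 2 from B)
```

Note that summing the two terms is only meaningful because both losses are expressed in the same units, as the text requires.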
 
One of the first major uses of this concept was for the planning of the [[Delta Works]] in 1953, a flood protection program in the [[Netherlands]], with the aid of the mathematician [[David van Dantzig]].<ref>[[Wired Magazine]], [http://www.wired.com/science/planetearth/magazine/17-01/ff_dutch_delta?currentPage=3 Before the levees break], page 3.</ref> The kind of risk analysis pioneered there has become common today in fields like nuclear power, [[aerospace]] and the [[chemical industry]].
 
In statistical decision theory, the [[risk function]] is defined as the expected value of a given [[loss function]] as a function of the [[decision rule]] used to make decisions in the face of uncertainty.
 
===Fear as intuitive risk assessment===
For the time being, people rely on their [[fear processing in the brain|fear]] and hesitation to keep them out of the most profoundly unknown circumstances.
 
In ''[[The Gift of Fear]]'', [[Gavin de Becker]] argues that <blockquote>True fear is a gift. It is a survival signal that sounds only in the presence of danger. Yet unwarranted fear has assumed a power over us that it holds over no other creature on Earth. It need not be this way.</blockquote>
Risk could be said to be the way we collectively measure and share this "true fear"—a fusion of rational doubt, irrational fear, and a set of unquantified biases from our own experience.
 
The field of [[behavioral finance]] focuses on human risk-aversion, asymmetric regret, and other ways that human financial behavior varies from what analysts call "rational". Risk in that case is the degree of [[uncertainty]] associated with a [[return (finance)|return]] on an asset.
 
Recognizing and respecting the irrational influences on human decision making may do much to reduce disasters caused by naive risk assessments that presume rationality but in fact merely fuse many shared biases.
 
==Anxiety, risk and decision making==
 
===Fear, anxiety and risk===
While fear is a fleeting emotion ascribed to a particular object, [[anxiety]] is a longer-lasting state that is not attributed to a specific stimulus.<ref>Catherine A. Hartley, Elizabeth A. Phelps, Anxiety and Decision-Making, Biological Psychiatry, Volume 72, Issue 2, 15 July 2012, pp. 113–118, {{ISSN|0006-3223}}, 10.1016/j.biopsych.2011.12.027.</ref> Studies show a link between anxious behavior and risk, the chance that an outcome will have an unfavorable result.<ref>Jon Gertner. What Are We Afraid Of, Money 32.5 (2003): 80.</ref> Joseph Forgas introduced [[valence (psychology)|valence]]-based research, in which emotions are grouped as either positive or negative (Lerner and Keltner, 2000). Positive emotions, such as happiness, are believed to produce more optimistic risk assessments, while negative emotions, such as anger, produce pessimistic ones. As an emotion with a negative valence, fear, and therefore anxiety, has long been associated with negative risk perceptions. Under the more recent appraisal-tendency framework of Jennifer Lerner et al., which refutes Forgas's notion of valence and promotes the idea that specific emotions have distinctive influences on judgments, fear is still related to pessimistic expectations.<ref>Jennifer S. Lerner, Dacher Keltner. Beyond Valence: Toward A Model of Emotion-Specific Influences on Judgment and Choice. Cognition & Emotion 14.4 (2000): 473–493.</ref> Psychologists have demonstrated that increases in anxiety and increases in [[risk perception]] are related, and people who are habituated to anxiety experience this awareness of risk more intensely than other individuals.<ref name="Jon">Jon K. Maner, Norman B. Schmidt, The Role of Risk Avoidance in Anxiety, Behavior Therapy, Volume 37, Issue 2, June 2006, pp. 181–189, {{ISSN|0005-7894}}, 10.1016/j.beth.2005.11.003.</ref> In decision-making, anxiety promotes the use of biases and quick thinking to evaluate risk, referred to as affect-as-information (Clore, 1983). However, the accuracy of these risk perceptions when making choices is not known.<ref name="Joseph">Joseph I. Constans, Worry propensity and the perception of risk, Behaviour Research and Therapy, Volume 39, Issue 6, June 2001, pp. 721–729, {{ISSN|0005-7967}}, 10.1016/S0005-7967(00)00037-1.</ref>
 
=== Consequences of anxiety ===
Experimental studies show that brief surges in anxiety are correlated with surges in general risk perception.<ref name="Joseph" /> Anxiety arises when the presence of a threat is perceived (Maner and Schmidt, 2006).<ref name="Jon" /> As risk perception increases, it stays tied to the particular source of the mood change rather than spreading to unrelated risk factors.<ref name="Joseph" /> This increased awareness of a threat is overemphasized in people who are conditioned to anxiety.<ref name="Jon_a">Jon K. Maner, J. Anthony Richey, Kiara Cromer, Mike Mallott, Carl W. Lejuez, Thomas E. Joiner, Norman B. Schmidt, Dispositional anxiety and risk-avoidant decision-making, Personality and Individual Differences, Volume 42, Issue 4, March 2007, pp. 665–675, {{ISSN|0191-8869}}, 10.1016/j.paid.2006.08.016.</ref> For example, anxious individuals who are predisposed to generating reasons for negative results tend to exhibit pessimism.<ref name="Jon_a" /> Findings also suggest that the perception of a lack of control, and a lower inclination to participate in risky decision-making across various behavioral circumstances, are associated with relatively high levels of trait anxiety.<ref name="Jon" /> In the latter case, there is supporting clinical research linking emotional evaluation (of control), the anxiety that is felt, and the option of risk avoidance.<ref name="Jon" />
 
There are various views that anxious emotions cause people to access involuntary responses and judgments when making decisions that involve risk. Joshua A. Hemmerich et al. probe deeper into anxiety and its impact on choices by exploring "risk-as-feelings": quick, automatic, and natural reactions to danger that are based on emotions. This notion is supported by an experiment that engaged physicians in a simulated perilous surgical procedure. It was demonstrated that anxiety about patient outcomes was related to previous regret and worry and ultimately caused the physicians to be led by their feelings over any information or guidelines provided during the mock surgery. Additionally, their emotional levels, which adjusted along with the simulated patient status, suggest that anxiety and the resulting decision are specific to the type of bad outcome.<ref>Joshua A. Hemmerich, Arthur S. Elstein, Margaret L. Schwarze, Elizabeth Ghini Moliski, William Dale, Risk as feelings in the effect of patient outcomes on physicians' future treatment decisions: A randomized trial and manipulation validation, Social Science &amp; Medicine, Volume 75, Issue 2, July 2012, pp. 367–376, {{ISSN|0277-9536}}, 10.1016/j.socscimed.2012.03.020.</ref> Similarly, another view of anxiety and decision-making is dispositional anxiety, in which emotional states, or [[mood (psychology)|moods]], are cognitive and provide information about future pitfalls and rewards (Maner and Schmidt, 2006). When experiencing anxiety, individuals draw on personal judgments referred to as pessimistic outcome appraisals. These emotions promote biases toward risk avoidance and reduce risk tolerance in decision-making.<ref name="Jon_a" />
 
===Dread risk===
It is common for people to dread some risks but not others: they tend to be very afraid of epidemic diseases, nuclear power plant failures, and plane accidents but are relatively unconcerned about some highly frequent and deadly events, such as traffic crashes, household accidents, and medical errors. One key distinction of dreadful risks seems to be their potential for catastrophic consequences,<ref name="Slovic P 1987">Slovic P (1987) Perception of risk. Science 236:280−285.</ref> threatening to kill a large number of people within a short period of time.<ref>Gigerenzer G (2004) Dread risk, September 11, and fatal traffic accidents. Psych Sci 15:286−287.</ref> For example, immediately after the September 11 attacks, many Americans were afraid to fly and drove instead, a decision that led to a significant increase in the number of fatal crashes in the period following the attacks compared with the same period before them.<ref>Gaissmaier, W., & Gigerenzer, G. (2012). 9/11, Act II: A fine-grained analysis of regional variations in traffic fatalities in the aftermath of the terrorist attacks. Psychological Science, 23, 1449–1454.</ref><ref name="Lichtenstein S 1978">Lichtenstein S, Slovic P, Fischhoff B, Layman M, Combs B (1978) Judged frequency of lethal events. J Exp Psych HLM 4:551–578.</ref>
Different hypotheses have been proposed to explain why people fear dread risks. First, the psychometric paradigm<ref name="Slovic P 1987"/> suggests that a perceived lack of control, high catastrophic potential, and severe consequences account for the increased risk perception and anxiety associated with dread risks. Second, because people estimate the frequency of a risk by recalling instances of its occurrence from their social circle or the media, they may overvalue relatively rare but dramatic risks because of their overrepresentation, and undervalue frequent, less dramatic risks.<ref name="Lichtenstein S 1978"/> Third, according to the preparedness hypothesis, people are prone to fear events that have been particularly threatening to survival in human evolutionary history.<ref>Öhman A, Mineka S (2001) Fears, phobias, and preparedness: Toward an evolved module of fear and fear learning. Psychol Rev 108:483–522.</ref> Given that in most of human evolutionary history people lived in relatively small groups, rarely exceeding 100 people,<ref>Hill KR, Walker RS, Bozicevic M, Eder J, Headland T et al. (2011) Co-residence patterns in hunter-gatherer societies show unique human social structure. Science 331:1286–1289.</ref> a dread risk, which kills many people at once, could potentially wipe out one’s whole group. Indeed, research has found<ref>Galesic M, Garcia-Retamero, R (2012) The risks we dread: A social circle account. PLoS ONE 7(4): e32837.</ref> that people’s fear peaks for risks killing around 100 people but does not increase if larger groups are killed. Fourth, fearing dread risks can be an ecologically rational strategy.<ref>Bodemer, N., Ruggeri, A., & Galesic, M. (2013). When dread risks are more dreadful than continuous risks: Comparing cumulative population losses over time. PLoS One, 8, e66544.</ref> Besides killing a large number of people at a single point in time, dread risks reduce the number of children and young adults who would have potentially produced offspring. Accordingly, people are more concerned about risks killing younger, and hence more fertile, groups.<ref>Wang XT (1996) Evolutionary hypotheses of risk-sensitive choice: Age differences and perspective change. Ethol Sociobiol 17:1–15.</ref>
 
===Anxiety and judgmental accuracy===
It remains unclear whether higher levels of risk perception in anxious individuals result in decreased "judgmental accuracy" (Joseph I. Constans, 2001). There is a chance that "judgmental accuracy" is correlated with heightened anxiety. However, Constans conducted a study in which anxiety (and worry) in college students' estimations of their performance on an upcoming exam showed errors in their risk assessments.<ref name="Joseph" /> Moreover, with high levels of anxiety that are not attributed to anything in particular, the probability and degree of suffering associated with a negative experience are misjudged.<ref name="Jon" />
 
==Risk in auditing==
The [[audit risk|audit risk model]] expresses the risk of an [[auditor]] providing an inappropriate opinion of a commercial entity's financial statements. It can be analytically expressed as:
 
: <math> \text{AR} = \text{IR} \times \text{CR} \times \text{DR} </math>

where AR is ''audit risk'', IR is ''inherent risk'', CR is ''control risk'' and DR is ''detection risk''.
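The multiplicative model can be illustrated with a short sketch; the component values below are hypothetical, not drawn from any auditing standard:

```python
# Audit risk as the product of its three components: AR = IR x CR x DR.
def audit_risk(inherent_risk, control_risk, detection_risk):
    """Multiply the component risks, each expressed as a probability in [0, 1]."""
    for r in (inherent_risk, control_risk, detection_risk):
        if not 0.0 <= r <= 1.0:
            raise ValueError("each component risk must lie in [0, 1]")
    return inherent_risk * control_risk * detection_risk

# Hypothetical assessment: high inherent risk, moderate control risk,
# and the detection risk the auditor plans for.
print(audit_risk(0.8, 0.5, 0.25))  # 0.1
```

In practice the model is often used in reverse: the auditor fixes an acceptable AR and assesses IR and CR, then solves for the detection risk, DR = AR / (IR × CR), that the audit procedures must achieve.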
 
==Other considerations==
Another consideration in managing risk is that risks are future problems that can be treated, rather than current ones that must be immediately addressed.
 
===Risk versus uncertainty===
In his seminal work ''Risk, Uncertainty, and Profit'', [[Frank Knight]] (1921) established the distinction between risk and uncertainty.
{{quote|... Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated. The term "risk," as loosely used in everyday speech and in economic discussion, really covers two things which, functionally at least, in their causal relations to the phenomena of economic organization, are categorically different. ... The essential fact is that "risk" means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomenon depending on which of the two is really present and operating. ... It will appear that a measurable uncertainty, or "risk" proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We ... accordingly restrict the term "uncertainty" to cases of the non-quantitive type.<ref>Frank Hyneman Knight "Risk, uncertainty and profit" pg. 19, Hart, Schaffner, and Marx Prize Essays, no. 31. Boston and New York: Houghton Mifflin. 1921.</ref>}}
 
Thus, [[Knightian uncertainty]] is immeasurable and cannot be calculated, while risk in the Knightian sense is measurable.
 
Another distinction between risk and uncertainty is proposed in ''How to Measure Anything: Finding the Value of Intangibles in Business'' and ''The Failure of Risk Management: Why It's Broken and How to Fix It'' by [[Doug Hubbard]]:<ref>Douglas Hubbard "How to Measure Anything: Finding the Value of Intangibles in Business" pg. 46, John Wiley & Sons, 2007.</ref><ref name = "ReferenceA">Douglas Hubbard "The Failure of Risk Management: Why It's Broken and How to Fix It, John Wiley & Sons, 2009.</ref>
 
:'''Uncertainty''': The lack of complete certainty, that is, the existence of more than one possibility. The "true" outcome/state/result/value is not known.
 
:'''Measurement of uncertainty''': A set of probabilities assigned to a set of possibilities. Example: "There is a 60% chance this market will double in five years".
 
:'''Risk''': A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.
 
:'''Measurement of risk''': A set of possibilities each with quantified probabilities and quantified losses. Example: "There is a 40% chance the proposed oil well will be dry with a loss of $12 million in exploratory drilling costs".
 
In this sense, Hubbard uses the terms so that one may have uncertainty without risk but not risk without uncertainty. We can be uncertain about the winner of a contest, but unless we have some personal stake in it, we have no risk. If we bet money on the outcome of the contest, then we have a risk. In both cases there is more than one outcome. The measure of uncertainty refers only to the probabilities assigned to outcomes, while the measure of risk requires both probabilities for outcomes and losses quantified for outcomes.
 
===Risk attitude, appetite and tolerance===
The terms attitude, appetite and tolerance are often used similarly to describe an organization's or individual's attitude towards risk taking. Risk averse, risk neutral and risk seeking are examples of the terms that may be used to describe a risk attitude. Risk tolerance looks at acceptable/unacceptable deviations from what is expected. Risk appetite looks at how much risk one is willing to accept. There can still be deviations that are within a risk appetite. For example, recent research finds that insured individuals are significantly likely to divest from risky asset holdings in response to a decline in health, controlling for variables such as income, age, and out-of-pocket medical expenses.<ref>[http://www.chicagofed.org/digital_assets/publications/working_papers/2009/wp2009_23.pdf Federal Reserve Bank of Chicago, ''Health and the Savings of Insured versus Uninsured, Working-Age Households in the U.S.'', November 2009]</ref>
 
[[Gambling]] is a risk-increasing investment, wherein money on hand is risked for a possible large return, but with the possibility of losing it all. Purchasing a lottery ticket is a very risky investment with a high chance of no return and a small chance of a very high return. In contrast, putting money in a bank at a defined rate of interest is a risk-averse action that gives a guaranteed return of a small gain and precludes other investments with possibly higher gain.  The possibility of getting no return on an investment is also known as the [[Rate of Ruin]].
 
===Risk as a vector quantity===
Hubbard also argues that defining risk as the product of impact and probability presumes (probably incorrectly) that the decision makers are [[risk neutral]].<ref name = "ReferenceA"/> Only for a risk neutral person is the "certain monetary equivalent" exactly equal to the probability of the loss times the amount of the loss. For example, a risk neutral person would consider a 20% chance of winning $1 million exactly equal to $200,000 (or a 20% chance of losing $1 million to be exactly equal to losing $200,000). However, most decision makers are not actually risk neutral and would not consider these equivalent choices. This gave rise to [[Prospect theory]] and [[Cumulative prospect theory]]. Hubbard proposes instead that risk is a kind of "[[Array data structure|vector]] quantity" that does not collapse the probability and magnitude of a risk by presuming anything about the risk tolerance of the decision maker. Risks are simply described as a set or function of possible loss amounts, each associated with specific probabilities. This array cannot be collapsed into a single value until the risk tolerance of the decision maker is quantified.
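The "vector" view can be sketched as follows: the risk is kept as a set of (probability, loss) pairs, and collapsing it into one number requires an explicit disutility function expressing the decision maker's risk tolerance. The quadratic disutility below is only an illustrative assumption, not Hubbard's prescription:

```python
import math

# A risk kept uncollapsed, as (probability, loss) pairs:
# a 20% chance of losing $1 million, as in the example above.
risk = [(0.2, 1_000_000), (0.8, 0)]

def expected_loss(risk):
    """Risk-neutral collapse: probability times loss, summed."""
    return sum(p * loss for p, loss in risk)

def certainty_equivalent(risk, disutility, inverse_disutility):
    """Collapse using an explicit disutility of loss (risk tolerance)."""
    expected_d = sum(p * disutility(loss) for p, loss in risk)
    return inverse_disutility(expected_d)

# Illustrative risk-averse choice: disutility convex in the loss,
# so large losses weigh more than their expected value alone.
d = lambda loss: loss ** 2
d_inv = lambda x: math.sqrt(x)

print(expected_loss(risk))                   # 200000.0, the risk-neutral value
print(certainty_equivalent(risk, d, d_inv))  # about 447214, reflecting risk aversion
```

Under this convex disutility the certainty equivalent exceeds the $200,000 expected loss, matching the observation that most decision makers would pay more than the expected loss to avoid a large gamble.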
 
Risk can be both negative and positive, but people tend to focus on the negative side, since risky activities can be dangerous, for example by putting one's own or someone else’s life at risk, and because people worry that risks will have a negative effect on their future.
 
== List of related books ==
 
This is a '''list of books about risk''' issues.
 
{| class="wikitable"
! Title
! Author(s)
! Year
|-
| Acceptable risk || Baruch Fischhoff, Sarah Lichtenstein, [[Paul Slovic]], Steven L. Derby, and Ralph Keeney || 1984
|-
| Against the Gods: The Remarkable Story of Risk || [[Peter L. Bernstein]] || 1996
|-
| American hazardscapes: The regionalization of hazards and disasters || Susan L. Cutter  || 2001
|-
| At risk: Natural hazards, people's vulnerability and disasters || [[Piers Blaikie]], Terry Cannon, Ian Davis, and Ben Wisner || 1994
|-
| Big dam foolishness; the problem of modern flood control and water storage || Elmer Theodore Peterson || 1954
|-
| [[Building Safer Communities. Risk Governance, Spatial Planning and Responses to Natural Hazards]] || Urbano Fra Paleo || 2009
|-
| Catastrophic coastal storms: Hazard mitigation and development management || David R. Godschalk, David J. Brower, and Timothy Beatley || 1989
|-
| Cities on the beach: management issues of developed coastal barriers || Rutherford H. Platt, Sheila G. Pelczarski, and Barbara K. Burbank || 1987
|-
| Cooperating with nature: Confronting natural hazards with land-use planning for sustainable communities || Raymond J. Burby || 1998
|-
| Dangerous earth: An introduction to geologic hazards || Barbara W. Murck, Brian J. Skinner, Stephen C. Porter ||1998
|-
| Disasters and democracy || Rutherford H. Platt || 1999
|-
| Disasters by design: A reassessment of natural hazards in the United States || Dennis Mileti || 1999
|-
| Disasters: The anatomy of environmental hazards || John Whittow || 1980
|-
| Divine wind: The history and science of hurricanes || [[Kerry Emanuel]] || 2005
|-
| Earth shock: Hurricanes, volcanoes, earthquakes, tornadoes and other forces of nature || [[W. Andrew Robinson]] || 1993
|-
| Earthquakes: A primer || [[Bruce Bolt]] || 1976
|-
| Environmental hazards: Assessing risk and reducing disaster || Keith Smith || 1992
|-
| Facing the unexpected: Disaster preparedness and response in the United States || Kathleen J. Tierney, Michael K. Lindell, and  Ronald W. Perry || 2001
|-
| Floods || Dennis J. Parker || 2000
|-
| Human adjustment to floods || [[Gilbert F. White]] || 1942
|-
| Human System Response to Disaster: An Inventory of Sociological Findings || Thomas E. Drabek || 1986
|-
| Hurricanes: Their nature and impacts on society || [[Roger A. Pielke]], Sr., and [[Roger A. Pielke, Jr.]] || 1997
|-
| Judgment under uncertainty: heuristics and biases || [[Daniel Kahneman]], [[Paul Slovic]], and [[Amos Tversky]] || 1982
|-
| Mapping vulnerability: disasters, development, and people || Greg Bankoff, Georg Frerks, and Dorothea Hilhorst || 2004
|-
| Man and Society in Calamity: The Effects of War, Revolution, Famine, Pestilence upon Human Mind, Behavior, Social Organization and Cultural Life || [[Pitirim Sorokin]] || 1942
|-
| Mitigation of hazardous comets and asteroids || Michael J.S. Belton, Thomas H. Morgan, Nalin H. Samarasinha, Donald K. Yeomans || 2005
|-
| Mountains of fire: The nature of volcanoes || Robert W. Decker, Barbara B. Decker || 1991
|-
| Natural disasters || David Alexander || 1993
|-
| Natural disasters || Patrick L. Abbott || 1991
|-
| Natural disasters: Protecting vulnerable communities || Paul A. Merriman, and C.W. A. Browitt || 1993
|-
| Natural disaster hotspots: a global risk analysis  || Maxx Dilley  || 2005
|-
| Natural hazard mitigation: Recasting disaster policy and planning || David Godschalk, [[Timothy Beatley]], Philip Berke, David Brower, and Edward J. Kaiser || 1999
|-
| Natural hazards || Edward Bryant || 1991
|-
| Natural hazards: Earth’s processes as hazards, disasters, and catastrophes || Edward A. Keller, and Robert H. Blodgett || 2006
|-
| Natural hazards: Explanation and integration || Graham A. Tobin, and Burrell E. Montz || 1997
|-
| Natural hazards: Local, national, global || [[Gilbert F. White]] || 1974
|-
| Normal accidents. Living with high-risk technologies || [[Charles Perrow]] || 1984
|-
| On borrowed land: Public policies for floodplains || Faber Scott || 1993
|-
| Paying the price: The status and role of insurance against natural disasters in the United States || Howard Kunreuther, and Richard J. Roth || 1998
|-
| Planning for earthquakes:  Risks, politics, and policy || Philip R. Berke, and Timothy Beatley || 1992
|-
| Promoting Risk: Constructing the Earthquake Threat || Robert Stallings || 1995
|-
| Reconstruction Following Disaster || J. Eugene Haas, [[Robert Kates]], and Martyn J. Bowden || 1977
|-
| Recovery from Natural Disasters: Insurance or Federal Aid? || Howard Kunreuther || 1973
|-
| Reduction and predictability of natural disasters || John B. Rundle, William Klein, Don L. Turcotte || 1996
|-
| Regions of risk:  A geographical introduction to disasters || Kenneth Hewitt || 1997
|-
| Risk analysis: a quantitative guide || David Vose || 2008
|-
| Risk and culture: An essay on the selection of technical and environmental dangers || [[Mary Douglas]], and [[Aaron Wildavsky]] || 1982
|-
| Risk communication: A handbook for communicating environmental, safety, and health risks || Regina E. Lundgren, and Andrea H. McMakin || 1994
|-
| Risk society: Towards a new modernity || [[Ulrich Beck]] || 1992
|-
| Risk, environment and modernity: towards a new ecology || [[Scott Lash]], Bronislaw Szerszynski and Brian W. Sage || 1996
|-
| Socially Responsible Engineering: Justice in Risk Management (ISBN 978-0-471-78707-5) || [[Daniel A. Vallero]], and P. Aarne Vesilind || 2006
|-
| Swimming with Crocodiles: The Culture of Extreme Drinking || Marjana Martinic and Fiona Measham (eds.) || 2008
|-
| Terra non firma: Understanding and preparing for earthquakes || James M. Gere and Haresh M. Shah || 1984
|-
| The angry earth: Disaster in anthropological perspective || Anthony Oliver-Smith, and Susanna Hoffman || 1999
|-
| The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA || Diane Vaughan || 1997
|-
| [[The Control of Nature]] || [[John McPhee]] || 1989
|-
| The hurricane and its impact || Robert H. Simpson, and Herbert Riehl || 1981
|-
| The environment as hazard || Ian Burton, [[Robert Kates]], and [[Gilbert F. White]] || 1978
|-
| The perception of risk || [[Paul Slovic]] || 2000
|-
| The social amplification of risk || Nick Pidgeon, Roger E. Kasperson, and [[Paul Slovic]] || 2003
|-
| There is no such thing as a natural disaster : race, class, and Hurricane Katrina || Chester W. Hartman, and Gregory D. Squires || 2006
|-
| Understanding catastrophe:  Its impact on life on earth || Janine Bourrian || 1992
|-
| What is a disaster? New answers to old questions || Ronald W. Perry, and [[Enrico Quarantelli]] || 2005
|-
| What is a disaster? Perspectives on the question || [[Enrico Quarantelli]] || 1998
|-
| The Laws of Fear || [[Cass Sunstein]] || 2005
|-
| The social theory of fear  || [[Geoffrey Skoll]] || 2010
|-
| Floods: From Risk to Opportunity ([[IAHS]] Red Book Series)  || Ali Chavoshian, and Kuniyoshi Takeuchi || 2013
|}
 
==See also==
{{div col|colwidth=30em}}
* [[Applied information economics]]
* [[Ambiguity]]
* [[Ambiguity aversion]]
* [[Benefit shortfall]]
* [[Civil defense]]
* [[Countermeasure]]
* [[Cultural Theory of risk]]
* [[Disaster]]
* [[Early case assessment]]
* [[Emergency]]
* [[Event chain methodology]]
* [[Fuel price risk management]]
* [[Fuzzy-trace theory]]
* [[Global Risk Forum GRF Davos]]
* [[Hazard (risk)]]
* [[Identity resolution]]
* [[Information assurance]]
* [[Inherent risk (accounting)]]
* [[International Risk Governance Council]]
* [[ISO/PAS 28000]]
* [[Life-critical system]]
* [[Loss aversion]]
* [[Preventive maintenance]]
* [[Probabilistic risk assessment]]
* [[Reputational risk]]
* [[Reliability engineering]]
* [[Risk analysis (business)|Risk analysis]]
* [[Risk compensation]]
* [[Risk management]]
* [[Risk-neutral measure]]
* [[Risk register]]
* [[Sampling risk]]
* [[Vulnerability]]
{{div col end}}
 
==References==
{{Reflist|2}}
 
==Bibliography==
 
===Referred literature===
* [[James Franklin (philosopher)|James Franklin]], 2001: ''The Science of Conjecture: Evidence and Probability Before Pascal'', Baltimore: Johns Hopkins University Press.
* [[Niklas Luhmann]], 1996: ''Modern Society Shocked by its Risks'' (= University of Hong Kong, Department of Sociology Occasional Papers 17), Hong Kong, available via [http://hub.hku.hk/handle/10722/42552 HKU Scholars HUB]
 
===Books===
* Historian [[David A. Moss]]' book [http://www.hup.harvard.edu/catalog/MOSWHE.html ''When All Else Fails''] explains the [[U.S. government]]'s historical role as risk manager of last resort.
* Peter L. Bernstein. ''Against the Gods'' ISBN 0-471-29563-9. Risk explained and its appreciation by man traced from earliest times through all the major figures of their ages in mathematical circles.
* {{Cite book|last = Rescher|first = Nicholas|title = A Philosophical Introduction to the Theory of Risk Evaluation and Measurement|publisher = University Press of America|year = 1983}}
* {{Cite book|last = Porteous|first = Bruce T.|coauthors = Pradip Tapadar|title = Economic Capital and Financial Risk Management for Financial Services Firms and Conglomerates|publisher = Palgrave Macmillan|date=December 2005|isbn = 1-4039-3608-0}}
* {{Cite book|author = Tom Kendrick|year = 2003|title = Identifying and Managing Project Risk: Essential Tools for Failure-Proofing Your Project|publisher = AMACOM/American Management Association|isbn = 978-0-8144-0761-5}}
* {{Cite book|author = David Hillson|year = 2007|title = Practical Project Risk Management: The Atom Methodology|publisher = Management Concepts|isbn = 978-1-56726-202-5}}
* {{Cite book|author = Kim Heldman|year = 2005|title = Project Manager's Spotlight on Risk Management|publisher = Jossey-Bass|isbn = 978-0-7821-4411-6}}
* {{Cite book|author = Dirk Proske|year = 2008|title = Catalogue of risks - Natural, Technical, Social and Health Risks|publisher = Springer|isbn = 978-3-540-79554-4}}
* Gardner, Dan, [http://books.google.com/books?id=5j_8xF8vUlAC&printsec=frontcover ''Risk: The Science and Politics of Fear''], Random House, Inc., 2008. ISBN 0-7710-3299-4.
* Hopkin, Paul, ''Fundamentals of Risk Management'', 2nd Edition, Kogan Page (2012). ISBN 978-0-7494-6539-1
 
===Articles and papers===
* Clark, L., Manes, F., Antoun, N., [[Barbara Sahakian|Sahakian, B. J.]], & Robbins, T. W. (2003). "The contributions of lesion laterality and lesion volume to decision-making impairment following frontal lobe damage." ''Neuropsychologia'', 41, 1474–1483.
* Cokely, E. T., Galesic, M., Schulz, E., Ghazal, S., & Garcia-Retamero, R. (2012). [http://journal.sjdm.org/11/11808/jdm11808.pdf Measuring risk literacy: The Berlin Numeracy Test.] ''Judgment and Decision Making, 7,'' 25–47.
* Drake, R. A. (1985). "Decision making and risk taking: Neurological manipulation with a proposed consistency mediation." ''Contemporary Social Psychology, 11,'' 149–152.
* Drake, R. A. (1985). "Lateral asymmetry of risky recommendations." ''Personality and Social Psychology Bulletin, 11,'' 409–417.
* Gregory, Kent J., Bibbo, Giovanni and Pattison, John E. (2005), "A Standard Approach to Measurement Uncertainties for Scientists and Engineers in Medicine", ''Australasian Physical and Engineering Sciences in Medicine'' '''28'''(2):131–139.
* Hansson, Sven Ove. (2007). [http://plato.stanford.edu/entries/risk/ "Risk"], ''The Stanford Encyclopedia of Philosophy'' (Summer 2007 Edition), Edward N. Zalta (ed.), [http://plato.stanford.edu/archives/sum2007/entries/risk/ archived edition].
* Holton, Glyn A. (2004). [http://www.riskexpertise.com/papers/risk.pdf "Defining Risk"], ''Financial Analysts Journal'', 60 (6), 19–25. A paper exploring the foundations of risk. (PDF file).
* Knight, F. H. (1921) ''Risk, Uncertainty and Profit'', Chicago: Houghton Mifflin Company. (Cited at: [http://www.econlib.org/library/Knight/knRUP1.html], § I.I.26.).
* Kruger, Daniel J., Wang, X.T., & Wilke, Andreas (2007) [http://www.epjournal.net/filestore/ep05555568.pdf "Towards the development of an evolutionarily valid domain-specific risk-taking scale"] ''Evolutionary Psychology'' (PDF file).
* Metzner-Szigeth, A. (2009). "Contradictory Approaches? – On Realism and Constructivism in the Social Sciences Research on Risk, Technology and the Environment." ''Futures'', Vol. 41, No. 2, March 2009, pp.&nbsp;156–170 (fulltext journal: [http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V65-4TGS7JY-1&_user=10&_coverDate=04%2F30%2F2009&_rdoc=1&_fmt=high&_orig=search&_sort=d&_docanchor=&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=054fec1f03e9ec784596add85197d2a8]) (free preprint: [http://egora.uni-muenster.de/ifs/personen/bindata/metznerszigeth_contradictory_approaches_preprint.PDF]).
* Miller, L. (1985). "Cognitive risk taking after frontal or temporal lobectomy I. The synthesis of fragmented visual information."  ''Neuropsychologia'', 23, 359–369.
* Miller, L., & Milner, B. (1985). "Cognitive risk taking after frontal or temporal lobectomy II. The synthesis of phonemic and semantic information." ''Neuropsychologia'', 23, 371–379.
* Neill, M., Allen, J., Woodhead, N., Reid, S., Irwin, L., & Sanderson, H. (2008). "A Positive Approach to Risk Requires Person Centred Thinking". London: CSIP Personalisation Network, Department of Health. Available from: http://networks.csip.org.uk/Personalisation/Topics/Browse/Risk/ [Accessed 21 July 2008].
* {{cite encyclopedia |last1=Wildavsky|first1=Aaron |authorlink1=Aaron Wildavsky|last2=Wildavsky|first2=Adam|editor=[[David R. Henderson]] |encyclopedia=[[Concise Encyclopedia of Economics]] |title=Risk and Safety |url=http://www.econlib.org/library/Enc/RiskandSafety.html  |year=2008 |edition= 2nd |publisher=[[Library of Economics and Liberty]] |location=Indianapolis |isbn=978-0865976658 |oclc=237794267}}
 
==External links==
{{Sisterlinks|Risk}}
* [http://plato.stanford.edu/entries/risk/ Risk] – The entry of the Stanford Encyclopedia of Philosophy
* [http://ibcsr.org/index.php?option=com_content&view=article&id=149:risk-preference-and-religiosity&catid=25:research-news&Itemid=59 "Risk preference and religiosity"] article from the Institute for the Biocultural Study of Religion
 
{{Environmental social science|state=collapsed}}
{{Use dmy dates|date=September 2010}}
 
{{DEFAULTSORT:Risk}}
[[Category:Risk| ]]
[[Category:Actuarial science]]
[[Category:Economics of uncertainty]]
[[Category:Underwater diving safety]]
