Towards a Credible Forecasting Process for Sustainable User Innovation

Xavier Fernandez-i-Marin*

ESADE Business School-Barcelona, Spain

*Corresponding Author:
Xavier Fernandez-i-Marin
ESADE Business School-Barcelona, Spain
Tel: +34 93 2806162
E-mail:
xavier.fernandez3@esade.edu

Received: 23/10/2015 Accepted: 09/12/2015 Published: 18/12/2015


Abstract

The document presents a review of the challenges that arise when forecasting techniques are applied to predict the evolution of sustainable user innovation. It also provides an augmented list of variables that may be used in the process of envisioning the future of lifestyles in Europe. Forecasting any kind of individual and social behaviour requires assembling several elements from different disciplines: from mere technical methodological challenges (choice of model) to substantial theoretical discussions (prediction of outcomes); from data-gathering strategies (combination of sources and their reliability) to measurement (choice of variables used to represent the relevant ideas); from establishing the rules of micro-behaviour of individuals to using well-established models for individual interactions. This document is a review of measurement indicators in sustainability models, focusing on available models for forecasting the future of natural environments, ecosystems, climate change, urban mobility and demographic patterns. In the context of a large European research project there is a need to perform systematic forecasts of several dimensions of individual lifestyles. Therefore, the purpose is to provide the building blocks on which a systematic and credible forecast of user innovation in Europe in 2030/2050 can be built.

Keywords

Forecasting, Lifestyle, Agent-based models, Sustainable user innovation.

Introduction

Forecasting any kind of individual and social behaviour requires assembling several elements from different disciplines: from mere technical methodological challenges (choice of model) to substantial theoretical discussions (prediction of outcomes); from data-gathering strategies (combination of sources and their reliability) to measurement (choice of variables used to represent the relevant ideas); from establishing the rules of micro-behaviour of individuals to using well-established models for individual interactions. This document is a review of measurement indicators in sustainability models, focusing on available models for forecasting the future of natural environments, ecosystems, climate change, urban mobility and demographic patterns. The purpose is to provide the building blocks on which a systematic and credible forecast of user innovation in Europe in 2030/2050 can be built. The document proceeds as follows: Section 2 presents different models for forecasting the future, with special emphasis on quantitative models. Section 3 reviews sources of data from several institutions that are working in the field of providing forecasts for the issues of interest. Finally, Section 4 provides arguments for a selection of the best measurement indicators needed for forecasting the future of living, energy, food and transportation using the forecasting models considered.

Review of Forecasting Models

This section provides an introduction to forecasting models. Forecasting involves making predictions about the future based on a combined analysis of trends and facts in the past and the present. Understanding the logic behind different systems of forecasting is important for two reasons: first, to select the most appropriate models given the nature of the prediction considered; and second, to understand the nature of inputs (data) that each modelling technique requires.

Time-series-based models

Time-series based models are the classical approach to forecasting. This is what is known as “forecasting” in general, and there is even a Journal of Forecasting that treats these models as forecasting per se. The idea behind this approach rests on two elements: data and a model. The complexity of the outcome to be predicted determines the amount and quality of the necessary data. Weather forecasts are probably the most widely used forecasts and an extreme case of complexity: they need a large amount of data about the past, a good knowledge of how the climate system works (thermodynamics), a complex set of equations, and very powerful computers to compute the outcomes. Time series analysis helps to identify and explain any regularity or systematic variation in the series of data due to seasonality, to analyse cyclical patterns that repeat over regular periods of time, and to predict trends in the data as well as the growth rates of these trends. The set of models that have been most influential ranges from the relatively simple autoregressive (AR) and moving average (MA) models to the more complex Kalman filters (Montgomery [1], Harvey [2], Makridakis et al. [3]). Autoregressive models are used when there is a substantial short-run time trend in the data series. Moving average models are especially suitable for predicting series with long-run trends. Kalman filters are used when the data clearly depend strongly on observations at previous time points, but there is no clear tendency in their direction (Fernández-i-Marín et al. [4]).
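As a minimal illustration of the two building blocks named above (data and a model), the sketch below fits an AR(1) model to a simulated series by ordinary least squares and iterates the fitted equation forward. The series, the forecast horizon and all parameter values are illustrative assumptions, not data from the text.

```python
# Minimal sketch: fit y_t = c + phi * y_{t-1} + e_t by least squares, then forecast.
import numpy as np

rng = np.random.default_rng(42)

# Simulated yearly series: weak trend plus persistent noise (stand-in for real data).
t = np.arange(100)
y = 0.05 * t + 0.1 * rng.normal(size=100).cumsum() + rng.normal(size=100)

# Least-squares estimate of the AR(1) coefficients.
X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]

# Iterate the fitted equation forward for a 10-step forecast.
forecast, last = [], y[-1]
for _ in range(10):
    last = c + phi * last
    forecast.append(last)

print(f"phi = {phi:.3f}, 10-step forecast: {np.round(forecast, 2)}")
```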

Simulation-based models

The idea with forecasting based on simulated models is to imitate the operation of a real-world process or system over time. Thus, the focus of simulation-based models is not on the behaviour of a single series of data, but on the overall behaviour of a system. The act of simulating something requires that a model be developed first; this model represents the key characteristics or behaviours/functions of the selected physical or abstract system or process. The model represents the system itself, whereas the simulation represents the operation of the system over time. A prototype or some basic ideas about the structure of the system are enough. Simulation is a much more flexible instrument than time series forecasting, as it does not require a complete time series of the output to observe, and it allows a more relaxed set of assumptions about the behaviour of the actors of the system. Simulation, therefore, does not strictly need to be based on factual data, although most simulation-based models use some sort of time-series technique as input.

Discrete-event models

Discrete-event models are simulation tools aimed at describing the operations of a system as a discrete (zero-one, binary, yes/no, true/false) sequence of events. That is, they describe how a system behaves when an event occurs at a particular instant in time and marks a change of state in the system. Discrete-event simulation assumes that no change in the system occurs between consecutive events; the simulation can jump directly in time from one event to the next. Typical day-to-day applications include modelling queues: customers in restaurants, hotels, etc. Other uses include modelling potential investments, where decision-makers can evaluate the alternatives. A discrete-event model requires building the following elements (a minimal sketch follows the list):

a) An initial state. A system state is a set of variables that captures the salient properties of the system to be studied.

b) Time definition. The simulation must keep track of the current simulation time, in measurement units suitable for the system being modelled.

c) Events list. The simulation maintains at least one list of simulation events. This is sometimes called the “pending event set” because it lists events that are pending as a result of previously simulated events but have yet to be simulated themselves. An event is described by the time at which it occurs and a type, indicating the code that will be used to simulate that event. It is common for the event code to be parameterized, in which case, the event description also contains parameters of the event code.

d) Simulation uncertainty. Some uncertainty has to be imposed on the simulation, based on the generation of random numbers.

e) Ending condition. Typically the programmer decides when to stop the simulation: at time t + x, after processing x events, or when other parameters of the model reach pre-specified values. The easiest example to conceive is a queue: an individual arrives and the event “individual-arrival” at time t is noted; the individual departs at time t + s, where s is the duration of the service. The objective of discrete-event models is to predict the behaviour of the system, that is, the assignment of resources to be used by the individuals who want to perform the event.
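A minimal sketch of the queue example in item e), under assumed exponential arrival and service rates; the rates and the ending time are illustrative assumptions.

```python
# Minimal discrete-event queue: arrival events at time t, departures at t + s.
import heapq
import random

random.seed(1)
events = []                          # c) pending event set: (time, type)
queue_len, busy, served = 0, False, 0  # a) initial state: empty queue, idle server

# Schedule arrivals up to the ending condition (e): simulated time 100.
t = 0.0
while t < 100:
    t += random.expovariate(1.0)     # d) uncertainty via random inter-arrival times
    heapq.heappush(events, (t, "arrival"))

while events:
    t, kind = heapq.heappop(events)  # b) jump directly to the next event time
    if kind == "arrival":
        if busy:
            queue_len += 1           # server occupied: individual waits
        else:
            busy = True              # start service; departure at t + s
            heapq.heappush(events, (t + random.expovariate(1.2), "departure"))
    else:                            # departure: serve the next waiting individual
        served += 1
        if queue_len > 0:
            queue_len -= 1
            heapq.heappush(events, (t + random.expovariate(1.2), "departure"))
        else:
            busy = False

print(f"served {served} individuals by simulated time {t:.1f}")
```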

System dynamics

System dynamics is a modelling technique for understanding and discussing issues and problems related to complex systems. The steps required for system dynamics to work are the following (a minimal sketch of these steps follows the list):

• Define the problem boundary

• Identify the most important stocks and flows that change these stock levels

• Identify sources of information that impact the flows

• Identify the main feedback loops

• Draw a causal loop diagram that links the stocks, flows and sources of information

• Write the equations that determine the flows

• Estimate the parameters and initial conditions. These can be estimated using statistical methods, expert opinion, market research data or other relevant sources of information

• Simulate the model and analyse results.
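The following sketch walks through the flow-equation, parameter and simulation steps for a deliberately tiny system in the spirit of “The Limits to Growth” discussed below: one population stock and one non-renewable resource stock, with a feedback loop between them. All parameters and initial conditions are invented for illustration.

```python
# Minimal stock-and-flow sketch: growth slows as the resource stock is depleted.
def simulate(steps=200, dt=0.5):
    population, resource = 1.0, 100.0      # initial conditions (assumed)
    history = []
    for _ in range(steps):
        # Flow equations: scarcity feeds back into births and deaths.
        scarcity = resource / 100.0
        births = 0.05 * population * scarcity
        deaths = 0.02 * population * (1 - scarcity)
        extraction = 0.03 * population
        # Update the two stocks from their net flows.
        population += (births - deaths) * dt
        resource = max(resource - extraction * dt, 0.0)
        history.append((population, resource))
    return history

# Simulate and analyse results: overshoot and decline as the constraint binds.
peak = max(pop for pop, _ in simulate())
print(f"population peaks at {peak:.2f} before the resource constraint binds")
```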

One of the first and most famous applications of system dynamics was the 1972 book “The Limits to Growth” (Donella H et al. [5]). The aim of the study was not to make specific predictions about future trends in several indicators, but to explore how exponential growth interacts with finite resources and the internal dynamics of a system defined by such characteristics. System dynamics is not suitable for making concrete predictions. Instead, it is used to explore general trends and learn about the conditions and dynamics of a pre-defined system, and is therefore addressed at complex systems, such as climate predictions.

Agent-based models

The last of the three simulation-based models considered is agent-based models (Gilbert [6], Adamatti [7]). Agent-based models are targeted at simulating the actions and interactions of autonomous agents (whether individuals, collective groups such as neighbourhoods, regions or countries, firms, governments or organizations) in order to assess their effects on the system. In other words, they try to explain the collective behaviour of those agents by making them comply with relatively simple rules. They are frequently used in ecology and biology to simulate the behaviour of natural systems, forests, etc. The components needed for an agent-based model to run are the following (a minimal sketch follows the list):

• Specify the agents

• Decision-making rules based on learning through discovery (trial and error)

• Learning rules or the adaptive process

• Interaction rules

• Definition of the non-agent environment
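A minimal sketch of these components, using an invented threshold rule by which agents adopt a (hypothetical) sustainable innovation once enough of their neighbours have adopted it. The grid size, seeding rate and threshold are illustrative assumptions.

```python
# Minimal agent-based sketch: adoption spreads through local interaction.
import random

random.seed(3)
N = 30
THRESHOLD = 0.15  # assumed: adopt if > 15% of neighbours have adopted

# Agents and non-agent environment: a toroidal grid of adopters (True/False).
grid = [[random.random() < 0.05 for _ in range(N)] for _ in range(N)]

def neighbour_share(g, i, j):
    """Interaction rule: share of the 8 surrounding agents that have adopted."""
    cells = [g[(i + di) % N][(j + dj) % N]
             for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    return sum(cells) / len(cells)

for step in range(50):
    # Decision rule: adopt (irreversibly) when local adoption exceeds the threshold.
    grid = [[grid[i][j] or neighbour_share(grid, i, j) > THRESHOLD
             for j in range(N)] for i in range(N)]

adopters = sum(sum(row) for row in grid)
print(f"adopters after 50 steps: {adopters}/{N * N}")
```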

Examples of the use of agent-based models include urban models (racial segregation in American cities [8]), opinion dynamics (the development of extremist opinions within a population), consumer behaviour (processes of lock-in in consumer markets), industrial and innovation networks (Gilbert et al. [9]), supply chain markets (the impact of information sharing in divergent assembly supply chains, Strader [10]) and energy markets (decreasing the environmental impact of generation, changing the geographical distribution of electrical generation capacity [11], and extending established models of low-voltage electricity networks to generalized energy models, Durana et al. [12]). However, the most notable and complex cases of use of agent-based models are in the study of financial markets, where vast amounts of data from agents in the financial market are easily available.

Delphi method

Aside from the literature that specifically deals with forecasts using past patterns and current status to predict the future, the Delphi technique for forecasting is also well known in many areas of interest, and has been extensively used in the context of the social sciences, especially when quantitative information is lacking or when the nature of the system is chaotic [13-15]. For the social sciences, it has been argued that the Delphi method is superior to any other approach to forecasting (Rowe [16], Landeta [17]). The Delphi method is a technique based on a procedure, not on a set of mathematical tools for analysing the data, in contrast with time-series and simulation-based models. The idea is to involve several experts in a field in a blind discussion of more than one round, controlled by the researcher, in order to approach, though not necessarily reach, some sort of consensus about the prediction at hand. The Delphi method was developed and tested in the context of the Cold War by the military. More concretely, the RAND Corporation used it to investigate the scientific use of expert opinions. Studies were published on the superiority of group opinion over individual opinion and on the justification of expert opinions in inexact sciences and their scientific use. The first use of the Delphi method outside the military was on planning in developing economies. The main characteristics of the Delphi method are the following (Bolognini [17]):

a) Repetitive process. The experts must be consulted at least twice on the same question, so that they can reconsider their answer, aided by the information they receive from the rest of the experts.

b) Participant anonymity (or at least anonymity of their answers, as these go directly to the group coordinator). This means a group working process can be developed with experts who do not coincide in time or space (Figure 1), and it also aims to avoid the negative influence that the personality and status of the participating experts could exercise on the individual answers.


Figure 1: Title of the first declassified document of the RAND Corporation publicly presenting the Delphi method.

c) Controlled feedback. The exchange of information between the experts is not free but is carried out by means of a study group coordinator, so that all irrelevant information is eliminated.

d) Group statistical response. All the opinions form part of the final answer. The questions are formulated so that the answers can be processed quantitatively and statistically. Although the technique is based on aggregating “qualitative” data from several experts, the outcome is a probabilistic quantitative prediction based on the consensus of the participants. In the end, the outcome can be treated as an input for other tools. The historical evolution of the method has led to current uses that tend to incorporate as many individuals as possible, turning the procedure into a purer deliberation process, less guided by a central authority and more similar to e-Democracy (Bolognini [17]). A concrete example of this tendency is the use of more than 1,000 participants in several rounds to establish a Latin American intergovernmental strategy for Information and Communications Technologies as instruments for economic development and social inclusion, following the Millennium Development Goals of the United Nations (Hilbert et al. [18]). More classical examples of the use of the Delphi method for forecasting include imputing tourist expenditure (Landeta [19]), identifying topics in sustainable supply chain management (Seuring [20]) and assessing the environmental impact of tourism development (Green et al. [21]).
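As a minimal sketch of the repetitive process and the group statistical response, the following simulates two Delphi rounds in which experts receive the group median and interquartile range and then revise their estimates. The initial estimates and the revision weight (0.5) are illustrative assumptions, not part of the method itself.

```python
# Minimal Delphi-style aggregation: feedback of median and quartiles, then revision.
import statistics

round1 = [12.0, 20.0, 15.0, 40.0, 18.0, 22.0]   # first-round expert estimates
median = statistics.median(round1)
q = statistics.quantiles(round1, n=4)           # quartile cut points
print(f"feedback to experts: median={median}, interquartile range=({q[0]}, {q[2]})")

# Second round: each expert reconsiders, moving halfway toward the group median.
round2 = [x + 0.5 * (median - x) for x in round1]
print(f"second-round median: {statistics.median(round2)}, "
      f"spread: {max(round2) - min(round2):.1f} (was {max(round1) - min(round1):.1f})")
```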

Prediction markets and the quantified-self movement

Prediction markets are a recent addition to the set of tools for systematic forecasting. The idea behind prediction markets is the opposite of the Delphi method: while Delphi uses the knowledge of a few selected experts, the prediction-market approach uses the micro-knowledge of large numbers of individuals which, aggregated in a betting market, is expected to produce the most likely outcome. It rests on the principle that “large groups of people are smarter than an elite few, no matter how brilliant— better at solving problems, fostering innovation, coming to wise decisions, even predicting the future” (Surowiecki [22]). The principle is not far from basic axioms of classical economics: under certain conditions of information, the invisible hand of the market, aggregating the micro-behaviour of many individuals, is the best approach to generating a price. Prediction markets usually have their niche in “trivial” outcomes such as sports results: who will win the race, the match, etc. Recently, researchers, especially from the social sciences, have started to look at prediction markets as complementary tools for forecasting. Recent examples include their use as primary data for predicting election outcomes: if several individuals bet money on the winner of an election, the information can be processed and used as the best guess of a collective subject. Prediction markets have been used for the 2016 US presidential election and by several betting houses usually dealing with sports events. Figure 2 presents the evolution during 2015 of the betting on the 2016 presidential election. The use of prediction markets, as has been mentioned, requires a lot of data collected from a lot of individuals; in other words, what is recently known as “Big Data” (Mayer et al. [23], Manyika [24,25]). The “quantified self” movement (Nafus [26]) can be understood as part of this trend, where predictions are based on large amounts of data aggregated from many observations. The principle is exactly the same: pool data from many sources and use a lot of micro-inputs to generate a prediction superior to one based on a selected-few approach. The idea behind the quantified-self movement is to incorporate technology into data acquisition on aspects of a person’s daily life in terms of inputs (e.g. food consumed, quality of surrounding air), states (e.g. mood, arousal, blood oxygen levels) and performance (mental and physical). Although it is not a new idea (the first devices date back to the 1970s), the use of smartphones (programmable devices with multiple sensors) has boosted its visibility in the last few years. Despite the problems associated with anonymity, data acquisition by big corporations and potential ethical implications, there is a real potential use for educational, productivity and wellness-improvement purposes, and potential medical uses are promising. Uses related to environmental health are also being explored, even combined with agent-based models, as in the case of the Heals research project on assessing individual exposure to environmental stressors and predicting health outcomes. Consumer genomics vendors such as Navigenics and deCODEme provide single-nucleotide polymorphism (SNP) files; these files constitute 1–2% of the human genome and typically have 1–1.2 million records, which are unwieldy to load and query (especially when comparing multiple files) without specific data-management tools. Whole human genome files are much larger than SNP files.
Vendors Illumina and Knome ship multi-terabyte-sized files to the consumer in a virtually unusable format on a standalone computer or zip drive. In the short term, standard cloud-based services for QS data storage, sharing and manipulation would be extremely useful. In the long term, big data solutions are needed to implement the vision of a systemic and continuous approach to automated, unobtrusive data collection from multiple sources that is processed into a stream of behavioural insights and interventions. Making progress on the critical contemporary challenge of preventive medicine (recognizing early warning signs and eliminating conditions during the 80% of their lifecycle that is preclinical) may likely require regular collection on the order of a billion data points per person. There is a need for big data scientists to facilitate the identification, collection and storage of the data streams related to QS activity. Both traditional institutional health professionals and QS individuals are starting to find themselves in a whole new era of massively expanded data, with the attendant challenge of employing these new data streams toward pathology resolution and wellness outcomes. Big health data streams can be grouped into three categories: traditional medical data (personal and family health history, medication history, lab reports, etc.), “omics” data (genomics, microbiomics, proteomics, metabolomics, etc.) and quantified-self tracking data (Figure 3). A key shift is due to the plummeting cost of sequencing and Internet-based data storage. As US National Institutes of Health director Francis Collins remarked in 2010, “Genetics loads the gun and environment pulls the trigger” [31]. It is a general heuristic for common disease conditions like cancer and heart disease that genetics make a one-third contribution to the outcome and environment two-thirds. There are some notable examples of QS projects involving the integration of multiple big health data streams. Self-trackers typically obtain underlying genomic and microbiomic profiling and review this information together with blood tests and proteomic tests to determine baseline levels and variability for a diversity of markers, and then experiment with different interventions for optimized health and pathology reduction. Examples of these kinds of QS data-integration projects include DIY genomics studies and Leroy Hood’s 4P medicine (predictive, preventive, personalized and participatory). Big health data streams are becoming increasingly consumer-available (Figure 3). Another example of averaging data from many micro-behaviours is shown in Figure 4, which depicts the most popular cycling routes in London. This is a 2D representation that aggregates information through time, but nothing stops us from imagining other uses based on the time tendencies, and their use afterwards to make inferences about future trends.
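Returning to the prediction-market logic above, a minimal sketch of how betting odds can be turned into a probabilistic forecast: implied probabilities from several hypothetical bookmakers are averaged and normalized. The odds values are illustrative assumptions, not taken from Figure 2.

```python
# Minimal sketch: decimal odds -> normalized implied probabilities.
odds = {"candidate_a": [2.10, 2.05, 2.20], "candidate_b": [1.75, 1.80, 1.72]}

# Implied probability of decimal odds o is 1/o; average across bookmakers.
raw = {k: sum(1 / o for o in v) / len(v) for k, v in odds.items()}

# Normalize so probabilities sum to one (removes the bookmakers' margin).
total = sum(raw.values())
probs = {k: p / total for k, p in raw.items()}
print(probs)
```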


Figure 2: 2016 US Presidential election betting. Evolution of the odds in 2015.


Figure 3: Big health data streams as analyzed by Swan, 2013.


Figure 4: Most popular London cycling routes, based on London’s public cycle-hire facility.

Sources of data for forecasting domains

This section introduces relevant, large-scale projects concerned with the generation of credible and systematic forecasts of issues related to the development of the environment. The first document that explored how exponential growth interacts with finite resources is The Limits to Growth, commissioned by the Club of Rome in the 1970s as a report on the limits to the growth of human populations. The purpose of the project was, however, not to make specific predictions, but to explore the interactions within the system, following the logic of system dynamics. Without any doubt, nowadays the most relevant project providing systematic forecasts for series of data related to environmental issues is the Intergovernmental Panel on Climate Change (IPCC) of the United Nations, which is the focus of this section.

UN: Intergovernmental Panel on Climate Change

The Intergovernmental Panel on Climate Change (IPCC) is an intergovernmental body created under the United Nations. It was established in 1988 by the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP), both bodies of the United Nations system. The main international treaty on climate change, the United Nations Framework Convention on Climate Change (UNFCCC), aims to “stabilize greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system” (United Nations [27]). The IPCC does not generate original research, nor does it monitor the climate itself. The IPCC bases its assessment on published literature, which includes peer-reviewed and non-peer-reviewed sources, by thousands of scientists and other experts who contribute to writing and reviewing reports, which are then reviewed by governments. In some sense, the IPCC can be seen as a place where the relevant scientific literature is a) reviewed again and b) aggregated.

The IPCC is currently presenting its conclusions from the fifth assessment, which was concluded in 2014. Amongst the most relevant conclusions, the fifth assessment indicates that:

a) “Warming of the climate system is unequivocal, and since the 1950s, many of the observed changes are unprecedented over decades to millennia”.

b) “Atmospheric concentrations of carbon dioxide, methane, and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years”.

c) Without new policies to mitigate climate change, projections suggest an increase in global mean temperature in 2100 of 3.7 to 4.8°C relative to pre-industrial levels (median values; the range is 2.5 to 7.8°C including climate uncertainty). Figure 5 presents a map identifying the impacts attributed to climate change by region. Europe faces impacts on marine ecosystems and wildfires, and less important ones on food production and on livelihoods, health and economics. Figure 6 presents the regional key risks and the potential for risk reduction by world region. In Europe, the highest risks from an environmental point of view are the following, ordered by decreasing risk level in 2030–2040:


Figure 5: Widespread impacts attributed to climate change, based on the available scientific literature since the AR4. Symbols indicate categories of attributed impacts, the relative contribution of climate change (major or minor) to the observed impact, and confidence in attribution; absence from the map of additional impacts does not imply that such impacts have not occurred.


Figure 6: Regional key risks and potential for risk reduction through adaptation and mitigation. Risk levels (very low to very high) are presented for three time frames: present, near term (2030–2040) and long term (2080–2100, for 2°C and 4°C of global mean temperature increase above pre-industrial levels).

• Increased damage from extreme heat events and wildfires

• Increased water restrictions

• Increased damage from river and coastal floods

In contrast, the IPCC does not find risks to food production in Europe.

Planetary boundaries

Planetary Boundaries is based on the Planetary Boundaries framework from the Stockholm Resilience Centre, and there is a TED talk presenting the project (Figure 7 summarizes the most relevant challenges involved in assessing the boundaries). The idea in 2009 was to have 28 internationally renowned scientists identify and quantify the first set of nine planetary boundaries within which humanity can continue to develop and thrive for generations to come (Rockström [28]). Crossing these boundaries could generate abrupt or irreversible environmental changes; respecting them reduces the risk to human society of crossing such thresholds. Figure 8 presents the values obtained by the scientists of the project as the indicators against which human activity must be compared. There are currently three processes on planet Earth whose values lie beyond the point at which the planet is sustainable: climate change, the rate of biodiversity loss and the nitrogen cycle. Although the project is not directly a resource that generates original data and trends for the future, it marks the areas of environmental sustainability where policies are most needed in order to preserve the biosphere. In a sense, it marks the systems in which public policy action is most likely to occur in the immediate future in order to reduce the risks generated by human activities.


Figure 7: Most relevant challenges for assessing planetary boundaries.


Figure 8: Summary of the boundaries of the planet.

Indicators for sustainable development: The Bossel report

The International Institute for Sustainable Development published a report entitled “Indicators for Sustainable Development: Theory, Method, Applications” that proposes an indicator set for sustainable development based on an accurate methodology. The proposal is summarized in Figure 9. The idea is to assess the dynamics of global sustainability by means of several indicators that capture different states of the system in three subsystems. The states are the following:


Figure 9: The Bossel report’s different types of systems with their different basic orientors.

Existence

The system is compatible with and able to exist in the normal environmental state.

Effectiveness

The system should, on balance, be effective in its efforts to secure scarce resources.

Freedom of action

The system must have the ability to cope in various ways with the challenges posed by environmental variety.

Security

The system must be able to protect itself from the detrimental effects of environmental variability, i.e., variable, fluctuating and unpredictable conditions outside the normal environmental state.

Adaptability

The system should be able to learn, adapt and self-organize to generate more appropriate responses to challenges posed by environmental change.

Coexistence

The system must be able to modify its behaviour to account for the behaviour and interests of other (actor) systems in its environment.

Reproduction

Self-reproducing (autopoietic) systems must be able to reproduce (either as individuals and/or as populations).

Psychological needs

Sentient beings have psychological needs that must be satisfied.

Responsibility

Conscious actors are responsible for their actions and must comply with a normative reference.

The three subsystems are the following:

a) Human

Includes social system, individual development and government

b) Support

Includes infrastructure and economic system

c) Natural

Resources and environment

Contextual data on socioeconomic trends

Data on socioeconomic trends is necessary in order to adapt the models to plausible developments in the social and economic factors of individuals.

Education

Individual factors affecting the propensity to innovate, based on theories of users as innovators, make it necessary to contextualize the formal educational level of the population in the target years of the forecast.

Digital connectedness

The most basic and encompassing indicator of digital connectedness comes from the World Development Indicators of the World Bank. The indicator measures the number of Internet users per 100 people. Based on these data, Figure 10 presents our own elaboration of the time series of the indicator by region. The figure shows that, although Western Europe has most of the countries with over 40% of individuals connected, there are still wide differences across EU countries.


Figure 10: Evolution of the Internet penetration across the world, by regions (defined by the UN).

Levels of trust

The Corruption Perception Index is nowadays the de facto standard for measuring differences in levels of trust across countries. The index, generated by Transparency International, covers all countries. It is a combination of observational data (one can go to the country and measure it, or use international agency reports) and surveys of individuals who have constant contact with governmental bureaucracies. In its first editions, it was based only on surveys of individuals. Nowadays it uses up to 13 different data sources, one of which is the original Transparency International survey on bribes. In 2012 a report on the methodological issues of the index was released, and since then the latest versions have adopted a new methodology that compensates for “eventual errors among sources”, hence increasing the reliability of the score (Michaela [29]). However, the combination of the different sources is still based on simple aggregation after standardizing the different variables, so no models are used to generate the index (a minimal sketch of this kind of aggregation follows).
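A minimal sketch of the standardize-and-average aggregation described above, with invented countries and source scores; this illustrates the general approach, not Transparency International’s exact procedure.

```python
# Minimal sketch: z-score each source across countries, then average per country.
import statistics

sources = {  # country -> raw scores from three hypothetical sources
    "A": [61, 7.0, 0.55],
    "B": [48, 5.1, 0.40],
    "C": [72, 8.2, 0.70],
}

# Standardize each source column, then average the z-scores per country.
columns = list(zip(*sources.values()))
z_columns = [[(x - statistics.mean(col)) / statistics.stdev(col) for x in col]
             for col in columns]
index = {c: statistics.mean(z) for c, z in zip(sources, zip(*z_columns))}
print(index)
```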

Contextual data on political and institutional systems

The widespread adoption of the concept of governance in recent years has led to several initiatives that attempt to measure the concept with indices, in order to rank countries and provide incentives for policy development. This section reviews the potential and limitations of the most relevant governance indices, paying special attention to those that attempt to cover virtually all countries in the world: the Ease of Doing Business Index and the Worldwide Governance Indicators.

Ease of Doing Business Index

The Ease of Doing Business Index (EDBI) attempts to rank countries by business regulation specifically, but it contains several aspects relevant to governance in general. It includes information on 10 topics: starting a business, dealing with construction permits, getting electricity, registering property, getting credit, protecting investors, paying taxes, trading across borders, enforcing contracts and resolving insolvency. The raw data comes from surveys administered to “legal practitioners or professionals who regularly undertake the transactions involved” and uses two types of data and indicators: legal indicators, and time-and-motion indicators. The preferred outcome provided is not an index but a ranking of the countries. A rank is probably good for journalists, but it is problematic for systematic treatment, mostly because it does not give any sense of the distance between two countries (it assumes a uniform distribution of the distances between them). In addition to using a rank, the EDBI averages equally over the ranks on the topics (a minimal sketch of this aggregation follows). In the documentation they admit having considered other methodologies, specifically principal components and unobserved components, but state that these were discarded on the basis that they “yield a ranking nearly identical to the simple average” (World Bank [30]).
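A minimal sketch of ranking by the simple average of topic ranks, as the EDBI documentation describes; the countries and topic ranks are invented for illustration.

```python
# Minimal sketch: final ranking as the mean of per-topic ranks (1 = best).
topic_ranks = {  # country -> rank on each of several topics
    "A": [3, 1, 2, 4],
    "B": [1, 2, 3, 2],
    "C": [2, 3, 1, 1],
}
mean_rank = {c: sum(r) / len(r) for c, r in topic_ranks.items()}
# Countries ordered by their average topic rank.
ranking = sorted(mean_rank, key=mean_rank.get)
print(ranking, mean_rank)
```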

Worldwide Governance Indicators

The Worldwide Governance Indicators (WGI) from the World Bank comprise not a single measure of governance but six different indicators that refer to different aspects of the concept, namely: voice and accountability; political stability and absence of violence/terrorism; government effectiveness; regulatory quality; rule of law; and control of corruption (World Bank [31]). The raw material is also original data collected specifically for the project through surveys of public, private and NGO sector experts.

The six WGI are quite elaborate in the sense that there is a formal model behind the weighting of the indicators, and they represent the most technically solvent measure of governance so far. An aggregation based on an Unobserved Components Model (UCM) is used. UCM starts from the premise that “each of the individual data sources provides an imperfect signal of some deeper underlying notion of governance that is difficult to observe directly” (Kaufmann et al. [32]). But UCM is used to technically solve the problem of different scales between the variables, not to establish whether each variable is more or less important or more or less aligned with the resulting index. The measurement imperfections of each variable are treated by the UCM as nuisance disturbances ε_jk, where k refers to the data source, with its potential imperfections, and j refers to a country. UCM is a partial improvement because it considers, although in a very limited way, some notion of uncertainty (a simplified sketch of the UCM logic follows). Figure 12 presents the 2014 scores on one of the components, Regulatory Quality. As can be seen, there is not much variation across countries in Europe, which may be problematic if we want substantially meaningful differences across governance indices in our project. On the other hand, a nice feature of the WGI is that the six different indicators allow us to refine the two proposed measures of governance. For instance, the proposed top-down/bottom-up approach would fit better being measured by the Voice and Accountability indicator, whereas the globalized/localized approach would be more problematic.
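A heavily simplified sketch of the UCM intuition: each source k gives a noisy signal y_jk = g_j + ε_jk of country j’s unobserved governance level, and sources with smaller assumed error variance receive larger weight. This is an illustrative stand-in, not the actual WGI estimator; the variances and scores are assumptions.

```python
# Minimal precision-weighted aggregation in the spirit of the UCM premise.
source_variance = [0.2, 0.5, 1.0]           # assumed noise variance per source
scores = {"A": [0.8, 0.5, 1.1], "B": [-0.3, 0.1, -0.6]}

weights = [1 / v for v in source_variance]  # precision weights: 1 / variance
total = sum(weights)
estimate = {j: sum(w * y for w, y in zip(weights, ys)) / total
            for j, ys in scores.items()}
print(estimate)
```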


Figure 11: 2014 edition of the Corruption Perception Index for the European Union and Western Europe.

Bertelsmann Foundation

The Bertelsmann Foundation manages the Sustainable Governance Indicators (SGI), which contain three dimensions: policy performance, democracy and governance. Their operationalization is based on qualitative assessments by an expert network together with quantitative data. The indicator tries to be as encompassing as possible, to the point that it includes more than 110 variables across the three dimensions. Beyond the fact that with so many variables the focus of the concept may be lost, its major limitation is that it only covers OECD and EU countries.


Figure 12: Map of the 2014 scores on Regulatory Quality (one of the six components of the WGI indicators).

Measurement indicators

The desirable properties of indicators necessary to forecast user innovation in Europe in 2030/2050 are the following:

a) Robustness

The indicators must be central to the discussion of the domains of energy, living, mobility and food.

b) Independence

The indicators should not be dependent on endogenous variables in user innovation.

c) Precision

The indicators must be clear enough not to produce discussion about their meaning.

d) Decentralization

The indicators must be able to be collected from different sources, and not be dependent on a single source of data that may have incentives to modify its estimations based on market or political purposes.

Food is the domain most affected by climate change according to the IPCC, and with high confidence in the conclusions: “All aspects of food security are potentially affected by climate change, including food production, access, use, and price stability. For wheat, rice, and maize in tropical and temperate regions, climate change without adaptation is projected to negatively impact production at local temperature increases of 2°C or more above late-20th-century levels, although individual locations may benefit (medium confidence). Projected impacts vary across crops and regions and adaptation scenarios, with about 10% of projections for the 2030–2049 period showing yield gains of more than 10%, and about 10% of projections showing yield losses of more than 25%, compared with the late 20th century. Global temperature increases of 4°C or more above late-20th-century levels, combined with increasing food demand, would pose large risks to food security, both globally and regionally” (IPCC [33]). As for living, the IPCC states that “Until mid-century, projected climate change will impact human health mainly by exacerbating health problems that already exist (very high confidence). Throughout the 21st century, climate change is expected to lead to increases in ill-health in many regions and especially in developing countries with low income, as compared to a baseline without climate change (high confidence) [34]. Health impacts include greater likelihood of injury and death due to more intense heat waves and fires, increased risks from foodborne and waterborne diseases, and loss of work capacity and reduced labour productivity in vulnerable populations (high confidence). Risks of undernutrition in poor regions will increase (high confidence). Risks from vector-borne diseases are projected to generally increase with warming, due to the extension of the infection area and season, despite reductions in some areas that become too hot for disease vectors (medium confidence). Globally, the magnitude and severity of negative impacts will increasingly outweigh positive impacts (high confidence)”. The list of measurement indicators is the following:

Unemployment in the European Union

According to the Bossel report, this is a marker for assessing the effectiveness of human systems. Having an effective human system is an adequate measure for forecasting the level of activity of a human population in the future. Higher unemployment is associated with less general activity outside the home (mobility) and more activity at home.

Gross world product per person

According to the Bossel report, this is a marker for assessing the effectiveness of the support system. The effectiveness of the support system should provide an accurate forecast of the need for energy in the future. A greater gross world product is associated with higher levels of energy use.

Grain yield efficiency

According to the Bossel report, this is a marker for assessing the effectiveness of the natural system. The effectiveness of the environmental resources necessary for the system should provide an accurate forecast of the way in which energy is effectively used.

Share of population age 60 and over

According to the Bossel report, this is a marker for assessing the lack of restrictions on the freedom of action of humans. Having such a marker is necessary for the overall forecasting of mobility needs in the future. A higher share of population aged 60 and over is associated with less general activity (mobility).

Energy productivity in industrial nations

According to the Bossel report, this is a marker for assessing the efficiency of energy use in the infrastructure and economic system.

Water use as share of total runoff

According to the Bossel report, this is a marker for assessing the freedom of action provided by natural resources.

Electric vehicle sales

The single most important indicator of the development of mobility in Europe in the following years is the rate of adoption of electric vehicles.

Grain surplus factor

According to the Bossel report, this is a marker for the existence of the human system. Having such a marker provides the tendency of the capacity of humans to produce enough food. A higher grain surplus factor is associated with a better use of energy and higher activity in the food domain.

Debt as share of GDP in developing countries

According to the Bossel report, this is a marker for the existence of the support system. Having such a marker provides necessary indicators of the capacity of developing countries to support food generation. Higher debt in developing countries is associated with greater efforts in those countries towards primary sectors, which in turn is associated with less necessity in Europe.

World fish catch

According to the Bossel report, this is a marker for the existence of the natural system. Having such a marker is necessary for forecasting the overall availability of food resources.

Discussion and Conclusions

The proposed strategy for forecasting the future of user innovation in the domains of energy, mobility, food and living, when innovations are sustainable, is built on the combination of several elements presented in this document. First, a combination of forecasting models may be used. While the flexibility of agent-based models and their convenience for explaining interactions within systems is valuable, they must be combined with simple time-series based models for series of data for which forecasts are available and have a low level of uncertainty (i.e. increases in temperature). The inclusion of the Delphi method may also help in providing quantifiable values for aggregated social behaviour. Second, the combination of purely environmental data sources, such as the Intergovernmental Panel on Climate Change, with selected variables from the Bossel report is adequate for setting the limits of the contextual factors of the model. These sources can also be combined with contextual data on political and institutional systems, with special emphasis on the facility for policy change and overall governance.

Conflicts of interest

The author declares that there are no conflicts of interest.

Acknowledgements

This research is part of the large-scale project “Sustainable Lifestyles 2.0: End User Integration, Innovation and Entrepreneurship (EU-Innovate)”. The project has received funding from the European Union’s Seventh Framework Programme for research, technological development and demonstration under grant agreement no. 613194.

References