Sourcing Historical Wind data?


I'm looking for a service that provides hourly historical wind data for the past 24 hours. I'm currently using the openweathermap.org API, but it turns out that not all stations provide this feature.

Are there any other free services that provide this data?


You can try finding a station on Wunderground that fits your needs and then use their API to get what you want.
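For illustration, here's a minimal Python sketch of the kind of request involved, using OpenWeatherMap's One Call "timemachine" endpoint. Treat it as a sketch only: the endpoint path, plan availability and exact response fields should be checked against the current API docs, and the API key and coordinates are placeholders.

```python
import time

import requests

API_KEY = "YOUR_API_KEY"   # placeholder: your OpenWeatherMap key
LAT, LON = 52.52, 13.405   # placeholder coordinates (Berlin)

# One Call "timemachine" endpoint: historical data for a Unix timestamp.
# Whether it is available, and how far back, depends on your plan.
URL = "https://api.openweathermap.org/data/3.0/onecall/timemachine"

def wind_last_24h(lat, lon):
    """Fetch hourly wind speed and direction for the past 24 hours."""
    records = []
    now = int(time.time())
    for hours_back in range(24):
        resp = requests.get(URL, params={
            "lat": lat, "lon": lon, "dt": now - hours_back * 3600,
            "appid": API_KEY, "units": "metric",
        })
        resp.raise_for_status()
        for obs in resp.json().get("data", []):
            records.append({"time": obs["dt"],
                            "wind_speed": obs.get("wind_speed"),
                            "wind_deg": obs.get("wind_deg")})
    return records

for rec in wind_last_24h(LAT, LON):
    print(rec)
```

The same pattern applies to Wunderground: find a station near your point of interest, then request its hourly history through their API.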


Geographic Information Systems (GIS): GIS Data

For a more extensive list of federal data sources, see: Maps and Mapping Resources from Government Agencies.

  • Best place to start for U.S. Government data sets.
  • Open depository with data contributed by organizations including government agencies, nonprofit groups, and academic institutions.
  • Downloadable US Census data tables and mapping files from 1790 to present.
  • (FEMA) GIS atlas of hazards and potential disasters in the U.S.; data sets and maps available for download.
  • (U.S. Federal Geographic Data Committee)
  • (Earth Resources Observation Systems)
  • (NCEI) Climate, weather, and geophysical data (formerly NCDC and the National Geophysical Data Center).
  • (USDA)
  • Find data for topics including biology, coasts, energy, environmental health, geology, land resources, minerals, natural hazards, and more.

Looking for ToxMap? This resource was retired by the United States government in December 2019. Check out Mapping Waste from One Region Forward. This site includes maps of waste sites (hazardous, radioactive, and solid waste), resource extraction sites and air pollution emitters in Erie, Niagara and Chautauqua counties.


Geographic Information Systems

USGS is a primary source of geographic information system (GIS) data. Our data and information are presented in spatial and geographic formats, including The National Map, EarthExplorer, GloVis, LandsatLook, and much more.

Hydrologic Unit Maps

The U.S. is sub-divided into successively smaller hydrologic units which are classified into four levels: regions, sub-regions, accounting units, and cataloging units. Each unit is identified by a unique hydrologic unit code (HUC) consisting of two to eight digits based on its classification. This site provides information and data for current and historical hydrologic units, names, and numbers.
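As a concrete illustration of that hierarchy, here is a small Python sketch that splits an HUC into its nested levels (the sample code is arbitrary):

```python
def parse_huc(huc: str) -> dict:
    """Split a 2-8 digit hydrologic unit code into its nested levels.

    Each successive pair of digits identifies a finer unit:
    region (2), sub-region (4), accounting unit (6), cataloging unit (8).
    """
    if len(huc) not in (2, 4, 6, 8) or not huc.isdigit():
        raise ValueError("HUC must be 2, 4, 6, or 8 digits")
    levels = ["region", "sub-region", "accounting unit", "cataloging unit"]
    return {levels[i // 2 - 1]: huc[:i] for i in range(2, len(huc) + 1, 2)}

print(parse_huc("01080204"))
# {'region': '01', 'sub-region': '0108',
#  'accounting unit': '010802', 'cataloging unit': '01080204'}
```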

Hydrogeology of the Adelaida Area, San Luis Obispo County, CA

The USGS is conducting a comprehensive evaluation of groundwater resources of the Adelaida area. Use this map to explore the hydrogeology of the area, including land use, geology, and USGS hydrologic data by watershed or water management district.

USGS Hydrologic Conditions Network for New York

Hydrologic Conditions Network Map displays Streamflow Monitoring Network, Groundwater Bedrock Aquifer Monitoring Wells, and Groundwater Unconsolidated Aquifer Monitoring Wells in New York State Drought Regions.

GIS shapefile: Sarasota County, Florida irrigated agricultural land-use for the 2018 growing season

This data set consists of a detailed digital map of individual irrigated fields and a summary of the irrigated acreage for the 2018 growing season developed for Sarasota County, Florida. Selected attribute data that include crop type, irrigation system, and primary water source were collected for each irrigated field.

GIS shapefile: Citrus, Hernando, Pasco, and Sumter Counties, Florida irrigated agricultural land-use from January through December 2019

This data set consists of a detailed digital map of the extent of fields and a summary of the irrigated acreage for the period between January and December 2019 compiled for Citrus, Hernando, Pasco, and Sumter Counties, Florida. Attributes for each field include a general or specific crop type, irrigation system, and primary water source.
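As a sketch of how such a shapefile might be summarised once downloaded, using the widely used geopandas library; the file name and attribute column names below are assumptions, so check the data set's metadata for the actual fields:

```python
import geopandas as gpd

# Hypothetical file name; the actual shapefile is distributed by USGS.
fields = gpd.read_file("sarasota_irrigated_2018.shp")

# Reproject to a planar CRS so areas are meaningful; EPSG:2237
# (NAD83 / Florida West, US feet) is one reasonable choice here.
fields = fields.to_crs(epsg=2237)
fields["acres"] = fields.geometry.area / 43_560  # square feet -> acres

# Column names are assumptions; check the data set's attribute table.
summary = (fields.groupby(["CROP_TYPE", "IRR_SYSTEM"])["acres"]
                 .sum()
                 .sort_values(ascending=False))
print(summary)
```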

Public-Supply Well Water Quality Results: Inorganic Data and Trends, 1974 - 2014 (California GAMA-PBP)

The GAMA-PBP Public-Supply Well Results data viewer allows the user to visualize and download California water-quality data and trends for 1974-2014. Groundwater-quality data for 38 inorganic constituents are captured and can be downloaded for individual sites or by grid cell.

CoSMoS Implementation versions

The Coastal Storm Modeling System (CoSMoS) makes detailed predictions (meter-scale) over large geographic scales (hundreds of kilometers) of storm-induced coastal flooding and erosion for both current and future sea-level rise (SLR) scenarios, as well as long-term shoreline change and cliff retreat. Several versions of CoSMoS have been implemented for areas of the California coast.


Geographic Information Science and Technology Master’s Program Online

The online M.S. in Geographic Information Science and Technology (GIST) equips students with an understanding of how spatial data shapes decisions in the modern world and provides hands-on experience with leading-edge GIS tools.

Designed and updated by GIS experts from diverse backgrounds, the program offers both the foundational and specialized knowledge necessary for sourcing spatial data and applying sophisticated spatial analysis to virtually any industry or academic domain.

USC M.S. in GIST graduates come from all varieties of professional backgrounds, but they tend to share scientific curiosity and an interest in spatial thinking. They are lifelong learners who want to use their knowledge to solve problems within their respective fields.

“USC and the GIST program at the Spatial Sciences Institute have changed my life. The coursework was intellectually stimulating and challenging, and gave me the flexibility and skill set required to pursue a real-world thesis project.”

– Trevor Denson,
M.S. in GIST, Class of 2017,
Wilderness Ranger,
National Park Service

Request Brochure

Fill out the information below to learn more about the University of Southern California’s online GIS Graduate Programs and download a free brochure. If you have any additional questions, please call 877-650-9054 to speak to an enrollment advisor.

The University of Southern California respects your right to privacy. By submitting this form, you consent to receive emails and calls from a representative of the University of Southern California, which may include the use of automated technology. Consent is needed to contact you, but is not a requirement to register or enroll.

M.S. in GIST Career Opportunities

M.S. in GIST professionals hold a unique position within the overall career market: they have specialized expertise, yet the application of their knowledge is limitless in scope. Government organizations hire GIS professionals for roles ranging from disaster planning and preparedness to community development. The military leverages GIS in surveillance operations and threat response. Businesses in many different industries utilize GIS expertise across supply chain logistics, marketing and countless other industry domains.

USC’s online M.S. in GIST degree program is tailored for individuals who want to develop deep GIS expertise, those interested in senior-level GIS roles, or current GIS professionals who wish to pursue leadership positions.

Through the robust coursework, consulting with their thesis advisors and connecting with the GIS community, many of our GIST master’s students find career opportunities and advancement before they finish the program.

“While still completing my thesis, I landed my dream job working at Orca Maritime Inc. in Imperial Beach, California. My job came about through networking at a local Geospatial LinkedIn event which showcased specialized GIS engineering research projects.”

– Nate Novak,
M.S. in GIST, Class of 2016,
Geospatial Technologies Manager at Orca Maritime, Inc.

Curriculum Summary

The program consists of 28 total units, including 20 units of foundational coursework and a choice of electives. This allows students to complete the program in as few as 20 months.

The core courses provide a comprehensive understanding of spatial thinking, data acquisition and the characteristics of spatial information, as well as providing a capstone thesis experience. By selecting a track, students can develop specialized expertise designed to support their career path.

We have identified three tracks that enable students to build robust knowledge and capabilities to support them in their career trajectories. Each track offers a choice of electives that have been especially curated to provide students with the opportunity to customize their course plan.

M.S. in GIST Program Tracks

  • Track 1: Geospatial Information Management
  • Track 2: Geospatial Programming and Software Development
  • Track 3: Geospatial Analytics

Visit the curriculum page for detailed information on each of the tracks.

GIS Fieldwork on Catalina Island

Our M.S. in HSGI, M.S. in GIST and graduate certificate in GIST programs will take students on a weeklong fieldwork excursion to the Wrigley Institute for Environmental Studies — a half-acre research and educational campus on Catalina Island.

Veteran Funding

The Dornsife Spatial Sciences Institute proudly offers a variety of funding options for Active Duty and Veteran service members enrolled in online GIS graduate programs.


Literature review

In recent decades, countries and societies have been threatened by terrorism, cyber attacks and warfare. Government departments in charge of defence and national security are, therefore, searching for techniques to improve their data sets, and hence their ability to respond to or even prevent these activities, with up-to-date or even real-time information to obtain a better and faster understanding of critical situations. It is believed that crowdsourcing can be key to improving defence decision-making (Greengard 2011; Parsons 2011; Franklin et al. 2013; Shanley et al. 2013).

This section presents an overview of previous work linking crowdsourcing, VGI and georeferenced social media data with defence use cases, organised around some of the tasks relevant to UK defence (see Sect. 2). In particular, previous work highlights that VGI and open data can prove effective in civil emergencies and can increase the understanding of new environments and the strategic intelligence of operations. Additionally, the concept of data quality is analysed, and previous work on assessing the quality of crowdsourced data is summarised, in order to underpin the subsequent discussion of how the fitness-for-purpose of crowdsourced data can be assessed in a defence context.

Clarification of crowdsourcing terminology

Crowdsourcing, as previously mentioned (see Sect. 1), is an open participative activity in which anyone can propose a specific data-collection task to a number of people. Goodchild (2007) introduced VGI as an “umbrella” term for the geographic information created by everyday users, privately and voluntarily. More broadly, from a sociological perspective, collaborative and social computing (social media) (Roy et al. 2017) can rapidly provide up-to-date information, which in some cases is also geolocated. The real-time data produced on social media platforms such as Twitter and Flickr can provide georeferenced data captured by individuals, although this is not specifically “volunteered” (See et al. 2016). Fischer (2012) therefore defined this type of data as “involuntary geographic information” (iVGI), which can also be used for various activities. Both VGI and iVGI are considered in this paper.

As noted above, the availability of georeferenced data through VGI and iVGI services and platforms increases interest in this field for defence domains. Many authors in defence use crowdsourcing as the chosen term to describe the phenomenon of voluntary and involuntary citizen participation, and this paper follows that convention. We thus use the term “Crowdsourced Geographic Information” (See et al. 2016) to include both VGI and iVGI. As Fig. 1 shows, crowdsourcing is the main term used to characterise both spatial and aspatial data produced by passive and active users in participation activities; the spatial data produced by active users or volunteers are referred to as VGI, while the data produced by social media users (or passive users) are referred to as iVGI.

Fig. 1: Crowdsourced information can contain georeferenced and non-georeferenced information. The georeferenced information is referred to as crowdsourced geographic information and comprises spatial crowdsourced information produced either by volunteers (VGI) or by social media or other users such as satellite navigation systems (iVGI). Figure compiled by the authors based on the review in Sect. 3.1.

Crowdsourcing in defence

An extensive review has highlighted that there is relatively little literature linking crowdsourcing and defence, perhaps due both to the relative novelty of crowdsourced data and to the fact that much defence research is, of necessity, confidential. The published literature has focussed on strategies to improve strategic intelligence and defend national territory, and on the potential for crowdsourcing to provide additional information in times of crisis or disaster.

Crowdsourcing in defence and warfare

Many Western departments of defence have national defence and warfare preparedness as their primary objective. For instance, the first UK MOD task refers to defending the UK and overseas territories and contributing to their security and resilience (see Sect. 2). War-fighting preparedness is also the first mission of the United States (US) Department of Defense (DoD), which has long understood the importance of GIS and highly accurate data for use during warfare operations (Franklin et al. 2013). Franklin et al. (2013) noted that data are now frequently co-produced, involving non-traditional stakeholders (non-DoD GIS professionals, Information Technology professionals and daily GIS users) and three partner types: clients, citizens and volunteers. They also explored the use of volunteered information for military operations and noted that these multiple producers mean the US DoD increasingly faces challenges relating to data integrity and data security.

To build on this work, the US Defense Advanced Research Projects Agency (DARPA) investigated various strategies for improving its intelligence information. The importance of crowdsourcing in this context was examined via a funded challenge, the “DARPA Network Challenge”, during which solutions and techniques were proposed by groups and individuals and crowdsourcing issues were examined (Greengard 2011; Hui 2015). The prize was either money or recruitment possibilities. The results showed that the intelligence of the agency can be improved by many groups, especially university teams but also individual experts. Motivational issues were also investigated by DARPA during the “Red Balloon Challenge” (Hui 2015), which offered money as a reward for correct balloon locations. The winning team solved the challenge by crowdsourcing through social media.

Crowdsourcing has also been used to support warfare in an indirect manner. The US State Department organised a challenge to look for ideas on how crowdsourcing could support arms control transparency, offering a cash prize (Hui 2015). The winner improved arms control inspections by using visible light communications.

Crowdsourcing for homeland security

Addressing challenges closer to home than traditional warfare, the Texas state government in the USA created the “Texas Virtual BorderWatch” (Tewksbury 2012; Hui 2015). As Texas is on the border with Mexico, numerous migrants try to cross the border every day without authorisation. Due to illegal immigration and drug smuggling issues, additional techniques to improve homeland security have been investigated (Tewksbury 2012). A network of web-based surveillance cameras was created, and the people, transformed into “interactive citizen-soldiers” (Andrejevic 2006), reported any suspicious activities in inaccessible zones such as rivers or wooded areas. Although the immobility of the cameras was identified as one of the technical issues in this process, it was an effort in which awareness of criminality was improved by using crowdsourcing techniques (Tewksbury 2012). In a similar move, the US Department of Homeland Security (DHS) created the Neighbourhood Network Watch program to collect reports of suspicious online criminal activities (Hui 2015).

Addressing another homeland security issue, the 2011 civil unrest in England, where thousands of people rioted after the fatal shooting of a citizen by police, the authorities used a combination of social media platforms (Flickr, Facewatch ID and Twitter) to recognise the offenders’ faces (Suleyman 2017). Thousands of photographs were shared via Flickr and sorted by postal code, and the authorities were informed of any recognised rioter’s face via Facewatch ID. Finally, a hashtag was used by the authorities to collect all the information needed regarding the looters.

Use of social media to increase awareness and assist short-term predictions in a crisis

During military operations, where the environment is usually hostile and dangerous, it is important to decrease the level of unknowns. With the diffusion of terrorism and the expansion of military operations around the world, the analysis of crowdsourced data, especially social media data, is often the only easily available source of information. Thus, social media can reduce the level of uncertainty in new environments and also increase knowledge within areas of command responsibility (Mayfield 2011). As a consequence, efficient use of social media data may lead to a better understanding of crisis situations, in collaboration with allies, partners and multilateral institutions (see Sect. 2), for peacekeeping purposes.

Social media can improve situational awareness by examining not only the environment as a whole but also specific target users, and a number of commercial tools have been developed to support this task. Rapid Information Overlay Technology is an extreme-scale analytics system with defence objectives that can “spy” on a user’s activities and habits (Gallagher 2013). The data can be mined from the most popular social media platforms such as Facebook and Twitter, and statistics of the user’s daily habits can be collected and predicted. Another example is Wikistrat, an analytical services consultancy that uses crowdsourcing to improve client awareness; it has provided predictions for a military operation for US Africa Command and a prediction of the activities of the Islamic State (IS) in the Middle East by analysing social media data (Hui 2015).

Aiming to improve the accuracy of short- and medium-term event predictions, the Aggregative Contingent Estimation System (ACES) was a predictive tool that used the answers of participants, who were asked questions relating to several scenarios, to assess the opinions of the crowd (Parsons 2011). The scenarios investigated included the decision of the US military to repeal the “don’t ask, don’t tell” regulations (with regard to LGBT personnel) and the resignation of Yemen’s president Ali Abdullah Saleh. The ACES tool, with the assistance of 1800 participants, correctly predicted both results.
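The aggregation idea behind such tools is simple: pool many participants’ probability judgements and score the pooled forecast against the eventual outcome. Below is a toy Python sketch using an unweighted opinion pool and a Brier score; the numbers are invented, and this is not a description of the actual ACES algorithm.

```python
# Each participant gives a probability that an event will occur.
forecasts = [0.8, 0.65, 0.9, 0.7, 0.55]  # invented judgements
outcome = 1  # the event occurred

# Unweighted linear opinion pool: the mean of the crowd's probabilities.
crowd_p = sum(forecasts) / len(forecasts)

# Brier score: squared error of the probability (lower is better).
brier = (crowd_p - outcome) ** 2
print(f"crowd probability: {crowd_p:.2f}, Brier score: {brier:.3f}")
```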

Real versus fake news

In 2006, Hezbollah forces posted a number of videos and photographs on social media to “promote” the war against Israeli forces (Mayfield 2011). Using this technique, the Islamist militant group and political party of Lebanon was able to influence national troops and citizens and managed to win an unexpected and difficult battle. Moreover, in 2009, an innocent woman was shot during protests over the Iranian presidential election; several versions of the video capturing the incident, some of them fake, were posted on social media, creating a wave of global reactions and a negative climate that took the Iranian government weeks to control. These two examples demonstrate that social media can be used for both positive and negative manipulation of the crowd during warfare or a national crisis. Social media is open to all and can include terrorist websites, which could be used by extremists and terrorists (Hope and McCann 2017). Crowdsourced information, and especially social media, does not always have a positive impact, because criminal activities can exploit it for their own purposes (Johnson 2014).

Taking advantage of the effect that social media can have on the crowd, terrorist organisations try to promote their objectives and manipulate social media users by spreading fake news via well-known platforms. A RAND Corporation analysis of 23 million tweets posted in Arabic by 771,327 Twitter users found that ISIS supporters produced approximately 50% more tweets per day than their opponents between July 2014 and April 2015 (Bodine-Baron et al. 2016). This is one of the first examples of a terrorist organisation successfully using a social media platform to promote its message, stimulate and recruit new fighters and spread its propaganda. This has driven the US DoD to explore new techniques to decrease ISIS influence, and Twitter to continue its campaign of account suspensions. The research showed that the number of supporters was reduced as a result of Twitter’s account suspensions.

More recently, additional attention has been paid to the growing amount of fake news in the media. One example is the story of a girl trapped in rubble after the catastrophic earthquake in Mexico, which was based on the account of a single witness and was reproduced multiple times on social media. As a result, Mexican rescue forces spent hours on the rescue effort before realising that they had no valid proof that the incident was real (Associated Press 2017).

Crowdsourcing and cyber attacks

Cyber space is a relatively new field where crowdsourcing is also relevant (Hui 2015). Defending and securing cyber space is one of the UK MOD’s missions (see Sect. 2). Johnson (2014) has explored a number of cyber attacks (Estonia 2007, Belarus 2008, Lithuania 2008, Georgia 2008, China 2009, W32.STUXNET 2010) where the criminals shared malware through social media and networks, investigating the threats to human–machine interaction in multi-layered networks. After explaining the difficulties that commercial and government organisations have in predicting the potential groups that take part in cyber attacks, he identified four ways in which social media could be linked to cyber attacks: (1) social networks motivate participation through crowdsourcing; (2) social networks are used to target individuals via phishing; (3) “disposable service models” associated with social networks aid the coordination of an attack; and (4) the use of anti-social networks supports botnets and associated criminal infrastructure, where social networking can block the attribution of cyber attacks.

Research on the advantages of crowdsourcing for cyber security was carried out by the US DHS and the Center for Risk and Economic Analysis of Terrorism Events (CREATE) (Hui 2015). It appeared that encouraging individuals and institutions to voluntarily cooperate to secure cyber space works effectively; one example is a group of volunteer experts from various institutes and China’s Ministry of Information Industry who, by combining simple techniques, overcame a challenging computer virus named “Conficker” and identified its creator (Hui 2015).

Crowdsourcing to support civil disasters and emergencies

Another field where crowdsourcing has already proved useful for military operations is humanitarian aid and disaster relief (Roy et al. 2017). Supporting humanitarian assistance and disaster response and conducting rescue missions in times of crisis fall under UK defence policy (see Sect. 2). When traditional methods and reference data are not available, when conditions are uncertain, or when there would be an excessive delay before a data collection exercise could be initiated, the required information can be collected via crowdsourcing and social media. Indeed, crowdsourcing is an efficient way to collect near real-time data in disaster response (Ortmann et al. 2011), especially when the population is vulnerable and crisis mapping is required (Shanley et al. 2013). Related terms such as “digital volunteerism” (Shanley et al. 2013) and VGI (Goodchild 2007; Filho 2013) are used in this context, highlighting the role that volunteers can play in improving situational awareness for rapid response to natural disasters and complex humanitarian emergencies (Shanley et al. 2013).

Use of VGI and iVGI in disaster response

Mills and Chen (2009) explain the benefits of using Twitter during emergency situations, including low cost, ease of use, scalability, rapidity and the use of visualisation, compared with more traditional methods such as messaging or other platforms such as Facebook. The wide range of Twitter use in disaster emergencies (fires, ice storms, earthquakes, hurricanes, cyclones) makes it a low-cost alternative for collecting geolocated information easily and rapidly. An alternative idea for implementing crowdsourced data in disaster response is “Tweak the Tweet”, which asks users to adopt structured hashtags during a disaster so that their tweets become machine-readable (Finin et al. 2010).
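A small sketch of the parsing that such structured hashtags enable follows; the hashtag syntax used here is illustrative of the idea, not the exact Tweak the Tweet grammar.

```python
import re

# Illustrative structured-tweet syntax: "#need <text> #loc <text> #num <n>"
tweet = "#need drinking water #loc Main St bridge #num 40"

FIELDS = ("need", "loc", "num")
pattern = re.compile(r"#(" + "|".join(FIELDS) + r")\s+([^#]+)")

# Each recognised hashtag becomes a machine-readable key/value pair.
record = {key: value.strip() for key, value in pattern.findall(tweet)}
print(record)  # {'need': 'drinking water', 'loc': 'Main St bridge', 'num': '40'}
```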

Shanley et al. (2013) have presented examples where volunteerism was used in disaster management and emergency response, citing occasions when US organisations, including the US Geological Survey and the US Federal Emergency Management Agency, used VGI when it was almost infeasible to collect data from authoritative sources. Different techniques were created by projects such as “Do-It-Yourself”, “Did You Feel It?” and “Civil Air Patrol volunteers”, where non-expert users collected information on natural disasters (earthquakes and hurricanes) via open-source map platforms such as OSM or via cameras (attached to balloons, kites and unmanned aerial vehicles, UAVs) when it was not possible to approach the area (Parsons 2011; Shanley et al. 2013).

Becker and Bendett (2015) also presented several examples (Camp Roberts, the DCMO challenge, the STAR-TIDES Network) of how crowdsourcing can be implemented in disaster response. The “TIDES” program, implemented by the National Defense University Center for Technology and National Security Policy and the US DoD, investigates the use of open-source knowledge to assist in cases such as disaster response where the population is in crisis, aiming to provide decision makers with the necessary knowledge (Becker and Bendett 2015). Providing end-users with needed information is beneficial in civil emergencies because human lives are under threat and rapid, effective decisions need to be taken. The authors also note that VGI can provide close to real-time information, citing in particular the example of the Haitian earthquake in 2010, when 640 volunteers in the Humanitarian OSM Team (HOT) collected urgently needed cartographic information inexpensively and very rapidly.

It is essential that crowdsourced information can be collected as quickly as possible for disaster response. The research project “Evolution of Emergency Copernicus services” (E2mC) has created a new component called “Witness” for the Copernicus Emergency Management Service (EMS), which can reduce the time needed to integrate crowdsourced data after a disaster event (Havas et al. 2017). The Witness architecture includes data acquisition, storage, management and analysis, and graphical user interface components, which have been used in several recent disasters such as the Central Italy earthquake (2016) and the Haiti hurricane (2016).

Data quality and crowdsourcing

From a producer’s perspective, data quality refers to how well the data produced, for a specific purpose, conform to a representation or abstraction of the real world (Devillers and Jeansoulin 2006; Harding 2006). However, the end-user’s perspective differs, and as crowdsourced data can be open data, they can end up being used for purposes beyond those for which they were originally collected. Even so, it is important that the producer understands as far as possible the end-user’s needs (Harding 2006), and in order for any data set to be used appropriately, the user needs a sufficient understanding of its quality, i.e. how “fit for purpose” the data are. This is particularly the case in defence, where decisions made both in warfare and in disaster management can directly impact human lives. Initial discussions of the need to describe the quality of spatial data began towards the end of the 1980s (Goodchild and Gopal 1989), focussing specifically on the accuracy of the data (Chrisman 2006), with accuracy split into geometric accuracy and semantic accuracy. When the need to measure quality increased, due to an increasing variety of data sources and data sets, it was realised that accuracy forms only one component of quality, and that issues such as currency, completeness and others are also relevant. A key challenge thus relates to the need to find a concept that can represent data quality overall.

General approaches to measuring data quality

Standards provide specific information about the quality of a data set, and this information is derived by measuring the quality of the data itself. Quality can be categorised as internal or external. Internal quality refers to the level of similarity between the data that have been produced and the “perfect” data that should have been produced, while external quality refers to the level of agreement between the produced data and the user’s needs. External quality has more recently, and more correctly, been described as “fitness for use”, “fitness for purpose” or “fitness for use for a certain purpose” (Devillers and Jeansoulin 2006; Dorn et al. 2015; Fonte et al. 2015).

Data quality assessments can also be extrinsic or intrinsic. When a data set is assessed by comparison with the previously mentioned authoritative or reference data, the approach is extrinsic (Girres and Touya 2010; Haklay 2010; Helbich et al. 2012; Zielstra and Hochmair 2013; Barron et al. 2014). To evaluate quality using an intrinsic approach, only one data set is needed (Barron et al. 2014).
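As a minimal sketch of an extrinsic assessment, assuming matched pairs of VGI and reference points are already available (the coordinates below are invented), positional error can be summarised with the mean error and RMSE:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6_371_000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Matched (VGI point, reference point) pairs -- invented example data.
pairs = [((51.5007, -0.1246), (51.5008, -0.1245)),
         ((51.5033, -0.1196), (51.5030, -0.1200))]

errors = [haversine_m(*vgi, *ref) for vgi, ref in pairs]
rmse = sqrt(sum(e ** 2 for e in errors) / len(errors))
print(f"mean error: {sum(errors) / len(errors):.1f} m, RMSE: {rmse:.1f} m")
```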

Measuring the quality of crowdsourced geographic information

As noted in Sect. 1, technological evolution and low-cost tools enable public participation in the data capture process. However, as end-users of the resulting data are not aware of the accuracy level of the devices and/or of the expertise or training level of the volunteers, the necessity for quality evaluation prior to use becomes even more important (Ali and Schmid 2014). Both intrinsic and extrinsic measures can be applied to crowdsourced geographic data, depending on the availability of “ground truth” data. While geometric evaluation is important, semantic inconsistency is also a major issue in VGI (Ali and Schmid 2014). For example, a road in OSM is tagged with the key “highway”, one of the most important OSM road tags (Pourabdollah et al. 2013); the values of that key, such as “motorway”, clarify the type of road, so the key alone is not always sufficient to describe a feature.
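A small sketch of such a semantic check on an OSM XML extract follows; the file name is a placeholder, and the set of “specific” highway values is deliberately incomplete.

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Deliberately incomplete list of specific highway values.
SPECIFIC = {"motorway", "trunk", "primary", "secondary", "tertiary",
            "residential", "service", "footway", "cycleway"}

counts = Counter()
# Placeholder file name: any OSM XML extract, e.g. from the Overpass API.
for _, elem in ET.iterparse("extract.osm"):
    if elem.tag != "way":
        continue
    tags = {t.get("k"): t.get("v") for t in elem.findall("tag")}
    if "highway" in tags:
        value = tags["highway"]
        counts[value if value in SPECIFIC else "other/unspecific"] += 1
    elem.clear()  # free the processed way's children

print(counts.most_common())
```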

Basing the analysis on available quality standards, a number of studies have compared VGI data sets with authoritative data sets (Ali and Schmid 2014; Antoniou and Skopeliti 2015). These assume that the available authoritative data can be treated as reference data, i.e. that the quality of the authoritative data sets is high (Antoniou and Skopeliti 2015) and that such data sets are “correct”.

Barron et al. (2014) have created a framework to evaluate VGI quality intrinsically by presenting various methods and indicators. Additional examples include Dorn, Törnros and Zipf (2015), who present a new extrinsic land-use quality approach; Barron, Neis and Zielstra (2014), who compared various VGI projects and platforms for measuring quality and presented a number of OSM trends; Fonte et al. (2015), who combined existing quality assessment methods for validating land cover maps; and Girres and Touya (2010), who evaluated OSM quality intrinsically and extrinsically for a number of elements and indicators at a national level.

However, due to licensing issues and restrictions related to some authoritative data sets, it is not always possible to evaluate VGI quality by comparison with authoritative sources (Barron et al. 2014; Antoniou and Skopeliti 2015). Therefore, alternative evaluation methods have been proposed using intrinsic analysis, with the new measures described as quality indicators (Antoniou and Skopeliti 2015). A list of these indicators was presented by Fonte et al. (2017), who also proposed indicators specific to VGI, expressing the view that either existing standards need to be updated to take into account new sources and reports, or new standards need to be developed. In addition, Degrossi et al. (2017) summarised, in a systematic literature review, 13 quality methods that can be used for crowdsourced geographic sources when authoritative ones are not available; however, some of these are not suitable for social media, and the need for further research is indicated. Likewise, Senaratne et al. (2017) outlined 30 methods that can be used to assess the VGI quality of maps, images and text, proposing data mining as an autonomous approach for estimating VGI quality.

Standard approaches to documenting and communicating spatial data quality

Once assessed, the quality of a spatial data set can be documented in many ways, from an online webpage to a PDF document, with the main objective being to increase the level of confidence in spatial data products. To facilitate interoperability and ease of comparison of data sets, a number of standardised metadata structures have also been created (Antoniou and Skopeliti 2015; Dorn et al. 2015). One of the best-known standards organisations is the International Organization for Standardization (ISO), and several principles and guidelines have been proposed by the ISO to assess data quality (Barron et al. 2014; Antoniou and Skopeliti 2015). ISO 19113 and ISO 19114 describe quality principles for geographic information (Van Exel et al. 2010; Barron et al. 2014; Neis and Zielstra 2014; Dorn et al. 2015). ISO 19157 provides a more up-to-date approach and replaces the previous versions (Barron et al. 2014; Neis and Zielstra 2014; Antoniou and Skopeliti 2015). It defines a list of six quality elements: completeness, logical consistency, positional accuracy, temporal quality, thematic accuracy and usability (Barron et al. 2014; Antoniou and Skopeliti 2015). The first five elements focus on the producer (internal quality), while the last one (usability) focuses on user needs (external quality) (Fonte et al. 2017). To measure data quality, ISO’s principles and assessments assume that geographic data can be compared with authoritative or reference data.
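For illustration, a quality report organised around these six elements could be recorded as a simple metadata structure; the sketch below uses invented values.

```python
# A minimal metadata record structured around the six ISO 19157
# quality elements; all values are invented for illustration.
quality_report = {
    "completeness": {"omission_pct": 4.2, "commission_pct": 1.1},
    "logical_consistency": {"topology_errors": 13},
    "positional_accuracy": {"rmse_m": 5.8, "reference": "national survey"},
    "temporal_quality": {"last_verified": "2017-03-01"},
    "thematic_accuracy": {"classification_correct_pct": 91.5},
    "usability": {"fit_for": ["small-scale mapping"],
                  "not_fit_for": ["cadastral mapping"]},
}

for element, measures in quality_report.items():
    print(f"{element}: {measures}")
```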


List of GIS data sources

This is a list of GIS data sources that provide information sets that can be used in geographic information systems and spatial databases for purposes of geospatial analysis and cartographic mapping. This list categorizes the sources of interest.








GIS Mapping and Data Management

Geographic Information Systems (GIS) and data management are critical to project success by providing effective storage, management and distribution of valuable data.

Our GIS mapping services allow you to visualise and understand data by identifying relationships, patterns and trends, providing a digital platform for viewing and processing layers of spatial information.

GIS Services
We offer a unique foundation of modelling, environmental and asset management experience from a wide range of industry sectors which enables us to provide quality solutions to our clients’ problems. This is supported by a proven track record of web and standalone GIS systems for differing operational landscapes and our ability to transfer expertise across sectors.

Intertek continuously develops innovative tools to ensure effective project specific applications. We work with clients to meet their specific needs. These tools analyse and relate complex data sets and provide a unique visual experience for clients.

Data Management
Our experts support you during feasibility and constraints analysis, site selection, data collation and management, data interpretation and analysis, data conversion, integration and interoperability, and real-time monitoring.
Intertek provides you with a range of products and services to serve your specific needs. These may include Interactive WebGIS Solutions, document management systems or asset management systems.

Our Total Quality Assurance experts work with a broad spectrum of data and seamlessly integrate these to provide clients with a coherent picture of their operations. This includes survey data, monitoring data, modelling data and data from third parties.

Our GIS mapping and data management services consolidate information and improve understanding of each data set and how it relates to other data within a spatial and temporal context. Intertek has strict quality assurance protocols, and our systems and applications are developed using leading industry-standard GIS software. Our specialists carry a breadth of expertise in capturing and extracting maximum value from your geospatial data and can provide custom GIS solutions.


Transcript

NICK BEARMAN: I'm Nick Bearman, and I work at UCL as a teaching fellow in the Department of Geography, and I also run geospatial training solutions, where I run training sessions helping people understand spatial data. [What is spatial geographic data, and how did you become interested in the field?] I'm really interested in how we can use spatial data to solve some of the geographical problems

NICK BEARMAN [continued]: facing the world. Probably nearly every research project has got some element of spatial data, location data, within it-- whether it be where people are, like in the census: where people live, where people go to work, where they get their shopping, how you interact with other people. It's a whole area of epidemiology, looking at disease and how disease spreads through a population.

NICK BEARMAN [continued]: So it's a really wide field that can cover so many different areas. I was originally very interested in geography, and had quite an interest in computers as well. And this was kind of before GIS was as mainstream as it is now. And it seemed like the real combination of the two areas. So the geography from kind of how

NICK BEARMAN [continued]: the world works, and computers allowing us to manipulate a whole range of data, and how we can use that together to try and understand the world better. [What is a geographic information system (GIS), and how can it be used?] GIS is geographic information systems,

NICK BEARMAN [continued]: or geographic information science, depending on which bit you want to use. And it's all about the technology behind maps and behind spatial data. So the systems side is how we collect the data, how we map the data, how we put it on the map, and the technology and the software behind that. So it can include GPS, include SATNAV, things like Google

NICK BEARMAN [continued]: Earth, that sort of thing. The science side-- geographic information science-- is more the academic side. So how do we come up with new methods? How do we store this data we've got as spatial data? How do we store the coordinates? What are coordinates? What's the projection system? How do we put these things on a map? And how do we go from our 3D globe

NICK BEARMAN [continued]: that we've got-- how do we put that on a flat, 2D map? GIS can be used to answer any question that involves where. So a lot of the areas that I work with will be very interested in where people are. And this can be range of temporal scales. So whether it be like for a census-- so every 10 years, we get a very detailed snapshot

NICK BEARMAN [continued]: of where people live and a lot of information about them, all the way through to much finer temporal information. So for example, we can use mobile phones to track footfall in a shopping center. So how long do different people spend in certain shops? Which areas of the shops are more popular? Where are you most likely to see them? And that, you can kind of get data every 30 seconds, almost.
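To make the globe-to-flat-map point above concrete: going from latitude/longitude on the 3D globe to flat 2D map coordinates is a projection transform, and any GIS library will do it in a couple of lines. A minimal Python sketch with pyproj (the coordinates are illustrative, not from the interview):

```python
# Reproject a point from WGS84 latitude/longitude (EPSG:4326) to the
# flat British National Grid (EPSG:27700). A minimal sketch; the
# coordinates are illustrative placeholders.
from pyproj import Transformer

# always_xy=True fixes the axis order as (longitude, latitude)
transformer = Transformer.from_crs("EPSG:4326", "EPSG:27700", always_xy=True)

lon, lat = -0.1276, 51.5074  # roughly central London
easting, northing = transformer.transform(lon, lat)
print(f"({lon}, {lat}) -> ({easting:.0f} m E, {northing:.0f} m N)")
```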

NICK BEARMAN [continued]: [How have developments in technology improved GIS?] With census data, originally it was very much tabular-based. So computerized, but very little geography. With the developments in computer power, we can look at the spatial element of the geography and the census data much more easily,

NICK BEARMAN [continued]: and gain much more information. But we're still probably looking at 100,000 data points, or something like that. So over the last kind of five years, we've come into much more of the big data area. So this can use data from a range of sources. So for example, we can do some very interesting analysis with loyalty card data.

NICK BEARMAN [continued]: And for a kind of big brand [INAUDIBLE] a loyalty card, you're looking at tens of millions of records. And we couldn't do that five years ago. We didn't have the computer power to do that. But now we can do some very interesting analysis with that-- look at spending patterns, and look at where people live versus where they go shopping, or whether they do click and collect.

NICK BEARMAN [continued]: Where do they pick up their purchases from? [What types of data can be collected and analyzed using GIS?] You can get spatial data in many types. The census is one of the kind of key underpinning ones. Certainly in the last 20 or 30 years, the census has been a key kind of geographic data source.

NICK BEARMAN [continued]: But we also have a range of surveys. So the British Household Panel Survey is a very good source. It's very detailed, but the geography element of it is sometimes a little bit lacking, depending on exactly what you want to do. Because we don't have that many people in each geographic zone. We can also do some very interesting analysis with Twitter data, looking at either geo-located tweets--

NICK BEARMAN [continued]: so these are tweets people send with the GPS on their phone turned on. So it gives you a set of coordinates for where they sent that tweet from. Or looking at Twitter profiles to see where people put as their hometown, and to see how they're distributed and what sorts of tweets people send. And Twitter is a great resource for this sort of thing because the data is all freely available.

NICK BEARMAN [continued]: It's all public, so we can do quite a lot of work with that. There's a whole range of different data sources that we can use spatial data with. So this covers everything from things like the census and household surveys all the way through to LIDAR surveys and remote sensing. So the availability of data has increased dramatically

NICK BEARMAN [continued]: over the last five years. And even within the last two or three years, the availability of remotely sensed data-- so satellite data-- is now quite amazing. And the detail that we can get from that is very high. So we can do lots of interesting environmental analysis looking at, whether it be vegetation change, whether it be wildfires

NICK BEARMAN [continued]: and how they're spreading, even through to more kind of social science aspects. So there's a company launching what they call cube sats, which are very small satellites about 30 centimeters by 10 by 10. So kind of very, very small. And they've got quite high resolution cameras. And they're looking at whether they can

NICK BEARMAN [continued]: use them to monitor traffic in London, in terms of where cars are, and parking spaces, as well. So how does the traffic flow, and where are they most likely to get a parking space? So we can apply these sorts of data across the whole range of geography-- everything from social science, human geography, all the way through to physical geography and environmental science.

NICK BEARMAN [continued]: [What computational skills and methods do you need when working with large scale data?] The biggest skill anyone can get if they're looking to do this sort of thing is coding of some description. Because there are a lot of GIS tools out there. But to use some of these new data sources, we need to be able to create these tools, as well.

NICK BEARMAN [continued]: So the skills of coding are key to this. Probably the two biggest languages to know would be R and Python. And they both can do some really good data science with geographic data. So I would recommend to anybody interested in that area, have a look at learning one or both of those. And even if you don't think of yourself

NICK BEARMAN [continued]: as a programmer or a coder, you can get into it and learn it. They're not very hard to learn. The kind of learning curve is a little bit steep, but it's well worth it. Because it allows you to do so much more with the data sets we have available. There are lots of sorts of analysis you can do with spatial data. One of the kind of key building blocks

NICK BEARMAN [continued]: is choropleth maps, particularly looking at rates of different variables. So disease is quite a common one. There's a whole range you can do with that. And so understanding how those work, and how you can create a choropleth map, is a great way of communicating data. And it also explains some of the kind of theory

NICK BEARMAN [continued]: behind GIS, and why we need to collect data in certain ways. Heat maps do come up quite frequently. And they're a very useful way of getting a quick overview of the data. So they can make very pretty maps quite quickly. But there's a limit to what you can do from a scientific point of view with those, in terms of interpreting them.
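A choropleth like the ones described here takes only a few lines with geopandas. This is a hedged sketch, not anything from the interview: the file name areas.shp and the cases and population columns are hypothetical, and, anticipating the rates-versus-counts advice below, it maps a rate per 1,000 rather than a raw count, using one of the ColorBrewer palettes bundled with matplotlib:

```python
# Minimal choropleth sketch with geopandas. The file and column names
# ("areas.shp", "cases", "population") are hypothetical placeholders.
import geopandas as gpd
import matplotlib.pyplot as plt

areas = gpd.read_file("areas.shp")

# Map a rate, not a raw count, so the result isn't just a population map.
areas["rate_per_1000"] = areas["cases"] / areas["population"] * 1000

# "YlOrRd" is a ColorBrewer palette shipped with matplotlib.
ax = areas.plot(column="rate_per_1000", cmap="YlOrRd", legend=True)
ax.set_axis_off()
plt.show()
```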

NICK BEARMAN [continued]: [What are the key things to keep in mind when visualizing data?] When you're visualizing geographic data, it's very important to make sure you know something about the data set you're visualizing. So where does it come from? And what can you do with it? Because we collect data sets differently,

NICK BEARMAN [continued]: depending on what we're looking for. And so you can only use certain data sets for certain things. So rather than just finding some data on the internet and chucking it on a map, you need to actually have a look at the data. Think about: what is it showing? What is it collecting? So being critical is very key to that. Another thing to be aware of is how old the data are.

NICK BEARMAN [continued]: Again, it depends on what you're trying to show, what message you're trying to communicate. But being aware of how old the data might be, how up to date it is, is quite important. Another thing to bear in mind about data visualization, particularly with choropleth maps, is to make sure you're visualizing rates-- say, a rate of disease per 1,000 population,

NICK BEARMAN [continued]: or whatever it might be, rather than just the count of people. Because if you're trying to show how the rate varies across, for example, the UK, if you just see the number of cases, then you just get a map of population. It doesn't tell you anything, you know. It tells you most of the people are in big cities, say London or Birmingham. Which we know already. So it's really important to make sure to use

NICK BEARMAN [continued]: the rate, or the per capita, for the data. [How do you manage large scale geographic data?] When working with geographic data, it's very easy to end up with lots of data. So it's good to have a system in place for managing that. There is a list as long as your arm of different formats

NICK BEARMAN [continued]: of spatial data. And in some ways, the format you use isn't too vital, as long as you know what the data are. And probably the best thing I can recommend to anybody is make sure you have a good filing system in your My Documents folder. So have different folders for different things. And make sure you know what you're saving where.

NICK BEARMAN [continued]: Spatial data is normally stored as files or databases, so files are probably the easiest way to start off with. But as soon as you get into any bigger projects, you'll be ending up doing stuff with databases. So if you want to go into the area, PostgreSQL is probably one of the most common ones. So that will come up a lot. So it's worth having a look at that and seeing how that works.
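To give a flavor of the database route, here is a minimal sketch of reading spatial data out of PostgreSQL/PostGIS into Python with geopandas; the connection string, table, and column names are hypothetical placeholders:

```python
# Read spatial data from PostgreSQL/PostGIS into a GeoDataFrame.
# Connection details and the table/column names are hypothetical.
import geopandas as gpd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@localhost:5432/gisdb")

# The result behaves just like data read from a shapefile, so the rest
# of an analysis doesn't care where the data came from.
gdf = gpd.read_postgis(
    "SELECT id, name, geom FROM output_areas",
    con=engine,
    geom_col="geom",
)
print(gdf.head())
```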

NICK BEARMAN [continued]: It's not difficult to set up. And it's just a different way of thinking about data and how it's stored. So again, spend just a bit of time thinking about how it's stored, how it's organized. And you can get a long way with that. And databases are going to become much more prevalent when we're looking at accessing data through the internet and real time data. Because 95% of data on the internet

NICK BEARMAN [continued]: is stored in databases of some form. So accessing that database and using that data is a key skill. [Tell us about your project "PopChange."] So one of the projects I've been working on recently is a project called PopChange, where we are looking at census data. But we want to do small-area comparisons over time.

NICK BEARMAN [continued]: In the UK, the smallest area of census data that we get is called an output area. And there are about 100 people or so, give or take. But over the last kind of five censuses-- so 2011, 2001, all the way back to 1971-- these changed slightly. Sometimes quite dramatically. Sometimes only slightly. But it makes comparing these small areas over time

NICK BEARMAN [continued]: quite difficult. So in the PopChange Project, we reallocated populations from our output area geographies to a one-kilometer grid across the whole country. And that's a consistent grid over time. So that allows us to do this small area comparison to see how a smaller community, or a small area of the city,

NICK BEARMAN [continued]: has changed over these last 50 years. The data management for that was quite intense because of the amount of data that we've had. And we used a mixture of databases and files for that. And one of the key skills for creating that was using R.
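PopChange itself was built in R, as he says next; purely as an illustration of the reallocation idea (apportioning each output area's population to 1 km grid cells by share of overlapping area), here is a sketch in Python with geopandas, with hypothetical file and column names, assuming both layers share a projected CRS so areas are meaningful:

```python
# Area-weighted reallocation of population from census output areas to
# a regular 1 km grid -- a sketch of the idea only (PopChange used R).
# File and column names are hypothetical; both layers are assumed to
# share a projected CRS so .area is in square meters.
import geopandas as gpd

oas = gpd.read_file("output_areas.shp")  # has a "population" column
grid = gpd.read_file("grid_1km.shp")     # has a "cell_id" column

oas["oa_area"] = oas.geometry.area

# Intersect the layers; each piece carries attributes from both inputs.
pieces = gpd.overlay(oas, grid, how="intersection")

# Assume population is spread evenly within an output area, so each
# piece receives a share proportional to its share of the OA's area.
pieces["pop_share"] = pieces["population"] * pieces.geometry.area / pieces["oa_area"]

# Sum the shares within each grid cell to get a consistent-grid total.
grid_pop = pieces.groupby("cell_id")["pop_share"].sum().reset_index()
print(grid_pop.head())
```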

NICK BEARMAN [continued]: So we used the R programming language to process the data and manage the data. So that's a really important aspect. [Did you learn anything unexpected while working on this project?] Whenever you're dealing with the area of computer science, even from a GIS point of view, where

NICK BEARMAN [continued]: we're using computer science in combination with others, the rate of change is quite amazing, really. And the different technologies that you come across change very rapidly. So we were working with a software developer, and he used a language called Clojure to create the web interface for the PopChange Project.

NICK BEARMAN [continued]: I'd never come across that before. It's one of many languages out there. And it was quite well-suited for what he wanted to do. But he'd actually also only used that language a few times. So as a part of the project, it was learning how that works. But he's proficient in a whole range of other languages, so it's very easy for him to adapt. And that's the common theme throughout, certainly,

NICK BEARMAN [continued]: geographic data science and GIS, as well. You'll always get new technologies and languages coming along. So you always keep learning. So I've been using R for quite a few years now. But I've started to do more with Python now. And no doubt, in another two or three years, there'll be another language that comes along,

NICK BEARMAN [continued]: or another way of doing things. So part of the key element of this is to make sure you keep your skills up to date, and sort of see what the new things are that are coming about. And you will end up spending time having to learn them. But that goes for everyone. Nobody ever stops learning in this area. [What tools would you recommend using for this type

NICK BEARMAN [continued]: of research?] There's a whole range of different tools that we can use to create maps. One of the things you'll get used to is how different tools work. So I can really recommend trying different tools, whether that be just visiting a website

NICK BEARMAN [continued]: and seeing how they work. If there's a training opportunity that comes along to your university or department or wherever you're based, go along to that and see how it works. And spend some time playing about with the software. One particular tool I found really useful, particularly for choropleth maps, is something called ColorBrewer. And that's a website that helps you choose colors that

NICK BEARMAN [continued]: work well for choropleth maps. And other tools are becoming very popular, particularly open source ones. So we've mentioned R already. QGIS is a very good open source GIS program that you can run on your desktop. So I'd really recommend checking that out. And there are a whole range of web-based tools

NICK BEARMAN [continued]: for doing web maps. So Mapbox and Leaflet are very, very common ones-- very popular. And CARTO is one that's come on the scene relatively recently, and is worth checking out. But there will always be more tools coming along. So there's no such thing as a definitive list of tools. There will always be new things to add. [What advice would you give people looking to do research

NICK BEARMAN [continued]: using GIS?] If you're quite new to using spatial data, the first thing I'd recommend is, have a look at Google Earth, if you haven't already. It's an amazing piece of software. And when it was launched, it revolutionized how everyone perceived spatial data. Because before that, it was quite

NICK BEARMAN [continued]: difficult to get hold of spatial data. You needed very specialist software. When that came out, it allowed anybody to look at the globe as it is, and zoom in on different locations and see what their local area looked like, see what their house looked like. And that is a very basic GIS. It doesn't have all the kind of editing capabilities. But it still manages spatial data.

NICK BEARMAN [continued]: So that's a great resource to start with. And it's also worth thinking about, what data do you have in your field? Because with the research you've done already, you will be using data. I guarantee that you'll do something with data. And 90% of all data has got some spatial component. So it might be nice and easy, like an address, or a route,

NICK BEARMAN [continued]: or a location of data that you've collected. And that we can put on a map very easily. It might be something not so directly spatial. So if you're doing textual analysis, you might be looking at location names, different locations that are mentioned in a book, or something like that, perhaps. Or you might have to think a little bit more laterally.
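Putting something "nice and easy, like an address" on a map usually means geocoding it first. A minimal sketch with the geopy library and the free Nominatim (OpenStreetMap) geocoder; the address and user_agent string are illustrative placeholders:

```python
# Geocode an address to coordinates so it can be put on a map.
# The address and user_agent are illustrative placeholders.
from geopy.geocoders import Nominatim

geolocator = Nominatim(user_agent="example-geocoder")
location = geolocator.geocode("Department of Geography, UCL, London")

if location is not None:  # geocode returns None when nothing matches
    print(location.latitude, location.longitude)
```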

NICK BEARMAN [continued]: And so if you're dealing with customers, where do they use your service, if that's the area? And where are they based? How do they travel to get to your service? So there's a whole range of different data sets that have a spatial element. And you know the data set you're using very well.

NICK BEARMAN [continued]: So there's probably some kind of geographic element of that. And you can bring those skills to the table when you work with someone working in GIS, or when you create your own map. So there's a whole range of different skills you can bring to that. [MUSIC PLAYING]

Video Info

Series Name: Nick Bearman

Publisher: SAGE Publications Ltd

Publication Year: 2019



Terminology

  • Offshoring is moving the work to a distant country. If the distant workplace is a foreign subsidiary/owned by the company, then the offshore operation is a captive, [12] sometimes referred to as in-house offshore.[13]
  • Offshore outsourcing is the practice of hiring an external organization to perform some business functions ("Outsourcing") in a country other than the one where the products or services are actually performed, developed or manufactured ("Offshore"). [14] Insourcing entails bringing processes handled by third-party firms in-house, and is sometimes accomplished via vertical integration.
  • Nearshoring refers to outsourcing to a nearby country.
  • Farmshoring refers to outsourcing to companies in more rural locations within the same country. [15]
  • Homeshoring (also known as homesourcing) is a form of IT-enabled "transfer of service industry employment from offices to home-based ... with appropriate telephone and Internet facilities". [16][17] These telecommuting positions may be customer-facing or back-office, [18] and the workers may be employees or independent contractors.
  • In-housing refers to hiring employees [19][20] or using existing employees/resources to undo an outsourcing. [21][22]
  • An Intermediary is when a business provides a contract service to another organization while contracting out that same service. [23][24]

Acronyms

Some of the acronyms related to BPO (Business Process Outsourcing) are:

  • EPO - Engineering Process Outsourcing
  • ITO - Information Technology Outsourcing
  • KPO - Knowledge Process Outsourcing
  • LPO - Legal Process Outsourcing[25]
  • RPO - Recruitment Process Outsourcing
  • HRO - Human Resource Outsourcing

Motivations

Global labor arbitrage can provide major financial savings from lower international labor rates, which could be a major motivation for offshoring. Cost savings from economies of scale and specialization can also motivate outsourcing, even if not offshoring. Since about 2015 indirect revenue benefits have increasingly become additional motivators. [26] [27] [28]

Outsourcing can offer greater budget flexibility and control by allowing organizations to pay for the services and business functions they need, when they need them. It is often perceived to reduce the need for hiring and training specialized staff, to make specialized expertise available, and to decrease capital, operating expenses, [33] and risk.

"Do what you do best and outsource the rest" has become an internationally recognized business tagline first "coined and developed" [34] in the 1990s by management consultant Peter Drucker. The slogan was primarily used to advocate outsourcing as a viable business strategy. Drucker began explaining the concept of "Outsourcing" as early as 1989 in his Wall Street Journal article entitled "Sell the Mailroom". [35]

Sometimes the effect of what looks like outsourcing from one side and insourcing from the other side can be unexpected: The New York Times reported in 2001 that "6.4 million Americans ... worked for foreign companies as of 2001, [but] more jobs are being outsourced than" [the reverse]. [36]

Agreements

Two organizations may enter into a contractual agreement involving an exchange of services, expertise, and payments. Outsourcing is said to help firms to perform well in their core competencies, fuel innovation, and mitigate a shortage of skill or expertise in the areas where they want to outsource. [37]

20th century

Following the adding of management layers in the 1950s and 1960s to support expansion for the sake of economy of scale, corporations found that agility and added profits could be obtained by focusing on core strengths; the 1970s and 1980s were the beginnings of what later was named outsourcing. [38] Kodak's 1989 "outsourcing most of its information technology systems" [39] was followed by others during the 1990s. [39]

In 2013, the International Association of Outsourcing Professionals gave recognition to Electronic Data Systems Corporation's Morton H. Meyerson [40] who, in 1967, proposed the business model that eventually became known as outsourcing. [41]

IT-enabled services offshore outsourcing

Growth of offshoring of IT-enabled services, although not universally accepted, [42] [43] both to subsidiaries and to outside companies (offshore outsourcing) is linked to the availability of large amounts of reliable and affordable communication infrastructure following the telecommunication and Internet expansion of the late 1990s. [44] Services making use of low-cost countries included

  • back-office and administrative functions, such as finance and accounting, HR, and legal
  • call centers and other customer-facing departments, such as marketing and sales services
  • IT infrastructure and application development
  • knowledge services, including engineering support, [45] product design, research and development, and analytics.

Early 21st century

In the early 21st century, businesses increasingly outsourced to suppliers outside their own country, sometimes referred to as offshoring or offshore outsourcing. Other options subsequently emerged: nearshoring, crowdsourcing, multisourcing, [46] [47] strategic alliances/strategic partnerships, strategic outsourcing. [48]

From Drucker's perspective, a company should only seek to subcontract in those areas in which it demonstrated no special ability. [49] The business strategy outlined by his slogan recommended that companies should take advantage of a specialist provider's knowledge and economies of scale to improve performance and achieve the service needed. [50]

In 2009, Peter Drucker was posthumously inducted into the Outsourcing Hall of Fame in recognition of his outstanding work in the field. [49]

Limitations due to growth

Inflation, high domestic interest rates, and economic growth pushed India's IT salaries up 10-15%, making some jobs relatively "too" expensive compared to other offshoring destinations. Areas for advancing within the value chain included research and development, equity analysis, tax-return processing, radiological analysis, and medical transcription.

Offshore alternatives

Japanese companies outsourced to China, particularly to formerly Japanese-occupied cities. [51] German companies have outsourced to Eastern European countries with German-language affiliation, such as Poland and Romania. [52] French companies outsource to North Africa for similar reasons.

For Australian IT companies, Indonesia is one of the major choices of offshoring destination. Its near-shore location, common time zone and adequate IT workforce are the reasons for offshoring IT services to Indonesia.

Growth of white-collar outsourcing

Although offshoring initially focused on manufacturing, white-collar offshoring/outsourcing has grown rapidly since the early 21st century. The digital workforce of countries like India and China is paid only a fraction of what would be minimum wage in the US. On average, software engineers are paid between 250,000 and 1,500,000 rupees (US$4,000 to US$23,000) in India, as opposed to $40,000-$100,000 in countries such as the US and Canada. [53] Closer to the US, Costa Rica has become a popular destination, offering the advantages of a highly educated labor force, a large bilingual population, a stable democratic government, and time zones similar to those of the United States. It takes only a few hours to travel between Costa Rica and the US. Companies such as Intel, Procter & Gamble, HP, Gensler, Amazon and Bank of America have big operations in Costa Rica. [54]

Unlike outsourced manufacturing workers, outsourced white-collar workers can choose their working hours and which companies to work for. Clients benefit from telecommuting and from savings on office space, management salaries, and employee benefits, as these individuals are contracted workers. [55]

However, ending a government outsourcing arrangement has its difficulties too. [56]

While U.S. companies do not outsource to reduce high top level executive or managerial costs, [57] they primarily outsource to reduce peripheral and "non-core" business expenses. [58] Further reasons are higher taxes, high energy costs, and excessive government regulation or mandates.

Mandated benefits like social security, Medicare, and safety protection (OSHA regulations) are also motivators. [59] By contrast, executive pay in the United States in 2007, which could exceed 400 times that of average workers (a gap 20 times bigger than it was in 1965 [57]), is not a factor. [60]

Other reasons include reducing and controlling operating costs, improving company focus, gaining access to world-class capabilities, tax credits, [61] freeing internal resources for other purposes, streamlining or increasing efficiency for time-consuming functions, and maximizing use of external resources. For small businesses, contracting/subcontracting/"outsourcing" might be done to improve work-life balance. [62]

There are many outsourcing models, with variations [63] by country, [64] year [65] [66] and industry. [67]

Another approach is to differentiate between tactical and strategic outsourcing models. Tactical models include:

Innovation outsourcing

When offshore outsourcing knowledge work, firms heavily rely on the availability of technical personnel at offshore locations. One of the challenges in offshoring engineering innovation is a reduction in quality. [69]

Co-sourcing

Co-sourcing is a hybrid of internal staff supplemented by an external service provider. [70] [71] Co-sourcing can minimize sourcing risks, increase transparency and clarity, and lend itself toward better control than a fully outsourced arrangement. [72]

Co-sourcing services can supplement internal audit staff with specialized skills such as information risk management or integrity services, or help during peak periods, or similarly for other areas such as software development or human resources.

Identity management co-sourcing

Identity management co-sourcing is when on-site hardware [73] [74] interacts with outside identity services.

This contrasts with an "all in-the-cloud" service scenario, where the identity service is built, hosted and operated by the service provider in an externally hosted, cloud computing infrastructure.

Offshore Software R&D Co-sourcing

Offshore Software R&D is the provision of software development services by a supplier (whether external or internal) located in a different country from the one where the software will be used. The global software R&D services market, as contrasted to Information Technology Outsourcing (ITO) and BPO, is rather young and currently is at a relatively early stage of development. [75]

Countries involved in outsourced software R&D

Canada, India, Ireland, and Israel were the four leading countries as of 2003. [75] Although many countries have participated in the offshore outsourcing of software development, their involvement in co-sourced and outsourced research and development (R&D) was somewhat limited. Canada, the second largest by 2009, had 21%. [76]

As of 2018, the top three were deemed by one source of "research-based policy analysis and commentary from leading economists" to be China, India and Israel. [77]

Gartner Group adds in Russia, but does not make clear whether this is pure R&D or run-of-the-mill IT outsourcing. [78]

The main driver for offshoring development work has been the greater availability of developers at a lower cost than in the home country. However, the rise in offshore development has taken place in parallel with an increased awareness of the importance of usability, and the user experience, in software. Outsourced development poses special problems for development: the more formal, contractual relationship between the supplier and client, and the geographical separation, place greater distance between the developers and users, which makes it harder to reflect the users' needs in the final product. This problem is exacerbated if the development is offshore. Further complications arise from cultural differences, which apply even if the development is carried out by an in-house offshore team. [79]

Historically offshore development concentrated on back office functions but, as offshoring has grown, a wider range of applications have been developed. Offshore suppliers have had to respond to the commercial pressures arising from usability issues by building up their usability expertise. Indeed, this problem has presented an attractive opportunity to some suppliers to move up market and offer higher value services. [80] [81] [82]

Legal issues

Offshore Software R&D means that company A turns over responsibility, in whole or in part, for an in-house software development to company B whose location is outside of company A's national jurisdiction. Maximizing the economic value of an offshore software development asset critically depends on understanding how best to use the available forms of legal regulation to protect intellectual property rights. If the vendor cannot be trusted to protect trade secrets, then the risks of offshoring software development may outweigh its potential benefits. Hence, it is critical to review the intellectual property policy of the potential offshoring supplier. The intellectual property protection policy of an offshore software development company must be reflected in these crucial documents: the General Agreement, the Non-Disclosure Agreement, and the Employee Confidentiality Contract. [83]

2000-2012 R&D

As forecast in 2003, [84] R&D is outsourced. Ownership of intellectual property by the outsourcing company, despite outside development, was the goal. To defend against tax-motivated cost-shifting, the US government passed regulations in 2006 to make outsourcing research harder. [85] Despite many R&D contracts given to Indian universities and labs, only some research solutions were patented. [86]

While Pfizer moved some of its R&D from the UK to India, [87] a Forbes article suggested that it is increasingly more dangerous to offshore IP-sensitive projects to India, because of India's continued ignorance of patent regulations. [88] In turn, companies such as Pfizer and Novartis have lost rights to sell many of their cancer medications in India because of lack of IP protection.

Future trends

A 2018 "The Future of Outsourcing" report began with "The future of outsourcing is digital." [89] The ""Do what you do best and outsource the rest" [34] approach means that "integration with retained systems" [89] is the new transition challenge - people training still exists, but is merely an "also."

There is more complexity than before, especially when the outside company may be an integrator. [89]

While the number of technically skilled labor grows in India, Indian offshore companies are increasingly tapping into the skilled labor already available in Eastern Europe to better address the needs of the Western European R&D market. [90]

Changed government outsourcing focus

Forbes considered the US Presidential election of 2016 "the most disruptive change agent for the outsourcing industry", [91] especially the renewed "invest in America" goal highlighted in campaigning, but the magazine tepidly reversed direction in 2019 as to the outcome for employment. [92]

Furthermore, there are growing legal requirements for data protection, where obligations and implementation details must be understood by both sides. [89] [93] This includes dealing with customer rights. [94]

Performance measurement

Focusing on software quality metrics is a good way to keep track of how well a project is performing.
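As a small, entirely hypothetical illustration, a client might track defect density (defects per thousand lines of code) release over release; the numbers below are invented for the example, and a falling trend would suggest the engagement is performing well:

```python
# Hypothetical tracking of a simple software quality metric
# (defect density per KLOC) across releases; numbers are invented.
releases = [
    {"name": "1.0", "defects": 120, "kloc": 80},
    {"name": "1.1", "defects": 95, "kloc": 88},
    {"name": "1.2", "defects": 60, "kloc": 93},
]

for r in releases:
    density = r["defects"] / r["kloc"]  # defects per 1,000 lines of code
    print(f"release {r['name']}: {density:.2f} defects/KLOC")
```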

Management processes

Globalization and complex supply chains, along with greater physical distance between higher management and the production-floor employees, often require a change in management methodologies, as inspection and feedback may not be as direct and frequent as in internal processes. This often requires the assimilation of new communication methods, such as voice over IP, instant messaging, and issue tracking systems; new time management methods, such as time tracking software; and new cost- and schedule-assessment tools, such as cost estimation software. [95] [96] [97]

The term Transition methodology [98] describes the process of migrating knowledge, systems, and operating capabilities between the two sides. [99]

Communications and customer service

In the area of call center outsourcing, especially when combined with offshoring, [100] agents may speak with different linguistic features such as accents, word use and phraseology, which may impede comprehension. [101] [102] [103] [104]

Governance

In 1979, Nobel laureate Oliver E. Williamson wrote that the governance structure is the "framework within which the integrity of a transaction is decided," adding that "because contracts are varied and complex, governance structures vary with the nature of the transaction." [105] University of Tennessee researchers have been studying complex outsourcing relationships since 2003. Emerging thinking regarding strategic outsourcing is focusing on creating a contract structure in which the parties have a vested interest in managing what are often highly complex business arrangements in a more collaborative, aligned, flexible, and credible way. [106] [107]

Security

Reduced security, sometimes related to lower loyalty, [108] may occur even when "outsourced" staff change their legal status but not their desk. While security and compliance issues are supposed to be addressed through the contract between the client and the suppliers, fraud cases have been reported.

In April 2005, a high-profile case involved the theft of $350,000 from four Citibank customers when call-center workers acquired the passwords to customer accounts and transferred the money to their own accounts opened under fictitious names. Citibank did not find out about the problem until the American customers noticed discrepancies with their accounts and notified the bank. [109]

Information Technology

Richard Baldwin's 2006 The Great Unbundling work was followed in 2012 by Globalization's Second Acceleration (the Second Unbundling), and in 2016 by The Great Convergence: Information Technology and the New Globalization. [110] It is here, rather than in manufacturing, that the bits economy can advance in ways that the economy of atoms and things can't: an early-1990s Newsweek carried a half-page cartoon showing someone who had just ordered a pizza online and was seeking help to download it.

Step-in rights

If both sides have a contract clause permitting step-in rights, [111] then there is a right, though not an obligation, [112] to take over a task that is not going well, or even the entire project.

A number of outsourcings and offshorings that were deemed failures [113] [114] [69] led to reversals [115] [116] signaled by the use of terms such as insourcing and reshoring. The New York Times reported in 2017 that IBM "plans to hire 25,000 more workers in the United States over the next four years," overlapping India-based Infosys's "10,000 workers in the United States over the next two years." [117] A clue that a tipping point had been reached was a short essay titled "Maybe You Shouldn’t Outsource Everything After All" [118] and the longer "That Job Sent to India May Now Go to Indiana."

Among the problems encountered were supply-and-demand induced raises in salaries and the lost benefits of a similar time zone. Other issues were differences in language and culture. [117] [102] Another reason for a decrease in outsourcing is that many jobs that were subcontracted abroad have been replaced by technological advances. [119]

According to a 2005 Deloitte Consulting survey, a quarter of the companies which had outsourced tasks reversed their strategy. [119]

These reversals, however, did not undo the damage. New factories often:

  • were in different locations
  • needed different skill sets
  • used more automation. [120]

Public opinion in the US and other Western powers opposing outsourcing was particularly strengthened by the drastic increase in unemployment as a result of the 2007–2008 financial crisis. From 2000 to 2010, the US experienced a net loss of 687,000 jobs due to outsourcing, primarily in the computers and electronics sector. Public disenchantment with outsourcing has not only stirred political responses, as seen in the 2012 US presidential campaigns, but it has also made companies more reluctant to outsource or offshore jobs. [119]

A counterswing depicted by a 2016 Deloitte survey suggested that companies are no longer reluctant to outsource. [121] Deloitte's survey identified three trends:

  • Companies are broadening their approach to outsourcing as they begin to view it as more than a simple cost-cutting play
  • Organizations are "redefining the ways they enter into outsourcing relationships and manage the ensuing risks".
  • Organizations are changing the way they are managing their relationships with outsourcing providers to "maximize the value of those relationships".

Insourcing

Insourcing is the process of reversing an outsourcing, possibly using help from those not currently part of the in-house staff. [122] [123] [124]

Outsourcing has gone through many iterations and reinventions, and some outsourcing contracts have been partially or fully reversed. Often the reason is to maintain control of critical production or competencies, and insourcing is used to reduce costs of taxes, labor and transportation. [125]

Regional insourcing, a related term, is when a company assigns work to a subsidiary that is within the same country. This differs from onshoring and reshoring: these may be either inside or outside the company.

Regional insourcing

Regional insourcing is a process in which a company establishes satellite locations for specific entities of their business, making use of advantages one state may have over another. [126] [127] This concept focuses on the delegating or reassigning of procedures, functions, or jobs from production within a business in one location to another internal entity that specializes in that operation. This allows companies to streamline production, boost competency, and increase their bottom line.

This competitive strategy applies the classical argument of Adam Smith, which posits that two nations would benefit more from one another by trading the goods that they are more proficient at manufacturing. [128] [129]

Net effect on jobs

To those who are concerned that nations may be losing a net number of jobs due to outsourcing, some [130] point out that insourcing also occurs. A 2004 study [131] found that in the United States, the United Kingdom, and many other industrialized countries, more jobs are insourced than outsourced. The New York Times disagreed, and wrote that free trade with low-wage countries is win-lose for many employees who find their jobs offshored or their wages stagnating. [132]

According to two estimates published by The Economist, the impact of offshore outsourcing was unequal during the period studied, 2004 to 2015, ranging from 150,000 to as high as 300,000 jobs lost per year. [133]

In 2010, a group of manufacturers started the Reshoring Initiative, focusing on bringing manufacturing jobs for American companies back to the country. Their data indicated that 140,000 American jobs were lost in 2003 due to offshoring. Eleven years later, in 2014, the United States recovered 10,000 of those offshored positions; this marked the highest net gain in 20 years. [134] More than 90% of the jobs that American companies "offshored" and outsourced to low-cost countries such as China, Malaysia and Vietnam did not return. [134]

Insourcing crossbreeds

In the United States

A 2012 series of articles in Atlantic Magazine [138] [139] [140] [141] highlighted a turning of the tide for parts of the USA's manufacturing industry. Specific causes identified include rising third-world wages, recognition of hidden off-shoring costs, innovations in design/manufacture/assembly/time-to-market, increasing fuel and transportation costs, falling energy costs in the US, increasing US labor productivity, and union flexibility. Hiring at GE's giant Appliance Park in Louisville increased 90% during 2012.

Standpoint of labor

From the standpoint of labor, outsourcing may represent a new threat, contributing to worker insecurity, and is reflective of the general process of globalization and economic polarization. [142]

  • Low-skilled work: The outsourcing of low-skilled work to contractors, who tend to employ migrant labor, [143] is causing a revival of radical trade union activity. In the UK, major hospitals, universities, [144] ministries and corporations are coming under pressure because outsourced workers often earn minimum wage and lack sick pay, annual leave, pensions and other entitlements enjoyed by directly employed staff at the same workplaces. [145]
  • In-housing: In January 2020, Tim Orchard, the CEO of Imperial College Healthcare Trust stated that the in-housing of over 1,000 Sodexo cleaners, caterers and porters across five NHS hospitals in London "will create additional cost pressures next year but we are confident that there are also benefits to unlock, arising from better team working, more co-ordinated planning and improved quality." [146]
  • USA base: On June 26, 2009, Jeff Immelt, the CEO of General Electric, called for the United States to increase its manufacturing base employment to 20% of the workforce, commenting that the U.S. has outsourced too much and can no longer rely on consumer spending to drive demand. [147]

Standpoint of government

Western governments may attempt to compensate workers affected by outsourcing through various forms of legislation. In Europe, the Acquired Rights Directive attempts to address the issue. The Directive is implemented differently in different nations. In the United States, the Trade Adjustment Assistance Act is meant to provide compensation for workers directly affected by international trade agreements. Whether or not these policies provide the security and fair compensation they promise is debatable.

Government response

In response to the recession, President Obama launched the SelectUSA program in 2011. In January 2012, President Obama issued a Call to Action to Invest in America at the White House "Insourcing American Jobs" Forum. [148] Obama met with representatives of companies such as Otis Elevator, Apple, DuPont and Master Lock, all of which had recently brought jobs back or made significant investments in the United States.

Policy-making strategy

A main feature of outsourcing influencing policy-making is the unpredictability it generates regarding the future of any particular sector or skill-group. The uncertainty of future conditions influences governance approaches to different aspects of long-term policies.

In particular, distinction is needed between

  • cyclical unemployment – for which "pump it up" solutions have worked in the past, and
  • structural unemployment – when "businesses and industries that employed them no longer exist, and their skills no longer have the value they once did." [120]
Competitiveness

Governance that attempts to adapt to the changing environment will facilitate growth and a stable transition to new economic structures, [149] until the economic structures become detrimental to the social, political and cultural structures.

Automation increases output and allows for reduced cost per item. When these changes are not well synchronized, unemployment or underemployment is a likely result. When transportation costs remain unchanged, the negative effect may be permanent; [120] jobs in protected sectors may no longer exist. [150]

Studies suggest that US outsourcing's effect on Mexico is that for every 10% increase in US wages, northern Mexican cities along the border experienced wage rises of 2.5%, about 0.69% higher than in inner cities. [151]

By contrast, higher rates of saving and investment in Asian countries, along with rising levels of education, studies suggest, fueled the ‘Asian miracle’ rather than improvements in productivity and industrial efficiency. There was also an increase in patenting and research and development expenditures. [152]

Industrial policy

Outsourcing results from an internationalization of labor markets as more tasks become tradable. According to leading economist Greg Mankiw, the labour market functions under the same forces as the market for goods, with the underlying implication that the greater the number of tasks available to be moved, the better for efficiency under the gains from trade. With technological progress, more tasks can be offshored at different stages of the overall corporate process. [14]

The tradeoffs are not always balanced, and a 2004 observer of the situation said "the total number of jobs realized in the United States from insourcing is far less than those lost through outsourcing." [153]

Environmental policy

Import competition has caused a de facto ‘race-to-the-bottom’ where countries lower environmental regulations to secure a competitive edge for their industries relative to other countries.

As Mexico competes with China over Canadian and American markets, its national Commission for Environmental Cooperation has not been active in enacting or enforcing regulations to prevent environmental damage from increasingly industrialized Export Processing Zones. Similarly, since the signing of NAFTA, heavy industries have increasingly moved to the US, which has a comparative advantage due to its abundant presence of capital and well-developed technology. A further example of environmental de-regulation with the objective of protecting trade incentives has been the numerous exemptions to carbon taxes in European countries during the 1990s.

Although outsourcing can influence environmental de-regulatory trends, the added cost of preventing pollution does not majorly determine trade flows or industrialization. [154]

Success stories

Companies such as ET Water Systems (now a Jain Irrigation Systems company), [155] GE Appliances and Caterpillar found that, with the increase of labor costs in Japan and China and the cost of shipping and customs fees, it cost only about 10% more to manufacture in America. [119] Advances in technology and automation, such as 3D printing, [156] have made bringing manufacturing back to the United States both cost-effective and possible. Adidas, for example, plans to produce highly customized shoes with 3D printers in the U.S. [157]

Globalization and socio-economic implications

Industrialization

Outsourcing has contributed to further levelling of global inequalities as it has led to general trends of industrialization in the Global South and deindustrialization in the Global North.

Not all manufacturing should return to the U.S. [158] The rise of the middle class in China, India and other countries has created markets for the products made in those countries. Just as the U.S. has a "Made in U.S.A." program, other countries support products made in their countries as well. Localization, the process of manufacturing products for the local market, is an approach to keeping some manufacturing offshore and bringing some of it back. Besides the cost savings of manufacturing closer to the market, the lead time for adapting to changes in the market is faster.

The rise in industrial efficiency which characterized development in developed countries has occurred as a result of labor-saving technological improvements. These improvements do not directly reduce employment levels but rather increase output per unit of work; even so, they can indirectly diminish the amount of labor required for fixed levels of output. [159]

Growth and income

It has been suggested that "workers require more education and different skills, working with software rather than drill presses" rather than rely on limited growth labor requirements for non-tradable services. [120]

United States

Protection of some data involved in outsourcing, such as data about patients (HIPAA), is one of the few federal protections. [160]

"Outsourcing" is a continuing political issue in the United States, having been conflated with offshoring during the 2004 U.S. presidential election. The political debate centered on outsourcing's consequences for the domestic U.S. workforce. Democratic U.S. presidential candidate John Kerry called U.S. firms that outsource jobs abroad or that incorporate overseas in tax havens to avoid paying their "fair share" of U.S. taxes "Benedict Arnold corporations".

A Zogby International poll in August 2004 found that 71% of American voters believed "outsourcing jobs overseas" hurt the economy, while 62% believed that the U.S. government should impose some legislative action against these companies, possibly in the form of increased taxes. [161] [162] President Obama promoted the 'Bring Jobs Home Act' to help reshore jobs by using tax cuts and credits for moving operations back to the USA. [163] The same bill was reintroduced in the 113th United States Congress as the Bring Jobs Home Act (S. 2569; 113th Congress). [164] [165]

While labor advocates claim union busting as one possible cause of outsourcing, [166] another claim is the high corporate income tax rate in the U.S. relative to other OECD nations, [167] [168] and the practice of taxing revenues earned outside of U.S. jurisdiction, a very uncommon practice. Some counterclaim that the actual taxes paid by US corporations may be considerably lower than "official" rates due to the use of tax loopholes, tax havens, and "gaming the system". [169] [170]

Sarbanes-Oxley has also been cited as a factor.

Europe

Council Directive 77/187 of 14 February 1977 protects employees' rights in the event of transfers of undertakings, businesses or parts of businesses (as amended 29 June 1998, Directive 98/50/EC & 12 March 2001's Directive 2001/23). Rights acquired by employees with the former employer are to be safeguarded when they, together with the undertaking in which they are employed, are transferred to another employer, i.e., the contractor.

Cases subsequent to the European Court of Justice's Christel Schmidt v. Spar- und Leihkasse der früheren Ämter Bordesholm, Kiel und Cronshagen, Case C-392/92 [1994], have disputed whether a particular contracting-out exercise constituted a transfer of an undertaking (see, for example, Ayse Süzen v. Zehnacker Gebäudereinigung GmbH Krankenhausservice, Case C-13/95 [1997]). In principle, employees may benefit from the protection offered by the directive.

Asia

Countries that have been the focus of outsourcing include India, Pakistan, and the Philippines for American and European companies, and China and Vietnam for Japanese companies.

The Asian IT service market is still in its infancy, but in 2008 industry think tank Nasscom-McKinsey predicted a $17 billion IT service industry in India alone. [171]

A China-based company, Lenovo, outsourced/reshored manufacturing of some time-critical customized PCs to the U.S. since "If it made them in China they would spend six weeks on a ship." [119]

Article 44 of Japan's Employment Security Act implicitly bans the use of domestic or foreign workers supplied by unauthorized companies, regardless of their operating locations. The law applies if at least one of the parties (supplier, client, or laborer) resides in Japan, and if the laborers are an integral part of the chain of command of the client company or the supplier.

  • No person shall carry out a labor supply business or have workers supplied by a person who carries out a labor supply business work under his/her own directions or orders, except in cases provided for in the following Article.
    • A person who falls under any of the following items shall be punished by imprisonment with work for not more than one year or a fine of not more than one million yen. (Article 64)

Victims can lodge a criminal complaint against the CEOs of the suppliers and clients. The CEO risks arrest, and the Japanese company may face a private settlement with a financial package in the range between 20 and 100 million JPY (roughly 200,000 to 1 million USD).

When the New York Times ran the headline "Near Source of Supplies the Best Policy", [173] its main focus was on the "cost of production." Although transportation cost was addressed, the article did not choose among:

  • transporting supplies to the place of production [174]
  • transporting finished goods to the place(s) of sale
  • cost and availability of labor

Nearshoring or nearsourcing is having business processes, especially information technology processes such as application maintenance and development or testing, in a nearby country, often one sharing a border with the target country. Commonalities usually include geographic, temporal (time zone), cultural, social, linguistic, economic, political, or historical linkages. [175] The term nearshoring is a derivative of the business term offshoring. The hybrid term "nearshore outsourcing" is sometimes used as an alternative for nearshoring, since nearshore workers are not employees of the company for which the work is performed. It can also be a reversal, by contracting a development partner in a different country but in close proximity (same or nearby time zone), facilitating communication and allowing frequent visits. This is a business strategy to place some or all of a company's operations close to where its products are sold. Typically, this is contrasted with the trend to outsource low-wage manufacturing operations to developing nations (offshoring), and reflects a reversal of that trend. Sometimes, the work is done by an outside contracted company rather than internally (insourcing), but unlike offshore outsourcing, the work is done in fairly close proximity to either the company headquarters or its target market.

In Europe, nearshore outsourcing relationships are between clients in larger European economies and various providers in smaller European nations.

Major centers are Poland, Ukraine, Belarus, the Czech Republic, Romania, Bulgaria, Hungary, Portugal, Slovakia, and the Baltic states. The attraction is lower-cost skilled labor forces and a less stringent regulatory environment, but crucially these locations allow for more day-to-day physical oversight. These countries also have strong cultural ties to the major economic centers in Europe, as they are part of the EU. For example, as of 2020 Portugal is considered to be the most trending outsourcing destination, [176] with big companies like Mercedes, Google, [177] Jaguar, Sky News, Natixis and BNP Paribas opening development centers in Lisbon and Porto, where labor costs are lower, talent comes from excellent universities, there is availability of skills, and the time zone is GMT (the same as London). [178]

In the US, American clients nearshore to Canada and Mexico, as well as to many nations in Central and South America.

Reasons to nearshore

Culture

Cultural alignment with the business is often more readily achieved through near-sourcing due to there being similarities between the cultures in which the business is located and in which services are sub-contracted, including for example proficiency with the language used in that culture. [79]

Communication

Constraints imposed by time zones can complicate communication; near-sourcing or nearshoring offers a solution. English language skills are the cornerstone of nearshore and IT services. Collaboration by universities, industry, and government has slowly produced improvements. Proximity also facilitates in-person interaction regularly and/or when required. [179] [180] [181]

Other Advantages

Software development nearshoring is mainly due to flexibility when it comes to upscaling or downscaling teams, [182] or the availability of low-cost skilled developers. The nearshoring of call centers, shared services centers, and business process outsourcing (BPO) rose as offshore outsourcing was seen to be relatively less valuable.

The complexities of offshoring stem from language and cultural differences, travel distances, workday/time zone mismatches, and the greater effort needed for establishing trust and long-term relationships. Many nearshore providers attempted to circumvent communication and project management barriers by developing new ways to align organizations. As a result, concepts such as remote insourcing were created to give clients more control in managing their own projects. Nearshoring still hasn't overcome all barriers, but proximity allows more flexibility to align organizations. [183]

The United States has a special visa, the H-1B, [184] which enables American companies to temporarily (up to three years, or by extension, six) hire foreign workers to supplement their employees or replace those holding existing positions. In hearings on this matter, a United States senator called these "their outsourcing visa." [185]

  • In 2003, Procter & Gamble outsourced their facilities' management support, but it did not involve offshoring. [186]
  • One company offshored to India in 2001 but reversed the move since "customers were not happy with the prior arrangement…" [11]

    Print and mail outsourcing is the outsourcing of document printing and distribution.

    The Print Services & Distribution Association was formed in 1946, and its members provide services that today might involve the word outsource. Similarly, members of the Direct Mail Marketing Association (DMMA) were the "outsourcers" for advertising agencies and others doing mailings. The DMMA celebrated its 100th anniversary in 2017.

    The term "outsourcing" became very common in the print and mail business during the 1990s, and later expanded to be very broad and inclusive of most any process by the year 2000. Today, there are web based print to mail solutions for small to mid-size companies which allow the user to send one to thousands of documents into the mail stream, directly from a desktop or web interface. [187]

    The term outsource marketing has been used in Britain to mean the outsourcing of the marketing function. [188] Motivations for this include:

  • … [189] [190]
  • specialized expertise [191]
  • speed of execution
  • short-term staff augmentation [192]

    While much of this work is the "bread and butter" of specialized departments within advertising agencies, specialists are sometimes used, such as when the Guardian newspaper outsourced most of its marketing design in May 2010. [193]

    Business process outsourcing (BPO) is a subset of outsourcing that involves contracting the operations and responsibilities of a specific business process to a third-party service provider. Originally, this was associated with manufacturing firms, such as Coca-Cola, which outsourced large segments of its supply chain. [194]

    BPO is typically categorized into back office and front office outsourcing. [195]

    BPO can be offshore outsourcing, near-shore outsourcing to a nearby country, or onshore outsourcing within the same country. Information technology enabled services (ITES-BPO), [196] knowledge process outsourcing (KPO), and legal process outsourcing (LPO) are some of the sub-segments of BPO.

    Although BPO began as a way to reduce costs, changes (specifically the move to more service-based rather than product-based contracts) mean that companies now increasingly choose to outsource their back office for time flexibility and direct quality control. [197] Business process outsourcing enhances the flexibility of an organization in different ways:

    BPO vendor charges are project-based or fee-for-service, using business models such as remote in-sourcing or similar software development and outsourcing models. [198] [199] This can help a company become more flexible by transforming fixed costs into variable costs. [200] A variable cost structure helps a company respond to changes in required capacity and does not require the company to invest in assets, thereby making it more flexible. [201]
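    As a rough illustration of that fixed-to-variable shift, the sketch below compares a hypothetical in-house back office carrying a fixed annual cost with a vendor charging a per-transaction fee; all figures are invented for the example and are not drawn from the cited studies.

```python
# Hypothetical comparison of a fixed in-house cost structure with a
# variable, fee-for-service BPO arrangement (all numbers invented).
fixed_cost = 1_200_000   # in-house: salaries, systems, office space
unit_fee = 4.0           # vendor fee per transaction

for volume in (100_000, 300_000, 500_000):
    in_house = fixed_cost              # unchanged regardless of demand
    outsourced = unit_fee * volume     # scales with required capacity
    cheaper = "vendor" if outsourced < in_house else "in-house"
    print(f"{volume:>7,} transactions: in-house ${in_house:,.0f}, "
          f"vendor ${outsourced:,.0f} -> {cheaper} is cheaper")
```

    At low volumes the variable structure costs less; as demand grows, the break-even point shifts back toward the fixed in-house arrangement, which is the flexibility trade-off described above.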

    BPO also permits focusing on a company's core competencies. [202]

    Supply chain management with effective use of supply chain partners and business process outsourcing can increase the speed of several business processes. [203]

    BPO caveats

    Even with various contractual compensation strategies, the company may be left with a new "single point of failure" (where even an after-the-fact payment is not enough to offset "complete failure of the customer's business"). [204] Unclear contractual issues are not the only risks: there are also changing requirements, unforeseen charges, failure to meet service levels, and a dependence on the BPO provider that reduces flexibility. The latter is called lock-in; flexibility may be lost due to penalty clauses and other contract terms. [205] Also, the selection criteria may seem vague and undifferentiated. [206]

    Security risks can arise both from physical communication and from a privacy perspective. Employee attitudes may change, and the company risks losing independence. [207] [208]

    Risks and threats of outsourcing must therefore be managed to achieve any benefits. In order to manage outsourcing in a structured way, maximising positive outcomes, minimising risks, and avoiding threats, a business continuity management (BCM) model can be set up. BCM consists of a set of steps to successfully identify, manage, and control the business processes that are, or can be, outsourced. [209]

    The analytic hierarchy process (AHP) is a framework used in BPO for identifying information systems that are candidates for outsourcing. [210] L. Willcocks, M. Lacity, and G. Fitzgerald identify several contracting problems companies face, ranging from unclear contract formatting to a lack of understanding of technical IT processes. [211]
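    To make the AHP step concrete, the minimal sketch below derives priority weights for decision criteria from a pairwise comparison matrix and checks judgment consistency; the criteria and judgment values are invented for illustration and are not taken from the cited studies.

```python
# Minimal AHP sketch: weight outsourcing criteria from pairwise
# comparisons on Saaty's 1-9 scale (judgments invented for illustration).
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j,
# e.g. cost savings vs. strategic importance vs. technical complexity.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priority weights via the geometric-mean (row) method, then normalize.
w = np.prod(A, axis=1) ** (1 / A.shape[0])
w /= w.sum()

# Consistency ratio: CR below 0.1 is conventionally acceptable.
n = A.shape[0]
lambda_max = ((A @ w) / w).mean()
ci = (lambda_max - n) / (n - 1)
ri = 0.58  # random consistency index for n = 3
print("weights:", w.round(3), " CR:", round(ci / ri, 3))
```

    Each candidate information system would then be scored against these weighted criteria to produce a ranking of outsourcing candidates.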

    Technological pressures

    Industry analysts have identified robotic process automation (RPA) software, and in particular its enhanced, self-guided, artificial-intelligence-based variant (sometimes termed RPAAI), as a potential threat to the industry, [212] [213] and speculate as to the likely long-term impact. [214] In the short term, however, there is likely to be little impact as existing contracts run their course: it is only reasonable to expect demand for cost efficiency and innovation to result in transformative changes at the point of contract renewal. With the average length of a BPO contract being five years or more [215] – and many contracts being longer – this hypothesis will take some time to play out.

    On the other hand, an academic study by the London School of Economics was at pains to counter the so-called "myth" that RPA will bring many jobs back from offshore. [216] One possible argument behind such an assertion is that new technology provides new opportunities for increased quality, reliability, scalability, and cost control, thus enabling BPO providers to compete increasingly on an outcomes-based model rather than on cost alone. With the core offering potentially changing from a "lift and shift" approach based on fixed costs to a more qualitative, service-based, outcomes-based model, there is perhaps a new opportunity to grow the BPO industry with a new offering.

    Industry size

    One estimate of the worldwide BPO market from the BPO Services Global Industry Almanac 2017, puts the size of the industry in 2016 at about US$140 billion. [217]

    India, China and the Philippines are major powerhouses in the industry. In 2017, in India, the BPO industry generated US$30 billion in revenue according to the national industry association. [218] The BPO industry is a small segment of the total outsourcing industry in India. The BPO industry workforce in India is expected to shrink by 14% in 2021. [219]

    In India, the BPO industry and the IT services industry combined were worth a total of US$154 billion in revenue in 2017. [220] The BPO industry in the Philippines generated $22.9 billion in revenues in 2016, [221] and around 700,000 medium- and high-skill jobs were expected to be created there by 2022. [222]

    In 2015, official statistics put the size of the total outsourcing industry in China, including not only the BPO industry but also IT outsourcing services, at $130.9 billion. [223]


    Historical Hurricane Tracks - GIS Map Viewer

    Tracking and understanding hurricanes is important to scientists and climatologists who seek to find patterns and variability as a piece in understanding climate change. For emergency management officials and those who live in a possible hurricane strike area, it is imperative to study and be aware of the patterns and possibilities in order to prevent loss of life.

    How were these data collected?

    Data for more than 6000 global tropical cyclones are included in the IBTrACS database, spanning the last 150 years. For each storm, position, sustained winds, and minimum central pressure data points are collected. The Hurricane Tracker uses Adobe’s Flex Viewer with ESRI’s ArcGIS Server to provide interactive capability.
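    For readers who want the underlying numbers rather than the map viewer, the IBTrACS archive can also be read directly. The sketch below assumes the NOAA/NCEI v04 CSV layout and URL (including column names such as SID, SEASON, NAME, ISO_TIME, LAT, LON, WMO_WIND, and WMO_PRES), which should be verified against the current release.

```python
# Sketch of reading storm tracks from the IBTrACS CSV archive with
# pandas; URL and column names assume the v04r00 North Atlantic file.
import pandas as pd

URL = ("https://www.ncei.noaa.gov/data/"
       "international-best-track-archive-for-climate-stewardship-ibtracs/"
       "v04r00/access/csv/ibtracs.NA.list.v04r00.csv")

# The file's second row holds units, so skip it while parsing.
df = pd.read_csv(URL, skiprows=[1], low_memory=False,
                 usecols=["SID", "SEASON", "NAME", "ISO_TIME",
                          "LAT", "LON", "WMO_WIND", "WMO_PRES"])

# One storm's track: position, sustained wind, minimum central pressure.
storm = df[(df["NAME"] == "KATRINA") & (df["SEASON"].astype(int) == 2005)]
print(storm[["ISO_TIME", "LAT", "LON", "WMO_WIND", "WMO_PRES"]].head())
```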

    What can I do with these data?

    This site gives users a way to select, display, and share historical hurricane tracks and other information about tropical cyclones recorded by various methods. Using extensive selection and sorting criteria, users can display information from individual storms or the set of storms that crossed a specified location. Wind speed and pressure readings are also available. By “sharing” search results, the track and data can be saved to a link.

    How do I use the site?

    Use the pull-down menus to set options.

    Once options have been set, further refinements can be made to the selections to give specific storm data.