Learning Outcomes for M.S. in Applied Data Science with a Specialization in Crisis Informatics
Crisis Informatics Specialization
- Evaluate the role of the Department of Homeland Security (DHS) and the Federal Emergency Management Agency (FEMA) and the challenges they face.
- Evaluate how legislation and regulation have shaped the development of FEMA and DHS.
- Evaluate the lessons learned from past disasters and how they have influenced current disaster management.
- Assess natural, human-caused, and technological hazards as they relate to a community’s vulnerabilities and capabilities.
- Analyze the influence of social constructs within each of the four phases of emergency management.
- Evaluate a comprehensive emergency management plan (CEMP) as it relates to the community’s level of emergency preparedness.
- Analyze the interdependency among government (local, state, and federal) and nongovernmental organizations during each phase of emergency management.
- Differentiate emergency management strategies in the United States from those of other countries, both developed and developing, and evaluate the U.S. role in emergency management globally.
- Find and access selected sources of GIS data.
- Query and select data by attribute and location.
- Perform operations associated with geographic profiling.
- Create new GIS data by applying a variety of techniques, including geocoding address data, to map the locations of buildings and events.
- Discover and explain patterns by analyzing data using buffers, spatial overlays, hot-spot analysis, density surface maps, and other techniques (see the sketch following this list).
- Analyze data to identify hot spots.
- Create a variety of finished maps, suitable for presentation, including thematic maps and density surface maps.
- Evaluate an application of GIS to a particular field for public safety and crisis management.
- Synthesize different sources and types of information to make recommendations.
- Incorporate knowledge and skills of geographic information systems into practice and research.
- Extract information from remotely sensed data using a variety of software-based methods, and interpret that information visually.
- Apply remote sensing methods in a variety of contexts, such as measuring biophysical characteristics of the Earth’s surface and human impact on the environment.
- Assess critically the strengths and weaknesses of different remote sensing systems for a variety of applications.
- Develop remote sensing workflows to solve problems in a variety of application areas.
- Solve problems with appropriate remote sensing data and processing methods.
- Communicate findings from the analysis of remotely sensed data clearly and concisely through written and graphical products.
- Frame research questions or hypotheses motivated by a geographic or other spatial problem.
- Design an experiment or other quantitative procedure for spatial knowledge discovery.
- Collect and analyze spatially relevant data and draw empirically supported conclusions.
- Communicate research findings in a written report to an audience in the spatial sciences.
- Compare the purpose, capabilities, and limitations of geospatial hazard models.
- Evaluate the impact of a disaster on the natural, built, and social environment by appropriately selecting and analyzing data sources.
- Assess the validity of model output.
- Formulate questions to guide the development of geospatial models suitable for addressing hazard risk.
- Differentiate between requirements for models that estimate risk and those that describe the impact of events.
- Evaluate the relative effectiveness of model visualizations, such as dashboards, maps, charts, 3D graphics, and animation, for communicating to a given audience.
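
As one concrete illustration of the buffer and spatial-overlay techniques named in the GIS outcomes above, the sketch below uses the sf package in R on invented shelter and incident coordinates. The package choice, the coordinate values, and the 1 km buffer distance are assumptions made for this example, not requirements of the curriculum.

```r
# A minimal buffer-and-overlay sketch using the sf package; all data
# values below are invented for illustration only.
library(sf)

# Hypothetical point data: incident reports and shelters (lon/lat, WGS 84)
incidents <- data.frame(
  id  = 1:4,
  lon = c(-122.41, -122.42, -122.40, -122.39),
  lat = c(  37.77,   37.78,   37.76,   37.77)
)
shelters <- data.frame(
  name = c("Shelter A", "Shelter B"),
  lon  = c(-122.415, -122.395),
  lat  = c(  37.775,   37.765)
)

inc_sf <- st_as_sf(incidents, coords = c("lon", "lat"), crs = 4326)
shl_sf <- st_as_sf(shelters,  coords = c("lon", "lat"), crs = 4326)

# Project to a metric CRS before buffering (UTM zone 10N here), then build
# 1 km service-area buffers around each shelter.
inc_m   <- st_transform(inc_sf, 32610)
shl_m   <- st_transform(shl_sf, 32610)
buffers <- st_buffer(shl_m, dist = 1000)

# Spatial overlay: count incidents falling inside each shelter's buffer.
hits <- st_intersects(buffers, inc_m)
data.frame(shelter = shl_m$name, incidents_within_1km = lengths(hits))
```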
Master of Science in Applied Data Science Core
Students will demonstrate competency in data analytics.
- Differentiate between research fields, theoretical concepts, epistemologies, and qualitative and quantitative methods.
- Analyze critically and speak publicly about field-specific scholarly research, projects executed in class, and data management issues.
- Design, implement, test, and debug extensible and modular programs involving control structures, variables, expressions, assignments, I/O, functions, parameter passing, data structures, regular expressions, and file handling.
- Apply software development methodologies to create efficient, well-structured applications that other programmers can easily understand.
- Analyze computational complexity in algorithm development.
- Investigate research questions and designs by loading, extracting, transforming, and analyzing data from various sources.
- Test hypotheses and evaluate reliability and validity.
- Implement histograms, classifiers, decision trees, sampling, linear regression, and projectiles in a scripting language.
- Decompose and simulate systems to process data using randomness.
- Employ supervised and unsupervised machine learning for functional approximation and categorization.
- Display, interpret, and explore data using descriptive statistics and graphs.
- Explore assumptions about the data, including normality, skewness, and kurtosis.
- Use random variables and probability distributions.
- Determine whether and how to perform statistical inference.
- Perform parametric (e.g., t-test, ANOVA, ANCOVA, MANOVA) and nonparametric (e.g., chi-square) hypothesis testing and correlation.
- Fit linear regression models and interpret their parameters (see the sketch following this list).
- Analyze datasets with supervised learning methods for functional approximation, classification, and forecasting and unsupervised learning methods for dimensionality reduction and clustering.
- Explore, transform, and visualize large, complex datasets with graphs in R.
- Solve real-world problems by adapting and applying statistical learning methods to large, complex datasets.
- Identify, assess, and select among statistical learning methods and models for solving a particular real-world problem, weighing their advantages and disadvantages.
- Write programs to perform data analytics on large, complex datasets in R.
- Analyze datasets from case studies in informatics-related fields (e.g., digital media, human-computer interaction, health informatics, bioinformatics, and business intelligence).
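
As a small, self-contained illustration of the descriptive-statistics, hypothesis-testing, and regression outcomes above, the sketch below works through a simulated dataset in base R. The variable names and effect sizes are invented for the example; a real analysis would start from course-provided or real-world data.

```r
# Simulated study data: scores depend on hours of preparation and group.
set.seed(42)
n <- 200
study <- data.frame(
  group = rep(c("control", "treatment"), each = n / 2),
  hours = runif(n, 0, 10)
)
study$score <- 50 + 3 * study$hours +
  ifelse(study$group == "treatment", 5, 0) + rnorm(n, sd = 8)

# Descriptive statistics and a quick look at the distribution
summary(study$score)
hist(study$score, main = "Distribution of scores", xlab = "Score")

# Parametric hypothesis test: do the two groups differ in mean score?
t.test(score ~ group, data = study)

# Linear regression: interpret the slope for hours and the group effect
fit <- lm(score ~ hours + group, data = study)
summary(fit)
```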
Students will demonstrate competency in data management, infrastructure, and the data science life cycle.
- Design and implement relational databases using tables, keys, relationships, and SQL commands to meet user and operational needs (see the sketch following this list).
- Diagram a relational database design with entity–relationship diagrams (ERDs) in crow’s foot notation, capturing the relationships and cardinalities needed to enforce referential integrity.
- Evaluate tables for compliance with third normal form and normalize noncompliant tables.
- Write triggers to handle events and enforce business rules, and create views within a relational database.
- Demonstrate an understanding of the data lifecycle, including data curation, stewardship, preservation, and security.
- Evaluate the social and ethical implications of data management.
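
The sketch below illustrates the kind of relational design, keys, and trigger logic these outcomes describe, using an in-memory SQLite database driven from R through the DBI and RSQLite packages. The two-table agency/incident schema and the severity rule are hypothetical examples, and the specific database engine and packages are assumptions rather than course requirements.

```r
# A minimal relational-database sketch: tables, keys, a foreign-key
# relationship, and a trigger enforcing a simple business rule.
library(DBI)

con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbExecute(con, "PRAGMA foreign_keys = ON")

dbExecute(con, "
  CREATE TABLE agency (
    agency_id   INTEGER PRIMARY KEY,
    agency_name TEXT NOT NULL
  )")
dbExecute(con, "
  CREATE TABLE incident (
    incident_id INTEGER PRIMARY KEY,
    agency_id   INTEGER NOT NULL REFERENCES agency(agency_id),
    severity    INTEGER NOT NULL,
    reported_at TEXT DEFAULT CURRENT_TIMESTAMP
  )")

# Business rule enforced by a trigger: severity must be between 1 and 5
dbExecute(con, "
  CREATE TRIGGER check_severity BEFORE INSERT ON incident
  WHEN NEW.severity NOT BETWEEN 1 AND 5
  BEGIN
    SELECT RAISE(ABORT, 'severity must be between 1 and 5');
  END")

dbExecute(con, "INSERT INTO agency VALUES (1, 'County EMA')")
dbExecute(con, "INSERT INTO incident (agency_id, severity) VALUES (1, 3)")

# Join across the foreign-key relationship
dbGetQuery(con, "
  SELECT a.agency_name, COUNT(*) AS open_incidents
  FROM incident i JOIN agency a ON a.agency_id = i.agency_id
  GROUP BY a.agency_name")

dbDisconnect(con)
```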
Students will demonstrate competency in client–server application development.
- Design and implement client–server applications that solve real-world problems.
- Create well-formed static and dynamic webpages using current versions of PHP, HTML, CSS, and JavaScript or their equivalents.
- Implement the model-view-controller software pattern in web and mobile user interfaces.
- Apply client-side and server-side programming skills, including design, coding, implementation, and integration with relational databases.
- Extract data from JavaScript Object Notation (JSON) and Extensible Markup Language (XML) documents.
- Transmit objects between the browser and server by converting them into JSON (see the sketch following this list).
- Evaluate a given web application based on different criteria such as structure, dynamics, security, embedded systems, and interactivity.
- Diagram the phases of the secure software development lifecycle.
- Demonstrate the techniques of defensive programming and secure coding.
- Design user-friendly web and mobile interfaces.
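
As an illustration of the JSON serialization and parsing these outcomes refer to, the sketch below converts an object to JSON and back using the jsonlite package in R. It shows only the data-exchange step; the browser/server round trip itself (for example with PHP and JavaScript, as named above) is outside this sketch, and the incident object is invented for the example.

```r
# Converting an R object to JSON for transmission, and parsing it back.
library(jsonlite)

# A hypothetical object that a server might return to a client
report <- list(
  incident_id = 17,
  type        = "flood",
  severity    = 3,
  shelters    = c("Shelter A", "Shelter B")
)

# Serialize to JSON ...
payload <- toJSON(report, auto_unbox = TRUE, pretty = TRUE)
cat(payload)

# ... and parse a received JSON document back into an R object
parsed <- fromJSON(payload)
parsed$shelters
```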
Students will demonstrate competency in the management of massive, high-throughput data stores and in cloud computing.
- Research the main concepts, models, technologies, and services of cloud computing, the reasons for the shift to this model, and its advantages and disadvantages.
- Examine the technical capabilities and commercial benefits of hardware virtualization.
- Analyze tradeoffs for data centers in performance, efficiency, cost, scalability, and flexibility.
- Evaluate the core challenges of cloud computing deployments, including public, private, and community clouds, with respect to privacy, security, and interoperability.
- Create cloud computing infrastructure models.
- Demonstrate and compare the use of cloud storage vendor offerings.
- Develop, install, and configure cloud-computing applications under software-as-a-service principles, employing cloud-computing frameworks and libraries.
- Apply the MapReduce programming model to data analytics in informatics-related domains (see the sketch following this list).
- Enhance MapReduce performance by redesigning the system architecture (e.g., provisioning and cluster configurations).
- Overcome difficulties in managing very large datasets, both structured and unstructured, using nonrelational data storage and retrieval (NoSQL), parallel algorithms, and cloud computing.
- Apply the MapReduce programming model to data-driven discovery and scalable data processing for scientific applications.
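
The sketch below illustrates the MapReduce programming model named above with a local word-count example in base R. It is a conceptual stand-in only: the map and reduce phases are emulated with lapply() and tapply(), whereas an actual deployment would run on Hadoop, Spark, or a comparable cloud service, and the sample documents are invented.

```r
# Word count expressed in MapReduce style, run locally in base R.
docs <- c("flood warning issued", "flood shelters open", "warning lifted")

# Map phase: for each document, emit (word, 1) pairs
map_step <- function(doc) {
  words <- strsplit(doc, "\\s+")[[1]]
  setNames(rep(1L, length(words)), words)
}
mapped <- lapply(docs, map_step)

# Shuffle + reduce phase: group the emitted pairs by word and sum the counts
pairs  <- unlist(mapped)                    # word names are kept on the 1s
counts <- tapply(pairs, names(pairs), sum)  # per-word totals
counts                                      # e.g., flood = 2, warning = 2, ...
```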
Students will demonstrate competency in data visualization.
- Assess the purpose, benefits, and limitations of visualization as a human-centered data analysis methodology.
- Conceptualize and design effective visualizations for a variety of data types and analytical tasks (see the sketch following this list).
- Implement interactive visualizations using modern web-based frameworks.
- Evaluate visualizations critically using perceptual principles and established design guidelines.
- Conduct independent research on a range of theoretical and applied topics in visualization and visual analytics.
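
As one illustration of the visualization-design outcomes above, the sketch below builds a small static chart with ggplot2 in R from a simulated county-losses data frame. The data, the encoding choices, and the library are assumptions for the example; an interactive, web-based version (for instance with plotly or D3) would be the natural extension the outcomes describe.

```r
# A simple designed chart: magnitude encoded by position, population by size.
library(ggplot2)

set.seed(7)
damage <- data.frame(
  county     = paste("County", LETTERS[1:8]),
  population = round(runif(8, 2e4, 5e5)),
  losses_m   = round(runif(8, 5, 250), 1)   # losses in millions of USD
)

ggplot(damage, aes(x = reorder(county, losses_m), y = losses_m,
                   size = population)) +
  geom_point(color = "steelblue") +
  coord_flip() +
  labs(x = NULL, y = "Estimated losses (millions USD)",
       size = "Population",
       title = "Simulated storm losses by county") +
  theme_minimal()
```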