4768 results:
Summary: The "Investing in Tomorrow's Workforce Act of 2023" aims to enhance training programs for workers impacted by automation, addressing job loss and improving workforce adaptability in a technology-driven economy.
Collection: Congressional Record
Status date: Sept. 5, 2023
Status: Issued
Source: Congress
Social Impact
Data Governance (see reasoning)
The text primarily discusses the impact of automation on the workforce and the need for training and support for workers likely to be displaced by technological advancements. It cites automation as a significant factor affecting job markets and discusses grants for training in technology-related sectors. This bears directly on social impact, since it concerns economic disparities caused by automation and the need to prepare vulnerable workers, and the emphasis on training and on the needs of affected populations aligns with the goal of ensuring that technology does not perpetuate inequality. The legislation also hints at the need for robust data governance and integrity measures by mentioning accuracy in reporting worker transitions and training outcomes. While it discusses automation broadly, it does not examine the integrity or robustness of AI systems specifically, leading to lower scores in those categories. Overall, it is strongly relevant to Social Impact given its focus on workers affected by automation, moderately relevant to Data Governance, and minimally relevant to the other categories.
Sector:
Government Agencies and Public Services
Private Enterprises, Labor, and Employment (see reasoning)
The bill explicitly addresses the needs of workers affected by automation, which falls under the Private Enterprises, Labor, and Employment sector as it discusses the intersection of automated technologies with employment trends. It also pertains to Government Agencies and Public Services since it outlines the role of the federal government in funding training programs and supporting impacted workers. The legislation does not have a significant focus on the judicial system, healthcare, or other specified sectors, leading to lower scores in those areas. Overall, the primary sectors of relevance are Private Enterprises, Labor, and Employment, and Government Agencies and Public Services, given the focus on workforce training related to technology integration.
Keywords (occurrence): autonomous vehicle (1)
Summary: The bill establishes procedures for dose reconstruction methodology related to radiation exposure for personnel involved in nuclear tests, including types of radiation and calculations for accurate dose estimates.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The provided text focuses primarily on radiation dose reconstruction methodology, which does not encompass AI-related topics or technologies. While it includes references to automated procedures for data handling and integration, these mentions relate to traditional data processing rather than AI-specific systems or algorithms. Thus, none of the categories (Social Impact, Data Governance, System Integrity, Robustness) applies, as there is no significant discussion of AI applications, implications, or frameworks relevant to them.
Sector: None (see reasoning)
The text also does not relate to any predefined sectors as it solely discusses procedures and methodologies related to radiation dose calculation. There is no reference to sectors that would indicate the application of AI within Politics and Elections, Government Agencies and Public Services, Judicial System, Healthcare, Private Enterprises, Labor, and Employment, Academic and Research Institutions, International Cooperation and Standards, Nonprofits and NGOs, or the Hybrid, Emerging, and Unclassified sectors. Consequently, all scores are rated as not relevant.
Keywords (occurrence): automated (1)
Summary: The bill addresses the Pentagon's readiness to deter and defeat adversaries through a congressional hearing focused on military innovation, technology adoption, and strategic reforms to enhance national defense capabilities.
Collection: Congressional Hearings
Status date: Feb. 9, 2023
Status: Issued
Source: House of Representatives
Social Impact
System Integrity
Robustness (see reasoning)
The relevant segments of the hearing text reference technologies like artificial intelligence, autonomous systems, and neural networks within the context of modern warfare and defense strategies. These discussions primarily highlight how AI and similar technologies are reshaping military tactics and capabilities, fitting them into the national security landscape. Given this focus, there's a significant connection between the text's content and the social impact of AI in warfare, as well as the integrity and robustness of systems being used for defense, thus justifying higher relevance scores across categories.
Sector:
Government Agencies and Public Services
International Cooperation and Standards
Hybrid, Emerging, and Unclassified (see reasoning)
The provided text discusses the use of AI and emerging technologies broadly in a military context. It doesn't address specific legal aspects of the judicial system, healthcare, or regulations governing private enterprises directly. However, it references AI's application in defense and military strategies, which is crucial for understanding the government sector's engagement with emerging technologies. As such, the most substantial relevance lies within government operations and military readiness, hence higher scores for the Government Agencies and Public Services sector, with marginal relevance for others.
Keywords (occurrence): artificial intelligence (3)
Summary: The bill addresses significant healthcare workforce shortages in the U.S., proposing solutions to increase the number of healthcare providers and improve care access, particularly in underserved areas.
Collection: Congressional Hearings
Status date: Feb. 16, 2023
Status: Issued
Source: Senate
Social Impact (see reasoning)
The text discusses a hearing focused on healthcare workforce shortages, emphasizing the need for improved access to services, which connects only indirectly to AI's potential to enhance service delivery and fill human resource gaps. There is no specific mention of AI, algorithms, or related technologies, which limits its direct relevance to the Social Impact category. Even so, the category aligns moderately because the overarching healthcare issues exacerbated by workforce limitations could benefit from AI solutions. The Data Governance category is not applicable, as the text does not address data collection, management, or bias specific to AI systems. System Integrity is similarly not a fit, as it does not discuss security, transparency, or oversight of AI systems in detail. Robustness is also not directly relevant, since no benchmarks or performance standards for AI are discussed in the context of healthcare workforce shortages.
Sector:
Healthcare (see reasoning)
The legislation touches on health care workforce shortages, making it highly relevant to the Healthcare sector. It highlights the impact of shortages on patient care and access, which are immediate concerns within healthcare services that could see improvements through potential AI applications. Other sectors such as Politics and Elections, Government Agencies and Public Services, and Private Enterprises, Labor, and Employment, while tangentially related, do not feature prominently in the text. The focus remains firmly on healthcare and the challenges therein, with no substantial references to regulation in political campaigns or AI's effects in public sector operations. Therefore, the Healthcare sector gains a high relevance score, while others score lower as they do not apply.
Keywords (occurrence): artificial intelligence (1)
Summary: The bill outlines minimum internal control standards for Class II gaming on Indian lands, detailing definitions, compliance requirements, and various control standards to ensure gaming integrity and accountability.
Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily discusses regulations and compliance standards for gaming operations, especially concerning internal controls and definitions relevant to Class II gaming. It does not explicitly mention or pertain to any aspects of AI technology. The terms related to AI such as algorithms, automated systems, or intelligent decision-making do not appear, nor does the context suggest relevance to the use of AI in gaming operations or surveillance technology within this regulatory framework. Therefore, it seems to be unrelated to the categories concerning Social Impact, Data Governance, System Integrity, or Robustness.
Sector: None (see reasoning)
The text outlines regulations for Class II gaming on Indian lands, focusing on operational standards and definitions of various gaming-related terms. Its primary context revolves around the gaming industry, compliance measures, and accounting standards, which do not directly involve the nine sectors defined. No references to political implications, public services, healthcare applications, or use of AI in judicial systems are evident. Hence it does not fit into any sector and receives a score of 1 for all sectors.
Keywords (occurrence): automated (2)
Summary: The bill exempts Drug Enforcement Administration and Immigration and Naturalization Service records from specific Privacy Act provisions to enhance law enforcement capabilities, ensuring investigations remain confidential and effective.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
Data Governance (see reasoning)
The text discusses exemptions related to the collection and management of law enforcement data, focusing on maintaining the confidentiality of investigations and protecting the identities of informants. While it does mention an 'Automated Intelligence Record System,' the context does not indicate significant engagement with the social implications or accountability associated with AI, nor does it focus on data governance, system integrity, or robustness of AI systems specifically. Overall relevance is therefore low, though the privacy-related data management elements give it slight relevance to Data Governance.
Sector:
Government Agencies and Public Services (see reasoning)
This document primarily pertains to law enforcement operations, particularly those involving information management systems utilized by federal agencies like the Drug Enforcement Administration and Immigration and Naturalization Service. It addresses practices that may not directly relate to AI but do concern the broader implications of how automated intelligence systems are applied in the enforcement context. There's no direct reference to how AI might influence politics or sectors outside of law enforcement. Nonetheless, some principles may be applicable to government operations, but the overall connection remains weak.
Keywords (occurrence): automated (1)
Summary: The bill allocates funding to the Department of Homeland Security for fiscal year 2023, focusing on combating human trafficking and enhancing victim support programs.
Collection: Congressional Hearings
Status date: March 2, 2023
Status: Issued
Source: Senate
The text does not explicitly focus on AI systems or their impact. It primarily discusses appropriations and efforts related to combating human trafficking and funding requests for various programs within the Department of Homeland Security. While there may be implied references to data and processes relevant to public safety, there are no direct mentions of AI technology, algorithms, or other related terms that would support categorizing it under any of the defined categories. Therefore, it does not meet the thresholds for relevance in Social Impact, Data Governance, System Integrity, or Robustness.
Sector: None (see reasoning)
The text relates solely to appropriations for the Department of Homeland Security and does not discuss the use of AI in any of the specified sectors, such as politics, healthcare, or education. It focuses on budget requests to combat human trafficking and enhance public safety, which may indirectly touch on sectors like Government Agencies and Public Services but does not involve AI technologies or regulatory considerations tied to these sectors. This lack of explicit AI context results in a score of 1 across all sectors.
Keywords (occurrence): automated (1)
Summary: The bill relates to visibility protection in Arizona, addressing air quality deterioration. It mandates approval of procedures to safeguard visibility in federally designated areas and incorporates specific regulations for emissions from local sources.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The provided text from the Environmental Protection Agency primarily focuses on regulations regarding air quality and emissions, particularly in relation to stationary sources in Arizona. It does not mention topics directly related to Artificial Intelligence (AI), such as automated decision-making or algorithms, which are crucial for the relevance of the categories outlined. Given the absence of references to AI technologies or implications regarding their application in environmental regulations, none of the categories related to social impact, data governance, system integrity, or robustness can be substantiated as relevant to this legislation.
Sector: None (see reasoning)
Similarly, the text does not reference AI applications or regulations in any of the sectors. It focuses purely on air quality regulations without engaging with the complexities of AI in sectors such as healthcare, government services, or any of the others mentioned. The lack of direct relevance to AI means that all sector categorizations receive a low relevance score.
Keywords (occurrence): automated (4)
Summary: This bill establishes guidelines for the TSCA mammalian erythrocyte micronucleus test, which detects chromosome damage in mammals to assess mutagenic hazards from substances.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily outlines the TSCA mammalian erythrocyte micronucleus test, which is focused on cytogenetic damage detection rather than on the implications or impacts of AI technology. It does not directly engage with themes traditionally associated with AI regulation, such as fairness, bias, or misinformation. Thus, it lacks significant relevance to the categories of Social Impact, Data Governance, System Integrity, and Robustness, which all involve aspects of AI's role in society, the ethical handling of data, or the robustness and security of AI systems.
Sector: None (see reasoning)
The text does not relate to specific sectors defined under the use and regulation of AI. The outlined testing methodology is clearly within the realm of toxicology and genetic impact assessment, which does not intersect with political activities, healthcare, AI's role in business, or academic research. Therefore, it is deemed irrelevant to all listed sectors.
Keywords (occurrence): automated (2)
Summary: The bill outlines recordkeeping requirements for employers under the Family and Medical Leave Act (FMLA), detailing penalties for non-compliance and ensuring employee leave records are maintained and accessible.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text concerns recordkeeping requirements under the Family and Medical Leave Act (FMLA), primarily focusing on obligations of employers to maintain employee records. The explicit mention of 'automated data processing memory' hints at potential use of AI or automated systems for record management, but does not delve into broader implications or applications of AI. Thus, none of the categories directly apply with significance as the focus is more on regulatory compliance than on AI impact, governance, integrity, or robustness. However, there is a marginal relevance to Data Governance regarding the protocols on recordkeeping, given the management of personal employee data and potential for automation in these processes. This relevance is minimal and stems largely from the mention of automated data systems rather than a direct focus on data management or governance practices.
Sector:
Private Enterprises, Labor, and Employment (see reasoning)
The text does not discuss any specific use of AI within sectors such as politics, healthcare, or any of the other sectors listed. It is primarily a regulatory text dealing with recordkeeping mandates for employers under the FMLA, without specific mention of AI applications or impacts in those contexts. The closest connection stems from the reference to electronic records in an employment context, but this is not substantial enough to tie it to a specific sector for meaningful categorization.
Keywords (occurrence): automated (1)
Summary: The bill mandates disclosure of insider trading arrangements and policies by registrants, detailing practices for hedging and trading by employees, officers, and directors to ensure compliance with securities laws.
Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily outlines rules and regulations concerning insider trading, specifically detailing how registrants should disclose practices related to hedging and insider trading policies. It also mentions algorithms and computer programs intended for trading arrangements, indicating some level of automation. However, there is no substantive discussion of AI, its social impact, data governance, system integrity, or performance benchmarks. Given the lack of direct relevance to AI's broader societal implications, data management practices, AI system security and transparency, or performance standards, this text does not significantly pertain to any of the categories defined.
Sector: None (see reasoning)
The text focuses on insider trading regulations, which do not classify under any specific sector concerning AI, politics, government services, judicial systems, healthcare, or others described. It does introduce terms like 'algorithm' and 'automated trading arrangements', but they relate more to financial regulations than to the sectors defined. Consequently, the content is not relevant enough to any specific sector in the context of AI application or regulation.
Keywords (occurrence): automated (1)
Summary: The bill outlines requirements for recording and maintaining product testing results for railroads, including specific documentation protocols and the use of electronic recordkeeping. Its purpose is to ensure safety and accountability in railroad equipment testing and maintenance.
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Data Governance (see reasoning)
The text primarily discusses the procedural requirements for recording product testing results and maintaining records related to railroad safety. The references to 'automated tracking systems' and 'electronic recordkeeping' suggest a focus on technology in record maintenance, but there is no specific mention of AI-related technologies or their implications on society, data governance, system integrity, or robustness. Therefore, while there is mention of automation in terms of tracking systems, it does not strongly connect to AI algorithms or decision-making processes that would elevate its relevance to any of the categories significantly. Hence, the text is somewhat relevant in the context of Data Governance regarding electronic records, but not significantly enough for the other categories.
Sector: None (see reasoning)
The text relates to rail safety and maintenance protocols, with mentions of automated systems used for tracking and recording tests. However, it does not specifically focus on the implications or applications of AI within any governmental sector, industry, or social structure. The focus remains on railroad safety standards and recordkeeping without direct implications for the specified sectors. As such, there is only minimal relevance to Government Agencies and Public Services, owing to the tracking systems involved, which may affect public safety indirectly.
Keywords (occurrence): automated (6)
Summary: The bill establishes regulations for on-shift examinations in mines, requiring certified personnel to assess safety conditions, monitor air quality, and verify compliance with health standards to enhance miner safety.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text focuses predominantly on safety regulations and procedures specific to mining operations. While it mentions methods of examination and monitoring for hazardous conditions, which could imply some automated systems, there is no explicit mention of AI, algorithms, or related concepts. Therefore, it does not squarely fit within the categories that deal with AI's social impact, governance, integrity, or robustness as there is a lack of reference to AI technologies or their implications within those contexts.
Sector: None (see reasoning)
The text appears to relate to mining safety protocols and does not address any specific application of AI within the mining industry, nor does it touch upon how AI may influence the procedures outlined. Thus, each sector receives a score of 1 due to absence of relevance to AI applications in that specific sector.
Keywords (occurrence): automated (1)
Summary: The bill establishes a cost recovery fee program for Community Development Quota (CDQ) groups managing groundfish and halibut, detailing application processes and payment responsibilities to ensure compliance.
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text provided largely discusses regulations and applications regarding CDQ (Community Development Quota) programs related to fishing operations. It mostly focuses on administrative processes, obligations of CDQ group representatives, fee collection, determination of standard prices, and reporting. There are no explicit mentions or implications of AI-related technology within the text. Therefore, relevance to the categories of Social Impact, Data Governance, System Integrity, and Robustness is minimal to nonexistent. None of the categories applies meaningfully, since the content does not engage with AI, nor does it address concerns such as data management or societal impact through AI systems.
Sector: None (see reasoning)
This text does not deal with any of the specified sectors directly. It focuses on fishing regulations and the management of the CDQ program rather than the use or regulation of AI in any sector like Politics and Elections, Government Agencies and Public Services, Healthcare, etc. As such, all sector scores are also scored as not relevant.
Keywords (occurrence): automated (1)
Summary: The bill establishes regulations for time-sharing satellite communication in the 400.15-401 MHz band between Department of Defense meteorological satellites and non-voice, non-geostationary mobile satellite systems to prevent harmful interference.
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Data Governance
System Integrity (see reasoning)
This text primarily deals with the technical regulations governing the use of frequency bands for satellite communications, specifically the interaction between non-voice, non-geostationary mobile satellite systems (NVNG MSS) and Department of Defense (DoD) satellites. It doesn’t discuss the broader social implications or ethical considerations of AI; thus, the Social Impact category scores low. Data Governance is somewhat relevant due to the mention of system compliance and operational communication pertaining to satellite performance, but it does not focus explicitly on data governance principles or practices, receiving a moderate score. System Integrity is relevant due to the requirements for operational security and the capability for immediate shutdown of satellite transmissions, reflecting concerns about system security and oversight. Robustness is not notably applicable in this context as there are no explicit references to benchmarks for performance or standards of auditing related to AI technologies. Overall, the absence of AI-specific references significantly lowers the scores across all categories.
Sector: None (see reasoning)
The text primarily concerns the regulation of satellite communications and does not delineate or emphasize AI usage in specific sectors such as politics, healthcare, or others. While there is a mention of operational compliance, it does not pertain to the judicial system or to government services directly in a way that links AI applications. As it discusses interactions related to the Department of Defense, it could potentially touch on Military applications, but this is not explicitly stated. The document lacks a strong connection to most of the sectors outlined, resulting in low overall scores across the board.
Keywords (occurrence): algorithm (2)
Summary: The bill outlines the retained authorities of the EPA Administrator regarding hazardous medical waste incinerators, including approval of compliance methods and alternatives, ensuring environmental regulations are maintained.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily pertains to regulations regarding hazardous waste incineration and the authorities retained by the EPA administrator. There is no explicit mention or relevance of AI technologies, algorithms, or any aspects that would fall under the categories of Social Impact, Data Governance, System Integrity, or Robustness. The focus is entirely on environmental regulations and compliance standards. Therefore, it is not relevant for any of the AI-related categories.
Sector: None (see reasoning)
The text addresses specific administrative powers of the EPA with respect to hazardous waste incineration. There is no relevance to any sector involving AI, such as Politics and Elections, Government Agencies and Public Services, etc. The content does not touch on the usages of AI in any of the listed sectors, making it irrelevant as well.
Keywords (occurrence): automated (1)
Summary: The bill establishes safety and mechanical requirements for passenger train brake systems, ensuring effective operation, emergency functionality, and protection against thermal damage to enhance passenger safety.
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text pertains primarily to the technical specifications and operational requirements of train brake systems. It does not explicitly mention AI or related concepts such as algorithms, machine learning, or automated decision-making systems. Therefore, it is not relevant to any of the AI-related categories, which focus on the societal impacts, data governance, system integrity, and robustness of AI technologies. The content is strictly about mechanical and safety systems within the context of railroad regulations, suggesting a score of 1 (not relevant) across all categories.
Sector: None (see reasoning)
The text also lacks direct relevance to any specific sectors related to AI. It speaks to regulations about train systems, which fall outside the scope of politics, government, healthcare, and other predefined sectors dealing with AI applications. The focus is strictly on operational safety and design standards for braking systems in passenger trains. Thus, it results in a score of 1 (not relevant) for all sectors.
Keywords (occurrence): automated (1)
Summary: The bill establishes effluent limitations and pretreatment standards for wastewater discharges from new sources and liquid detergents manufacturing, aiming to regulate pollution and protect water quality.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily focuses on environmental and effluent limitations related to the manufacture of liquid detergents. It does not contain specific references to artificial intelligence (AI) or related technologies. While there is mention of automated fill lines, this refers to operational processes rather than AI systems. Therefore, the text does not sufficiently address issues related to the social impact of AI, data governance, system integrity, or robustness. As such, all four categories are deemed not relevant to the content provided.
Sector: None (see reasoning)
The text mainly deals with regulatory standards for effluent discharges in manufacturing processes and does not reference any specific sectors such as politics, healthcare, or education in relation to AI. The mention of automated processes does not imply a broad application of AI in these sectors. Therefore, all sectors are considered not relevant to the content of the provided legislation.
Keywords (occurrence): automated (3)
Summary: The bill establishes vetting requirements for key individuals in indefinite delivery contracts to ensure compliance with USAID standards, enhancing accountability and security in government-funded projects.
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text mainly discusses the partner vetting process in indefinite delivery contracts, including requirements for key individuals involved and the mechanisms for their vetting. There are no explicit mentions or direct implications relating to AI technologies such as algorithms, machine learning, or automated decisions. Consequently, relevance to social impact, data governance, system integrity, and robustness is weak as the text does not address governance of data in AI systems, nor does it illustrate any AI-driven impacts on society, data management, system security, or performance benchmarks specific to AI. Therefore, the overall relevance to AI-related categories is minimal, warranting lower scores.
Sector: None (see reasoning)
The text predominantly relates to contract management and procurement processes within the context of USAID, focusing on vetting processes rather than the application or management of AI. The lack of AI relevance means it does not fit strongly into any of the defined sectors such as politics and elections, government agencies, or healthcare. There is limited reference to the use or implications of AI in the performance of governmental or public service activities, judicial practices, healthcare, or labor dynamics, thus scoring low across all sectors.
Keywords (occurrence): automated (1)
Summary: This bill outlines the procedure for assessing the capture efficiency of emissions from automotive spray booths using panel testing, aiming to ensure effective control of volatile organic compounds (VOCs) in solvent-borne coatings.
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily discusses the technical standards and procedures related to emissions capture in automotive spray booths, focusing specifically on the measurement and control of volatile organic compounds (VOCs), rather than AI. It does not reference artificial intelligence or related technologies explicitly or implicitly, and it is primarily aimed at environmental compliance for emissions control. Thus, the relevance to the provided categories is minimal. None of the sections mention issues related to social impact, data governance, system integrity, or robustness in the context of AI technology.
Sector: None (see reasoning)
The text does not pertain explicitly to any sector relevant to AI. It focuses on emission control in automotive spray booths and environmental standards, rather than AI applications across the specified sectors. There are no references to political usage of AI, government services leveraging AI, or AI implications in healthcare. Given that the text centers on emissions regulation without any intersection with AI sectors, it holds no relevance to any of the specified areas.
Keywords (occurrence): automated (1)