4161 results:
Collection: Congressional Record
Status date: Jan. 24, 2023
Status: Issued
Source: Congress
The Chance to Compete Act of 2023 primarily focuses on the reform of the civil service hiring system. While it touches on assessments and hiring processes, which could involve automated decision-making systems, there is no explicit mention of AI technologies themselves. The bill emphasizes skills- and competency-based assessments over traditional educational qualifications, hinting at algorithmic processes for evaluation. However, it does not discuss overarching impacts relevant to AI in society (Social Impact), issues surrounding data handling and governance related to AI systems (Data Governance), integrity issues in AI technologies (System Integrity), or performance benchmarks for AI (Robustness). The language does not engage with the implications of AI on societal structures nor the systemic integration of AI in hiring practices. Therefore, I would assess the relevance of each category as rather low. Each category receives a score of 1, indicating a lack of significant relevance to AI topics.
Sector: None (see reasoning)
The text does not explicitly address the utilization of AI within any specific sector, though it could tangentially relate to the use of AI in Government Agencies and Public Services through the automation of hiring assessments. Still, no direct connection is made with AI usage in public services or the implications thereof. Given the focus on traditional hiring practices rather than technology-driven processes, the text lacks sufficient detail to score higher. Consequently, all sectors receive a score of 1, indicating minimal to no relevance.
Keywords (occurrence): automated (1)
Collection: Congressional Record
Status date: Jan. 26, 2023
Status: Issued
Source: Congress
The text primarily consists of a list of executive communications referring various District of Columbia acts to the Committee on Oversight and Accountability. The acts mentioned, such as the 'Automated Traffic Enforcement System Revenue Designation Amendment Act of 2022', imply some level of relevance to automated systems, which could align with discussions around AI. However, there are no explicit mentions of AI or automated systems that would substantively relate to the defined categories regarding social impact, data governance, system integrity, or robustness. Although 'automated' could suggest an application related to AI systems, the text does not elaborate on any ethical considerations or technical specifics necessary for strong category alignment. Thus, while there is a tenuous connection, it is insufficient for meaningful categorization in any of the four defined categories.
Sector: None (see reasoning)
The document details various acts that are communicated from the Chairman of the District of Columbia Council to the Committee on Oversight and Accountability. While some acts might relate to broader sectors, such as automated traffic systems potentially relevant to Public Services, the lack of explicit discussion around AI technologies in the context of Politics and Elections, Government Agencies, or any defined sector limits their relevance. Therefore, the scores reflect that there is slight relevance, but no specific sectors are sufficiently addressed in the text.
Keywords (occurrence): automated (1)
Collection: Congressional Record
Status date: Jan. 9, 2023
Status: Issued
Source: Congress
This text does not explicitly mention artificial intelligence (AI) or related technologies such as algorithms, machine learning, or automation. Instead, the focus is primarily on budgetary allocations and the operational strategies of the Internal Revenue Service (IRS). While there are mentions of technology in a broader context, such as IRS modernization efforts, this does not directly tie into the categories defined, which require a more active engagement with AI concepts. Therefore, all categories related to the impact of AI on society, data governance within AI systems, system integrity regarding AI technologies, and the robustness of AI performance benchmarks are deemed not relevant.
Sector: None (see reasoning)
The text primarily deals with the IRS and tax laws rather than the sectors related to AI application. Although there is a mention of modernization and technology in relation to IRS operations, it does not connect to how AI technologies might be employed, or the implications thereof, in the political, governmental, or corporate sectors. The focus remains on fiscal policy, government funding, and the practicalities of tax administration without reference to the sectors defined in relation to AI. As a result, none of the sectors are relevant; the text reflects government operations rather than AI deployment or ethics.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: Jan. 1, 2024
Status: Issued
Source: Office of the Federal Register
The provided text primarily focuses on the guidelines, requirements, and definitions related to polygraph examinations within the Department of Energy and does not directly address AI systems, their impacts, governance, integrity, or robustness. It discusses aspects of counterintelligence evaluations and the rights of individuals involved, without any indication of AI or related technologies. Therefore, all categories are scored as 1, indicating not relevant.
Sector: None (see reasoning)
The text is oriented towards polygraph protocols and counterintelligence within the DOE and does not address any application or regulation of AI technologies within sectors like politics, healthcare, or public services. Thus, all sectors score a 1 for not relevant.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: Jan. 1, 2024
Status: Issued
Source: Office of the Federal Register
The text primarily consists of definitional clauses related to small business investment regulations. It does not contain explicit references to Artificial Intelligence, algorithms, or other related terminologies that would typically align it with the categories defined. There is no discussion that would indicate an impact on society, data governance, system integrity, or robustness concerning AI technologies. Thus, none of the categories are relevant to any significant degree.
Sector: None (see reasoning)
The text contains definitions relevant to small business investment and various financial instruments and does not address the use or regulation of AI within any specific sector such as politics, government, healthcare, etc. There are no indicators of AI application in any mentioned sectors. Hence, all sectors are rated as non-relevant.
Keywords (occurrence): automated (1)
Collection: Congressional Record
Status date: Dec. 13, 2023
Status: Issued
Source: Congress
The text primarily deals with legislative discussions around the National Defense Authorization Act (NDAA) and broader political issues rather than specifically addressing the social impacts of AI. It mentions border security and military aid but does not discuss AI's societal effects or related regulations. Therefore, the category Social Impact is not relevant. Data governance aspects are not addressed either, as there are no mentions of data collection, management, or biases concerning AI systems, making Data Governance not relevant. The text does not dive into system security, transparency, or control of AI systems, signaling that System Integrity is also not relevant. Finally, there are no references to benchmarks, performance standards, or oversight requirements for AI systems, leading to Robustness being considered not relevant. Overall, the text lacks direct connections to any of the AI-related categories.
Sector: None (see reasoning)
The text primarily revolves around defense and border security topics and does not reference the use of AI in any specific sectors such as politics, healthcare, or employment. There are no mentions of AI application or legislation concerning government services, judicial systems, healthcare, or academic institutions, leading to all sector categories being deemed irrelevant. The focus on legislative negotiations and military funding does not intersect with any of the AI sectors outlined, indicating a disconnect. As a result, the text does not fit into any of the provided sectors.
Keywords (occurrence): chatbot (8)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily defines terms related to the Shipping Act and the functioning of common carriers in maritime transportation. It does not mention AI or any relevant concepts that connect to its categories such as Social Impact, Data Governance, System Integrity, or Robustness. Therefore, the relevance to AI-related aspects is non-existent.
Sector: None (see reasoning)
The text deals with the Shipping Act and definitions concerning maritime law and the obligations of common carriers. It does not address any specific issues related to the sectors outlined. There are no references to the use or regulation of AI in Politics and Elections, Government Agencies, the Judicial System, Healthcare, Private Enterprises, Academic Institutions, International Standards, Nonprofits, or any Hybrid or Emerging sectors.
Keywords (occurrence): automated (2)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Data Governance
System Integrity (see reasoning)
The text primarily discusses cloud computing services, cybersecurity requirements, and data management for government contractors. While it does contain references to automated audits and scans, which might suggest some relevance to the System Integrity and Data Governance categories, there is a lack of explicit AI context, such as discussions on algorithms, decision-making processes, or automated decision-making that directly involve AI technologies. Thus, the relevance of the categories is assessed based on the mentions of security and data management rather than AI-specific attributes.
Sector:
Government Agencies and Public Services (see reasoning)
The text is relevant to Government Agencies and Public Services as it outlines the requirements for cloud computing for government contracts, focusing on security, data management, and compliance with federal standards. Despite lacking specific AI mentions, it does relate to government operations and the need for secure data handling. Other sectors like Healthcare, Private Enterprises, and others are not applicable since AI usage or regulations aren’t explicitly addressed.
Keywords (occurrence): automated (3)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Data Governance
System Integrity (see reasoning)
This text primarily concerns the processes involved in calculating the Star Ratings for Medicare and Medicaid services. It details the methods for adding, updating, and removing measures, which involves data assessments and potential adjustments that seemingly rely on statistical methods and algorithms, albeit indirectly related to AI. However, the reference to 'clustering algorithms' hints at the use of algorithmic mechanisms that can be associated with AI. The main emphasis seems to be on data governance and integrity regarding health performance metrics rather than direct AI applications, indicating a need for careful consideration of data sources and processing methods, which encompasses the Data Governance category. The System Integrity category is also relevant as it addresses accuracy, reliability, and validity of measures and performance data, which can implicitly relate to controls and security procedures in AI, but not with enough explicit relevance to score highly. Social Impact seems less relevant, as while the outcomes may affect population health, the document does not discuss societal implications directly. Robustness is not significantly addressed since there is no mention of AI benchmarks or performance metrics in AI systems specifically.
Sector:
Healthcare (see reasoning)
The content of this text is closely related to the healthcare sector as it discusses the Star Ratings system for Medicare and Medicaid, focusing on performance metrics, data integrity, and management strategies which are significant for healthcare-based AI applications and evaluations. The discussions surrounding data collection and performance measures are specifically relevant to healthcare systems. Although the text does not address AI utilization in political or legal systems, the healthcare-specific focus aligns it closely with this sector. There is little to no mention of the other sectors outlined, thus they are rated as not relevant.
Keywords (occurrence): algorithm (2)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity (see reasoning)
The text primarily addresses regulations and requirements surrounding the administration of Medicare Part D contracts, specifying functions related to prescription drug plans, auditing requirements, and compliance measures. Given the absence of any explicit mention or focus on artificial intelligence, this text largely lacks relevance to the predefined categories. However, some hints point towards potential data management and system integrity aspects, but these are not central to the outlined provisions. The main concerns revolve around operational requirements, compliance, and auditing rather than AI's impact or applications, indicating a low level of relevance overall.
Sector: None (see reasoning)
The text pertains mainly to the Medicare Part D program's operational rules, rather than any specific use or regulation of AI in notable sectors. While there are mentions of data management related to auditing and verification processes, there is no direct discussion of AI applications in the healthcare context. Therefore, it is not closely aligned with any specific sector but has slight relevance to governmental operations due to its nature as regulation within a federal health program. However, this relevance is not significant, leading to low scores across sectors.
Keywords (occurrence): automated (2)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Social Impact
System Integrity (see reasoning)
The text discusses the automation of vital systems on self-propelled vessels. This clearly pertains to system integrity as it establishes regulations and requirements to ensure the safe and reliable automation of these systems. The mention of automated systems replacing personnel or reducing crew requirements suggests a potential social impact, particularly regarding labor implications. However, no elements related to data governance or robustness are directly addressed, leading to lower relevance scores for those categories. Overall, the text primarily aligns with system integrity and social impact due to its focus on the regulation and safety of automated systems on vessels.
Sector:
Hybrid, Emerging, and Unclassified (see reasoning)
The text primarily pertains to regulations for automated systems in the maritime sector, particularly focusing on vessel safety and operation. Although it mentions automation, it does not specifically indicate an application or regulation of AI technologies. However, the automation referenced could be indirectly related to AI systems. As such, while its direct implications for sectors like government agencies or public services are minimal, it provides a framework for operational safety in the context of maritime services. This leads to a moderate score overall for sectors like Hybrid, Emerging, and Unclassified but a lower score for other specific sectors mentioned.
Keywords (occurrence): automated (6)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily addresses contract reporting and file management within the context of the Department of Defense, specifically concerning the Federal Procurement Data System (FPDS). Although it mentions automated reporting processes, there is no explicit reference to AI, machine learning, algorithms, or any related technology that would align it with issues of social impact, data governance, system integrity, or robustness. The automation discussed is procedural and does not hint at the use of AI technology or frameworks. Therefore, all categories are scored as low relevance.
Sector: None (see reasoning)
This legislation does not specifically address the use or regulation of AI across any relevant sectors. It focuses on reporting and data management within defense contracting. As such, it does not fall under Politics and Elections, Government Agencies and Public Services, or any other specified sector dealing with AI applications. Therefore, all sector scores are also assigned low relevance.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text pertains primarily to the technical specifications and operational requirements of train brake systems. It does not explicitly mention AI or related concepts such as algorithms, machine learning, or automated decision-making systems. Therefore, it is not relevant to any of the AI-related categories, which focus on the societal impacts, data governance, system integrity, and robustness of AI technologies. The content is strictly about mechanical and safety systems within the context of railroad regulations, suggesting a score of 1 (not relevant) across all categories.
Sector: None (see reasoning)
The text also lacks direct relevance to any specific sectors related to AI. It speaks to regulations about train systems, which fall outside the scope of politics, government, healthcare, and other predefined sectors dealing with AI applications. The focus is strictly on operational safety and design standards for braking systems in passenger trains. Thus, it results in a score of 1 (not relevant) for all sectors.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text provided primarily describes procedural regulations related to procurement instrument identifiers (PIIDs) and does not make explicit references to Artificial Intelligence (AI) or any of its related terminologies. Therefore, it does not clearly address any social impact of AI, data governance issues surrounding AI, system integrity related to AI processes, or robustness in AI performance metrics. Given that there are no mentions of AI technology or its implications, all categories are rated low on relevance.
Sector: None (see reasoning)
The text discusses procurement procedures related to government contracts and identifiers but does not mention AI applications in any of the specified sectors. Furthermore, it lacks content discussing regulations specific to politics, government services, healthcare, or any other domains listed. Thus, it is not relevant to the sectors presented, and each sector is rated as not relevant.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text provided mainly focuses on the definitions and guidelines associated with the CHAMPUS (Civilian Health and Medical Program of the Uniformed Services) program, which pertains to the delivery and reimbursement of healthcare services to military beneficiaries. It does not explicitly refer to AI technologies or their implications. However, various sections touch on healthcare practices, procedural definitions, and regulations related to care management, which may indirectly relate to AI applications in the future, e.g., through healthcare automation. Nevertheless, these implications are too generic to warrant strong relevance. Therefore, the categories related to social impact, data governance, system integrity, and robustness receive low scores as they don't explicitly address AI-related issues or legislation pertinent to the categories' intended scope.
Sector: None (see reasoning)
The text revolves around regulations and definitions pertinent to the CHAMPUS program, specifically dealing with processes and procedures concerning military healthcare services. Only the healthcare sector can be loosely connected to the text through the context of providing medical care to military beneficiaries. However, the absence of specific mention of AI technologies or their applications in healthcare means that the relevance to the healthcare sector is still very limited. Thus, it receives a very low score.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The provided text predominantly pertains to Environmental Protection Agency (EPA) regulations regarding the Massachusetts State Implementation Plan (SIP). It does not include any explicit references to Artificial Intelligence or its applications, nor does it discuss implications related to the social impact of AI. While one could stretch to consider the implications of automation in environmental compliance, the text itself does not make any mention of AI systems or their governance, thus making it largely irrelevant to the specified categories of Social Impact, Data Governance, System Integrity, and Robustness.
Sector: None (see reasoning)
The text focuses exclusively on environmental regulation compliance and does not engage with any of the sectors identified. There are no discussions of AI-specific regulations or applications impacting politics, public services, healthcare, employment, academia, or international standards. Thus, every sector categorization receives a score of 1, signifying no relevance whatsoever.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily focuses on environmental and effluent limitations related to the manufacture of liquid detergents. It does not contain specific references to artificial intelligence (AI) or related technologies. While there is mention of automated fill lines, this refers to operational processes rather than AI systems. Therefore, the text does not sufficiently address issues related to the social impact of AI, data governance, system integrity, or robustness. As such, all four categories are deemed not relevant to the content provided.
Sector: None (see reasoning)
The text mainly deals with regulatory standards for effluent discharges in manufacturing processes and does not reference any specific sectors such as politics, healthcare, or education in relation to AI. The mention of automated processes does not imply a broad application of AI in these sectors. Therefore, all sectors are considered not relevant to the content of the provided legislation.
Keywords (occurrence): automated (3)
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
This text appears to be primarily focused on monetary transaction systems and their regulatory exemptions, particularly relating to payment processing via electronic means such as automated clearing houses and wire transfers. It is devoid of direct references to Artificial Intelligence and does not discuss social impact, data governance, system integrity, or robustness as they relate to AI. Therefore, I find it irrelevant to all AI-related categories.
Sector: None (see reasoning)
The content of the text does not address the application, regulation, or implications of AI within any sector described. Instead, it centers on financial transaction systems and their regulatory environment. There is no mention of sectors like Politics and Elections, Government Agencies and Public Services, Judicial System, Healthcare, Private Enterprises, Academic Institutions, International Cooperation, Nonprofits, or Hybrid sectors. Thus, all sectors are deemed irrelevant.
Keywords (occurrence): automated (2)
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text concerns recordkeeping requirements under the Family and Medical Leave Act (FMLA), primarily focusing on obligations of employers to maintain employee records. The explicit mention of 'automated data processing memory' hints at potential use of AI or automated systems for record management, but does not delve into broader implications or applications of AI. Thus, none of the categories directly apply with significance as the focus is more on regulatory compliance than on AI impact, governance, integrity, or robustness. However, there is a marginal relevance to Data Governance regarding the protocols on recordkeeping, given the management of personal employee data and potential for automation in these processes. This relevance is minimal and stems largely from the mention of automated data systems rather than a direct focus on data management or governance practices.
Sector:
Private Enterprises, Labor, and Employment (see reasoning)
The text does not discuss any specific use of AI within sectors such as politics, healthcare, or any other sectors listed. It is primarily a regulatory text that deals with recordkeeping mandates for employers under the FMLA, without specific mention of AI applications or impacts in the specific contexts. The closest connection stems from the reference to electronic records in an employment context, but this is not substantial enough to tie it to a specific sector for meaningful categorization.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register
The text does not contain specific references to AI or related technologies. It focuses on environmental regulations regarding monitoring systems for emissions and performance testing requirements. Although there are contexts in which AI could be utilized in monitoring systems, the text does not address any AI implications or regulations related to social impact, data governance, system integrity, or robustness directly. Therefore, it is not suitable for any of the categories provided.
Sector: None (see reasoning)
Similar to the category assessments, the text does not pertain to any specific sectors listed. It deals primarily with environmental monitoring and regulatory compliance, which does not align with the sectors outlined. There is no mention of AI usage in politics, healthcare, public services, or any other sector, thus rendering it irrelevant to these categories.
Keywords (occurrence): automated (1)