4162 results:


Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses criteria for case closure related to child support obligations and does not explicitly relate to any AI systems or methodologies. While the mention of 'automated locate effort' hints at possible algorithmic or automated processes, the overall content does not predominantly address AI's societal impact, data governance, system integrity, or robustness. Hence, the relevance of this text to the AI-centric categories is minimal.


Sector: None (see reasoning)

The text contains provisions specific to the IV-D agency and primarily concerns child support cases, which do not involve the application or regulation of AI. There is no reference to political campaigns or elections, government services through AI applications, judicial uses of AI, healthcare integration of AI, employment impacts, academic regulation, or international standards. Thus, the relevance to the sectors listed is negligible.


Keywords (occurrence): automated (3)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text predominantly deals with requirements for emergency calling services over the telecommunications system, specifically the obligations of VoIP providers to ensure accessible and reliable emergency services (E911) for users. It does not explicitly address Artificial Intelligence or its social implications, data management processes, system integrity, or performance benchmarks. The core subject matter is telecommunications regulation; the underlying systems may use algorithms for call routing, but the text contains no specific references to the use or governance of AI technology itself. Hence, relevance to the defined categories is minimal.


Sector: None (see reasoning)

The text relates to telecommunications, primarily focusing on regulations for emergency calling through various telecommunication relay services. While these services rely on technology, the text does not connect, even peripherally, to any sector involving AI applications. The references are strictly about service implementation and operational requirements among telecommunications service providers and do not address the sectors of politics, government, healthcare, or the others listed. Therefore, it receives low relevance across all sectors.


Keywords (occurrence): automated (4)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text does not explicitly mention any terms or concepts relating to AI. It primarily discusses payment procedures and contract management, including electronic funds transfer, performance-based payments, reporting, and contractor obligations. None of these aspects involve AI technologies, their social impact, data governance, integrity of systems, or robustness measures. Hence, all categories would be deemed irrelevant.


Sector: None (see reasoning)

The text focuses entirely on electronic payment processing and contractual obligations without delving into the use of AI in any sector. There are no references to political processes, government services, judicial matters, healthcare applications, employment practices, academic institutions, international cooperation, nonprofits, or the hybrid/emerging sectors that use AI. Therefore, all sectors score as irrelevant.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

This text involves procurement and sales reporting related to environmental attributes and industrial funding fees but lacks any explicit references to AI technologies, applications, or governance. While discussions about automated processes in reporting could tangentially relate to 'Automation' or 'Data Governance', the text primarily concerns contractual obligations and sales processes without any clear implications or regulations regarding AI systems or their impacts. Thus, all categories will receive low relevance scores as they do not pertain directly to AI issues.


Sector: None (see reasoning)

The text discusses government procedures and regulations involved in sales reporting under funding programs, but it does not directly address AI applications within these structures. Although aspects of data management are covered, they apply primarily to procurement and reporting practices rather than to sectors like healthcare, political campaigns, or the rights of individuals affected by AI systems. Thus, relevance to any sector is very limited, touching on data governance only in a shallow way.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance (see reasoning)

The text revolves around the establishment and management of quiet zones for railroad crossings, focusing on risk reduction measures known as Alternative Safety Measures (ASMs). The relevance of the categories is assessed as follows: The Social Impact category is only slightly relevant as it touches upon public safety but does not explicitly deal with broader societal implications like accountability, discrimination, or consumer protections in the context of AI. Data Governance is moderately relevant due to the need for accurate data collection, monitoring, and analysis to determine the effectiveness of ASMs and their impact on safety. System Integrity does not apply well in this context, as the focus is on physical infrastructure and engineering metrics rather than the inherent security or transparency of AI systems. Robustness is not relevant since the text does not discuss performance benchmarks or auditing of AI systems. Therefore, the legislation primarily aligns with Data Governance.


Sector:
Government Agencies and Public Services (see reasoning)

The text pertains specifically to safety measures in the railroad sector and their implementation regarding risk management. For the sectors, Politics and Elections is not relevant as there is no mention of electoral processes or political campaigns. Government Agencies and Public Services is very relevant, as the legislation involves public authorities seeking to implement safety measures at railroad crossings. The Judicial System does not apply since neither legislative nor judicial use of AI is mentioned. Healthcare; Private Enterprises, Labor, and Employment; and Academic and Research Institutions are also not applicable, as they do not relate to the text's focus on transportation safety. International Cooperation and Standards is not relevant since there is no mention of cross-border cooperation. Nonprofits and NGOs and Hybrid, Emerging, and Unclassified sectors do not fit the text, which is specific to transportation. Therefore, Government Agencies and Public Services is the most fitting sector.


Keywords (occurrence): automated (4)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

This text primarily consists of procedural rules and definitions related to the Transportation Security Administration and does not explicitly address AI or its implications. Keywords associated with AI such as 'algorithm', 'automated', or 'machine learning' do not appear. Therefore, it lacks substantial relevance to the Social Impact, Data Governance, System Integrity, or Robustness categories, as there are no discussions around AI’s societal impact, data management, system security, or performance benchmarks.


Sector: None (see reasoning)

The text mainly details the definitions and terms relevant to the transportation sector, particularly regarding security protocols and regulations by the Transportation Security Administration. There is a lack of content specific to sectors related to politics, government services, healthcare, private enterprises, and others that involve AI regulation or applications. The focus remains on administrative and procedural terminology rather than direct AI applications, resulting in very low relevance across all defined sectors.


Keywords (occurrence): automated (3)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
Data Governance
System Integrity (see reasoning)

The text discusses the development and implementation of CCWIS (Comprehensive Child Welfare Information System), which involves automated functions. The use of AI in this context may lead to significant social impacts, such as improving the efficiency of child welfare systems and strengthening consumer protections around these automated systems. However, the text does not explicitly address bias, fairness, or the influence of AI on public trust, which would be needed for higher relevance in the Social Impact category. Its focus on secure data exchanges and automated functions implies concerns around data governance, suggesting relevance to ensuring data accuracy and project oversight. Similarly, System Integrity is relevant, since the text discusses requirements for automation and accountability in design and function. Robustness does not strongly apply, as there is no mention of performance benchmarks or auditing processes for the systems described. Overall, the presence of automation and data governance concerns points to moderate relevance in the chosen categories.


Sector:
Government Agencies and Public Services (see reasoning)

The CCWIS project appears to target the public service sector, specifically child welfare services, involving automated functions to enhance efficiency and effectiveness of service delivery. The legislation outlines requirements that are specific to government operations, including data management and project funding which align it closely with the Government Agencies and Public Services sector. The content does not specifically address AI use in politics, judicial contexts, healthcare, or private enterprise, indicating low relevance in those areas. The text does not touch upon academic regulation, international standards, or nonprofit applications of AI either. However, given the focus on public welfare and government agency operation, the text is most pertinent to the Government Agencies and Public Services sector.


Keywords (occurrence): automated (4)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity (see reasoning)

The text primarily discusses procedural regulations related to child support orders and enforcement rather than Artificial Intelligence (AI) applications or implications. The one relevant point is the mention of 'automated methods' used for reviewing and adjusting child support orders. However, the focus remains largely on existing regulations and procedures for human oversight in enforcement. The mention of automated methods touches on automation but does not delve into the issues of fairness, bias, or societal impact closely tied to AI applications. Therefore, while the text has elements of automation, it does not engage sufficiently with the core interests of the provided categories to warrant a high relevance score.


Sector:
Government Agencies and Public Services (see reasoning)

The text, while it has some implications for governmental procedures, does not provide significant insight into the regulation of AI in critical sectors. It deals with child support procedural frameworks rather than with specific AI applications in any sector. The mention of 'automated methods' could hint at AI, but the overall context concerns human and governmental processes, which leads to lower scores across all sectors.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity (see reasoning)

The text primarily focuses on operational procedures and regulatory requirements for public coast stations and automated maritime telecommunications systems. While it mentions automation in the context of telecommunications, there is little direct linkage to the broader social impact, data governance, system integrity, or robustness of AI itself, so relevance to these categories is minimal. The presence of automated systems does imply an underlying framework for decision-making that could relate to System Integrity, but the specifics here do not support strong relevance.


Sector:
Government Agencies and Public Services (see reasoning)

The text discusses regulatory frameworks specifically related to automated maritime telecommunications systems, addressing operational compliance without delving into sectors such as politics, healthcare, or judicial contexts. It does touch on government regulation of public services via coast stations, but this application does not extend deeply into governmental operational frameworks involving AI. It is moderately relevant to the 'Government Agencies and Public Services' sector due to its regulatory nature.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text discusses obligations of telecommunications providers regarding the transmission of 911 calls and the provision of dispatchable location information. However, it does not explicitly reference any AI-related technologies, concepts, or practices. While automated provision of dispatchable location information may involve algorithmic elements, AI in the broader sense is not directly relevant, since the text focuses on telecommunications regulation rather than AI's societal implications, data governance, system integrity, or robustness. Thus, it lacks substantial relevance to the predefined categories concerning AI-related legislation.


Sector: None (see reasoning)

The text addresses obligations for telecommunications providers but does not specifically discuss the use of AI in political, governmental, judicial, healthcare, commercial, academic, international, or nonprofit contexts. The focus remains on emergency services and telecommunications regulations, rather than AI’s influence or application within these sectors. The mention of automated systems implies some technological reliance, but it does not directly tie into any specific sector's legislative needs. Hence, the overall relevance is minimal.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity
Robustness (see reasoning)

This text addresses the certification and operational requirements for automated systems in marine environments, focusing on safety control and the design of control systems. Given the discussion on automated systems and the requirement for certification, the relevance of the categories can be evaluated as follows: Social Impact is slightly relevant as it touches on safety but does not directly address broader social implications. Data Governance is not relevant as there is no mention of data collection, management or privacy issues. System Integrity gains some relevance as the text outlines the need for safety control systems and manual override capabilities in automated systems, indicating a focus on security and transparency. Robustness is relevant due to the emphasis on system performance requirements to ensure stability and safety in automated systems, which aligns with the need for benchmarks and compliance. Overall, the legislation is more focused on safety and control within automated systems rather than broader societal or data governance implications.


Sector:
Government Agencies and Public Services (see reasoning)

The text primarily addresses regulatory compliance for automated systems used in marine environments. While not directly linked to any one sector outlined, it has implications for several. It relates most clearly to Government Agencies and Public Services, as it involves compliance with Coast Guard requirements, and it pertains to safety controls relevant to the functioning of public-service systems. It does not involve judicial implications, healthcare applications, or employment issues, so those sectors do not receive high scores. Overall, the text bears relevance to Government Agencies and Public Services because it concerns compliance with government regulations for automated systems.


Keywords (occurrence): automated (3)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity (see reasoning)

The text discusses the automation of machinery spaces in vessels, focusing on the role of automated systems replacing or reducing crew members. The term 'automation' is relevant to AI as it relates to systems designed to operate with minimal human intervention, which is closely aligned with AI technologies. However, the text is more procedural and does not deeply explore the societal impacts, data governance specifics, or the robustness standards typically associated with AI development and deployment. Therefore, while it touches on AI-related themes through automation, the systemic implications of AI on society, data handling, or integrity are not explicitly addressed.


Sector:
Government Agencies and Public Services (see reasoning)

The text primarily pertains to regulations around marine vessels and their operational standards rather than to sectors explicitly tied to AI implementation, such as politics, healthcare, or civil services. The mention of automation may suggest an overlap with government agency operations in overseeing safety standards for vessels, but it does not specifically address governmental use or implications of AI in public service contexts. Therefore, relevance to the listed sectors is limited. The most relevant sector is government agencies, given the regulatory oversight exercised by the Coast Guard.


Keywords (occurrence): automated (3)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily details regulations governing the Low Power Radio Service (LPRS) and mentions aspects such as interference, operational procedures, and antenna requirements. However, there is no explicit mention of Artificial Intelligence, Machine Learning, Algorithms, or any other AI-related terms, so the relevance of this text to the provided categories is low. There are some implications regarding data and telecommunication systems that could tangentially relate to System Integrity and Data Governance; however, they do not explicitly connect to AI, which diminishes their relevance. Overall, the regulation does not address any direct implications or considerations for AI systems or technology.


Sector: None (see reasoning)

The text applies to telecommunications regulations specifically regarding the Low Power Radio Service (LPRS) and does not relate to the predefined sectors, which concern AI applications across different fields. There is no discussion of AI's role in politics, public services, healthcare, or the other areas outlined in the sectors. The text deals strictly with technical regulations governing communications, with no references to AI processes or implications, and falls outside every defined sector. Thus, all sectors score a 1 as not relevant.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

This text primarily focuses on the regulation of communication systems, particularly in maritime contexts, and does not explicitly address any topics related to AI. The discussion of automation in communication systems, while adjacent to AI, does not establish a clear link to AI's broader implications for society, data governance, system integrity, or robustness. Terms such as 'fully automated system' appear, but the text does not delve into concerns typically associated with AI, such as bias, transparency, or data management. Thus, the text's relevance to the defined categories is minimal.


Sector: None (see reasoning)

The text pertains largely to telecommunications and maritime safety protocols. While it mentions automated systems, which could implicitly relate to AI, it does not delve into applications or regulations specific to any sector like healthcare, public services, employment, or others defined in the categories. The focus remains strictly on communication protocols and regulations rather than the application of AI in any specific sector. Therefore, the relevance is low across the sectors.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text focuses primarily on regulations for the operation of broadcasting stations during emergencies, particularly in relation to the Emergency Alert System (EAS). While aspects of automation in broadcasting are discussed, such as unattended operation, which may imply the use of automated systems, there is no explicit reference to AI technologies or algorithms as defined in the provided keywords. Therefore, relevance to the categories is minimal. The operational aspects touch on compliance and monitoring, but they relate primarily to safety and technical standards rather than to the societal implications or governance of AI.


Sector: None (see reasoning)

The text discusses broadcasting regulations primarily focused on emergency operations, which indirectly pertains to government agencies and public services in managing broadcasts during emergencies. Nonetheless, it does not engage extensively with AI or its specific applications within governance or public services. Thus, the relevance is minor and does not strongly align with any of the predefined sectors.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily outlines standards and procedures related to laboratory controls and testing methods in various biological and chemical contexts. While it pertains to rigorous testing and control processes, it lacks explicit references to AI-related concepts. Thus, it's challenging to tie these standards to the categories associated with AI. This results in very low relevance for all categories. Specifically, it does not address societal impacts, data governance, system integrity, or robustness in a way that relates to AI developments or practices, as it focuses on laboratory standards rather than AI systems or applications.


Sector: None (see reasoning)

The text describes standard procedures in laboratory settings without any implications or direct connections to the aforementioned sectors. While it mentions procedures important for health diagnostics, there is no indication that these are related to AI applications within healthcare or any other specified sector. Hence, the scores remain very low across all sectors.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity (see reasoning)

The text discusses the procedures and requirements for income and eligibility determinations by state agencies in relation to automated systems and computer matching programs. It references the use of automated systems to verify applicant eligibility, which aligns with the Data Governance category given its focus on managing and securing data in automated processes. It also relates to System Integrity, as it mentions the need for independent verification of the data to ensure integrity and to avoid adverse actions based on potentially inaccurate information. However, it does not provide enough context regarding broader societal impacts or performance benchmarks to be relevant to the Social Impact or Robustness categories.


Sector:
Government Agencies and Public Services (see reasoning)

The text is primarily concerned with the operations of state agencies regarding eligibility determinations for social services. It does not specifically mention AI but implies the use of automated systems for data processing and management, which can be relevant to Government Agencies and Public Services. The mention of automated systems suggests operational efficiency in delivering public services, leaning towards moderate relevance for this sector, while other sectors like Healthcare or Private Enterprises do not apply to the text.


Keywords (occurrence): automated (3)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily outlines the procedures for handling unsolicited proposals within the Department of Energy (DOE) and does not explicitly discuss AI, algorithms, or any of the other relevant terms related to AI. While it may indirectly pertain to technological innovations in general, there is no specific mention or context that ties it to AI's social impact, data governance, system integrity, or robustness. Therefore, all categories receive low relevance scores.


Sector: None (see reasoning)

The text deals with agency procedures related to unsolicited proposals and does not address the specific application or regulation of AI in any contexts such as politics, public services, healthcare, etc. Since AI is not mentioned, and the focus is on proposal management rather than any relevant sector application, all sector scores are also low.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity (see reasoning)

The text discusses regulations concerning automated systems on vessels, particularly emphasizing safety measures and standards for automated vital systems. It highlights the importance of ensuring that automation does not compromise vessel safety. This relates closely to 'System Integrity,' as it addresses the need for oversight and safety controls in automated operations. 'Robustness' could also be relevant due to the safety evaluations of these systems, but without explicit mention of benchmarks or certifications, it is less relevant. 'Social Impact' and 'Data Governance' are less applicable, as the text does not address broader societal implications of AI, such as fairness, accountability, or data management issues. Overall, the text is most relevant to 'System Integrity' and carries some relevance to 'Robustness.'


Sector:
Government Agencies and Public Services (see reasoning)

The text pertains to automated vessel operation, a domain directly influenced by AI technology, and can be categorized under 'Government Agencies and Public Services' since it outlines regulations enforced by the Coast Guard under the Department of Homeland Security. This sector encompasses the use and regulation of technology to enhance public services. There is minimal relevance to other sectors such as 'Judicial System' or 'Healthcare.' As such, the most appropriate classification is 'Government Agencies and Public Services.'


Keywords (occurrence): automated (6)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text focuses on procedures and policies for electronic funds transfer (EFT) related to government contracts, with no explicit connection to AI technologies or implications. It discusses payment methods, mandatory submission of contractor information, and protocols for electronic transactions, but lacks any mention of AI, algorithms, machine learning, or related concepts. Consequently, it does not fit within any of the given categories, which focus on impacts, governance, integrity, or robustness of AI systems.


Sector: None (see reasoning)

The text primarily addresses payment infrastructure details without any reference to AI applications in any particular sector. It does not discuss political campaigns, government functions, legal frameworks, healthcare, business practices, educational contexts, international standards, nonprofit considerations, or emerging sectors. There is no indication that AI plays a part in the transmission, processing, or operation of the electronic funds transfers described, indicating a complete lack of relevance to the specified sectors.


Keywords (occurrence): automated (2)