4160 results:


Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text discusses regulations concerning limited quantities of compressed gases and the associated packaging and labeling requirements. There is no explicit mention of AI, algorithms, machine learning, or similar topics that would link the document to the predefined categories such as Social Impact, Data Governance, System Integrity, or Robustness. While it mentions automated processes for pressure testing, it does not address the ethical or societal implications or standards typically associated with AI; the content is focused on material safety rather than AI-related legislation.


Sector: None (see reasoning)

The text does not relate to any of the defined sectors. It does not mention politics, government operations, healthcare, or other areas where AI is notably applied. The reference to automated testing processes connects only loosely to the broader concept of government regulation of technological processes and does not align with any defined sector closely enough to warrant a higher score; the main focus remains gas regulations rather than AI applications.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

This document primarily relates to procurement and payment processes in government contracts. It does not directly address the societal impacts, data governance, system integrity, or robustness of AI systems, as it lacks mention of specific AI-related technologies or implications. As such, the relevance to any of the categories is exceedingly low.


Sector: None (see reasoning)

The text discusses contracting and payment procedures within government contracts. While such transactions could involve automated systems or digital processes (such as electronic invoicing), the text does not address the application or regulation of AI within any sector, so relevance to all specified sectors is very low.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text does not contain any explicit references to Artificial Intelligence (AI) or related terms such as Algorithm, Machine Learning, or Neural Network. It primarily discusses safety procedures for railroad operations, particularly the management and safety of roadway work groups operating adjacent to tracks. Consequently, it has no relevance to the existing categories, which focus on the social impacts, data governance, system integrity, and robustness of AI systems.


Sector: None (see reasoning)

The text also does not address legislation or guidelines affecting the defined sectors, such as Politics and Elections, Government Agencies, or Healthcare. Its focus is solely on railroad safety regulations, with no mention of AI applications or regulation in these sectors. Therefore, it does not align with any of the defined sectors.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity (see reasoning)

The text outlines disclosure requirements for Medicare Advantage plans, focusing on how organizations must communicate coverage, benefits, and enrollee rights. While it does not explicitly mention AI, its relevance lies in how data, potentially processed by algorithms or AI systems, must be presented. The text implicitly touches on transparency and accountability, which are crucial for AI systems that collect and analyze data for medical or health-related decisions. It is therefore moderately relevant to System Integrity, given its emphasis on secure, transparent communication and oversight requirements. Social Impact is only slightly relevant, since the text concerns consumer protections but does not directly address the social consequences of AI. Data Governance is moderately relevant, since accuracy, bias, and privacy in data disclosures may involve AI applications. Robustness is less relevant, since the text does not address performance benchmarks for AI systems.


Sector:
Healthcare (see reasoning)

The text is specifically directed at Medicare Advantage plans, which places it squarely within Healthcare. Because it outlines disclosure and communication requirements critical for enrollees, it is highly relevant to the Healthcare sector. It does not address Politics and Elections, Government Agencies, the Judicial System, or any other listed sector, so Healthcare receives a high relevance score while the other sectors receive a score of 1 because they are not mentioned.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily centers on invoicing procedures and requirements for contractors working with the Environmental Protection Agency. Apart from a single mention of an automated process, there are no references to artificial intelligence or related concepts such as algorithms or machine learning, so relevance to the AI-related categories is minimal. The focus is on contract compliance and payment processing rather than the implications of AI or technology for society, data governance, system integrity, or AI robustness.


Sector: None (see reasoning)

While the text pertains to contracting and invoicing, it does not address the use or regulation of AI technologies within government operations or services. There are no discussions regarding political campaigns, judicial processes, healthcare applications, or employment matters that involve AI. The text purely deals with contract and invoicing matters without any direct impact on the listed sectors.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text focuses on regulations for the transport of hazardous materials, including the handling of and limits on such materials when carried by aircraft passengers or crew members. It does not explicitly address the social implications of AI or automated decision-making, the data governance of AI systems, or the integrity and robustness of AI technologies. Because the text lacks any direct AI context, all categories are evaluated as not relevant.


Sector: None (see reasoning)

The text pertains to aviation regulations for hazardous materials and does not relate to the use or regulation of AI in any of the specified sectors. Sectors such as politics and elections, government services, and healthcare are neither referenced nor implied, so none of the sectors are relevant.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text focuses primarily on fishing quotas and management practices under the Individual Transferable Quota (ITQ) Program without addressing AI or its implications in any form. There are no references to the specified AI-related terms and concepts, rendering the text largely irrelevant to the categories of Social Impact, Data Governance, System Integrity, and Robustness. As such, a score of 1 in all categories is justified, as there is simply no connection to artificial intelligence-related legislation or considerations.


Sector: None (see reasoning)

The text does not pertain to any of the predefined sectors of legislation regarding politics, governance, the judiciary, healthcare, business, education, international standards, nonprofit activities, or emerging sectors. It is centered entirely on fisheries management under specific regulatory frameworks. Thus, each sector score is 1, indicating no relevance.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text does not discuss AI technologies or their implications, so there is no relevance to the categories regarding social impact, data governance, system integrity, or robustness. It solely details regulations concerning individual fishing quota programs, with no connection to AI-related content.


Sector: None (see reasoning)

The text pertains exclusively to fishing regulations and quota management and does not connect to any of the sectors regarding politics, government, the judicial system, healthcare, business, academic institutions, international standards, nonprofits, or emerging sectors. The focus is on fisheries management, with no connection to AI applications or implications.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text largely discusses requirements and regulations for the provision of 911 services, particularly location accuracy and interconnection for CMRS (Commercial Mobile Radio Service) providers. There is no explicit mention of AI-related technologies or applications. The regulations concern telecommunications standards rather than AI, indicating low relevance to the categories of social impact, data governance, system integrity, and robustness. While the underlying technology could involve AI indirectly, the specifics provided do not pertain to AI's definitions or impacts, so the scores reflect a clear absence of AI relevance across the categories.


Sector: None (see reasoning)

The text focuses on regulations regarding 911 service access and requirements for mobile service providers. Although it involves public safety measures, it does not specifically address AI applications or implications within sectors such as politics, healthcare, or employment. Rather, it deals with the structure and performance standards necessary for emergency communication technology, thus warranting low relevance scores across the defined sectors.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text does not reference AI or any related terms that are within the provided keywords. As such, none of the categories are applicable to the content of the text. There are no mentions of social impact from AI, data governance pertaining to AI, system integrity issues, or benchmarks and robustness measures related to AI performance. Therefore, relevance is rated as Not relevant for all categories.


Sector: None (see reasoning)

The text does not discuss the use or impact of AI within any specified sectors. It primarily focuses on the Head Start program and related administrative criteria without reference to political processes, governmental use of AI, judicial applications, healthcare implications, business contexts involving AI, academic AI usage, international standards on AI, nonprofit engagement with AI, or emerging sectors. Thus, all sector relevance scores are rated as Not relevant.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity (see reasoning)

The text discusses information and information systems security requirements but does not explicitly address AI technology or the implications of AI systems for society, privacy, or operational integrity. It therefore does not fall under Social Impact, as there are no references to AI-related societal issues or effects. It partially touches on Data Governance through its regulations on the protection and management of sensitive data (e.g., PHI), which could include data used by AI, but the primary focus is not on AI-specific data governance principles. There are references to system integrity in terms of security controls and data management protocols, reflecting aspects of oversight, but no direct mention of AI performance or systemic controls. Lastly, the text does not engage with robustness or benchmarks specific to AI performance. Overall, while elements of the text are relevant to data protection and security, they do not correlate strongly with a legislative focus on AI systems.


Sector:
Government Agencies and Public Services (see reasoning)

The text applies largely to the management and regulation of information and information systems security for the Department of Veterans Affairs, which could relate to various sectors but does not specifically align with any sector focused on AI use. The references to privacy and data security are applicable to Government Agencies and Public Services but do not directly address the role of AI in those contexts. There is also some indirect relevance to the Healthcare sector because of the mention of protected health information (PHI) under HIPAA; however, the text does not discuss the use of AI technologies in healthcare practices. Because it addresses information handling in an administrative capacity rather than fitting cleanly into any single sector, its overall relevance remains limited.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text lays out regulations and procedures for broadcast services and uses terms relevant to automated systems rather than addressing the broader societal or ethical implications of AI. The presence of terms like 'Automatic transmission system' (ATS) indicates some technological content but does not meaningfully engage with the social impact, data governance, system integrity, or robustness frameworks for AI. These categories therefore receive low scores due to the lack of explicit connections to key AI considerations, with system integrity scoring slightly higher because of the mention of automatic systems.


Sector: None (see reasoning)

The text outlines FCC regulations for broadcast communications without addressing sectors that intersect with AI applications. While it mentions an 'Automatic transmission system', which could imply some automation in broadcast processes, the overall context does not include regulations or discussions specific to sectors such as politics, healthcare, or government services. Sector relevance therefore remains very low across the board.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily focuses on the Individual Fishing Quota (IFQ) program for Gulf groupers and tilefishes, detailing the requirements for participation, account management, allocation, and related operational aspects. While it mentions the use of electronic systems for managing IFQ accounts, it does not explicitly address themes related to AI, such as fairness, bias, system accountability, security, or performance benchmarks. Therefore, relevance to the defined categories is minimal.


Sector: None (see reasoning)

The text deals primarily with regulations around fishing quotas and does not touch on the use of AI or the specified sectors. It does mention the use of electronic systems but does not delve into applications that would involve AI's role in any of the defined sectors. Hence, relevance to the defined sectors is low across the board.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses technical requirements concerning the operation of U-NII devices, focusing on power spectral density, conducted output power, and operational limits within specific frequency bands. It does not address aspects concerning the social impact of AI, data governance, system integrity, or robustness as defined in the categories. The language used is highly technical and regulatory in nature, without any references to the societal implications or oversight of AI systems, nor does it discuss principles like data handling or integrity breaches. Given the lack of relevance to AI legislative concerns, the text scores 1 across all categories.


Sector: None (see reasoning)

This text outlines technical requirements regulated by the FCC for U-NII devices, focusing on parameters like power restrictions and emissions. It does not mention the application of AI across various sectors including politics, government, healthcare, or others in its descriptions or implications. Instead, it relates to telecommunications technology broadly. As such, the text scores 1 in terms of relevance to sectors, providing no insight into how AI may impact these areas or how it is being governed within them.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily deals with fishery management limits, allocations, and regulatory specifications related to the catch limits of specific species, which does not directly involve Artificial Intelligence (AI) or any related terminology. The focus is on biological assessments and fishing quotas rather than any social impacts, data governance, system integrity, or robustness in AI contexts. Therefore, the relevance of the text to AI-related legislative categories is minimal.


Sector: None (see reasoning)

The text pertains mainly to fishery management practices and regulations, addressing parameters set by fishery councils. It does not explicitly connect to any defined sectors related to the application and regulation of AI in areas such as government, healthcare, or international standards. The discussions do not touch on political activities, public services, healthcare, or judicial processes in relation to AI. It is purely regulatory regarding the seafood industry, making it irrelevant for the sectors being assessed.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

This text centers on regulations for closed captioning of video programming. It does not specifically discuss AI technologies or their applications, though it does mention 'automated software' for closed captioning creation in Section (e)(3). That single mention is not substantial enough to establish a significant connection to AI as defined by the category descriptions. Relevance to social impact, data governance, system integrity, and robustness is therefore limited, since those categories address broader AI frameworks rather than the narrow focus on captioning, and the text does not articulate any direct impact on AI-related social issues, data management, or system integrity concerns.


Sector: None (see reasoning)

The text primarily pertains to existing regulations on closed captioning for broadcast content, which do not fall under the specified sectors such as politics, government services, or health care. While automated processes are mentioned, the text does not explore AI in a legislative context; the regulations are operational and technical and do not discuss the integration of AI technologies into any sector. Relevance to the listed sectors is therefore minimal to non-existent.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily deals with exceptions to referral prohibitions, compensation arrangements, and financial relationships in the context of healthcare. It does not directly mention or pertain to artificial intelligence or related technologies such as algorithms, machine learning, or automation. As such, none of the categories, particularly social impact and data governance, are relevant. Although some aspects of the text could tangentially relate to system integrity and robustness in terms of operational procedures, there is no explicit mention of AI systems or their performance metrics. The relevance scores for all categories are therefore very low.


Sector:
Healthcare (see reasoning)

The text focuses solely on healthcare regulations regarding financial relationships and exceptions for referrals. While it discusses healthcare providers and institutional arrangements, it does not address or regulate the use of AI in healthcare, so its relevance to the specified sectors is limited. The Healthcare sector may receive a slight score for the general healthcare management context, but overall the document does not engage with AI applications in health settings or their implications for public services or governance structures.


Keywords (occurrence): automated (1)

Collection: Congressional Hearings
Status date: Oct. 19, 2023
Status: Issued
Source: House of Representatives

Category: None (see reasoning)

The text primarily discusses the enforcement of the Uyghur Forced Labor Prevention Act (UFLPA) by the Department of Homeland Security, which touches on critical issues around social implications and oversight. However, there is no mention of AI or associated technologies that would bring the document under the Social Impact, Data Governance, System Integrity, or Robustness categories. The focus is on legislative oversight and human rights rather than algorithmic processes or AI applications that could be governed or regulated, so the relevance of the AI-related categories is minimal.


Sector: None (see reasoning)

The text revolves around the government's approach to combating forced labor, particularly Uyghur rights and the enforcement measures associated with that legislation. Despite its significant sociopolitical implications, it does not address or regulate the use of AI in any sector, and it does not mention issues related to the legislative functions of the listed sectors. Because the discussion is disconnected from AI applications, its relevance to these sectors is negligible.


Keywords (occurrence): automated (1)

Collection: Congressional Record
Status date: Nov. 8, 2023
Status: Issued
Source: Congress

Category:
Social Impact (see reasoning)

The text contains a portion explicitly discussing 'Advances in Deepfake Technology', which falls under the Social Impact category because it concerns societal consequences such as misinformation and the psychological impacts of deepfakes. Relevance to Data Governance is more indirect: deepfake technology requires data management, but the text does not mention specific regulations on data collection or management processes. System Integrity is not directly referenced, though it could relate to the need for oversight of deepfake technology. Robustness is similarly less relevant, as the text does not explicitly discuss performance benchmarks for AI technologies or compliance verification. Overall, the social implications are the strongest link, particularly regarding the impact of deepfakes on trust and public perception.


Sector: None (see reasoning)

The text discusses deepfake technology in a context relevant to cybersecurity, which could impact various sectors. However, it does not directly address the use of AI in any particular sector extensively, limiting the relevance for specific sector classifications. Therefore, all sectors receive low scores since the focus is primarily on a technological advancement rather than application across sectors.


Keywords (occurrence): deepfake (2)

Collection: Congressional Hearings
Status date: Sept. 27, 2023
Status: Issued
Source: House of Representatives

Category:
Data Governance (see reasoning)

The text primarily discusses the role of science and technology at the Environmental Protection Agency (EPA) and its impact on regulatory and deregulatory decision-making. While it highlights the need for scientific integrity and the importance of solid data, it does not specifically address the broader implications of AI for society or its governance, and there are no direct references to AI keywords. The mention of data management and scientific integrity implies an indirect relationship to Data Governance, but the strongest ties are to science and technology broadly rather than AI-specific topics.


Sector:
Government Agencies and Public Services (see reasoning)

The text pertains to the regulatory activities of the Environmental Protection Agency, which falls under the Government Agencies and Public Services sector. It discusses the importance of accountability and of the scientific process in the EPA's work. Despite the lack of specific mentions of AI, the roles of technology and science in public services make it relevant to this sector, though it does not directly address AI applications.


Keywords (occurrence): automated (2)