Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses on-board diagnostics (OBD) for engines over 14,000 pounds GVWR, which concerns monitoring engine performance and emissions rather than direct applications or implications of AI. Although the monitoring systems described are loosely analogous to how AI could monitor or analyze data, the text does not engage with AI technologies or discuss their societal impacts, data governance, system integrity needs, or robustness standards. Because the content has no direct relevance to AI technology or the regulations that govern it, the scores for all categories reflect this lack of relevance.


Sector: None (see reasoning)

The text concerns regulations on on-board diagnostics in heavy-duty engines rather than legislation about AI applications across different sectors. The sector categories focus on the role and regulation of AI in areas such as politics, healthcare, and employment. There is no indication in the text that AI technologies are being used, evaluated, or legislated within the context of these regulations, so the relevance scores will also be minimal.


Keywords (occurrence): algorithm (3)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
Data Governance
System Integrity
Robustness (see reasoning)

The text discusses regulations for devices that use software algorithms for the optical measurement of heart and respiratory rates, highlighting the need for validation, verification, and cybersecurity measures. The references to software algorithms for medical measurements indicate relevance to both the Social Impact and Data Governance categories. The potential implications for patient safety and user error lend weight to Social Impact, while the necessity of data management (accuracy and bias) relates to Data Governance. The validation, verification, and cybersecurity requirements also support the System Integrity and Robustness classifications.


Sector:
Healthcare (see reasoning)

The text is primarily focused on healthcare regulations given that the devices discussed relate to medical measurements such as pulse and heart rates. The controls and software requirements, including performance testing and human factors assessments, show a direct application in healthcare settings. Although there are no explicit mentions of other sectors, the specific focus on the medical context of these devices solidifies their classification under Healthcare.


Keywords (occurrence): algorithm (2)

Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily focuses on definitions and acronyms relating to classified information security within the Department of Defense and does not directly address AI-related issues. Therefore, it is not relevant to the Social Impact, Data Governance, System Integrity, or Robustness categories. These categories are fundamentally concerned with AI's societal implications, data handling regulations, transparency in AI systems, and performance benchmarks, none of which are explicitly present in the text. Hence, all categories will score a 1 for 'Not relevant.'


Sector: None (see reasoning)

The text primarily deals with frameworks for safeguarding classified information and definitions related to various government and contractor roles. It does not specifically address how AI intersects with the sectors defined, including Politics and Elections, Government Agencies and Public Services, Judicial System, Healthcare, Private Enterprises, Labor and Employment, Academic and Research Institutions, International Cooperation and Standards, Nonprofits and NGOs, or Hybrid, Emerging, and Unclassified. As such, it receives a score of 1 for 'Not relevant' across all sectors.


Keywords (occurrence): automated (1)

Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity (see reasoning)

The text primarily focuses on metadata requirements for recordkeeping rather than discussing AI systems directly. However, it is relevant to the 'Data Governance' category because it specifies requirements for the collection and management of data, which could involve AI systems that process or digitize records. 'Robustness' could be considered somewhat relevant as well, since ensuring accuracy and consistency in metadata may align with performance benchmarks for any AI systems involved. The sections on human oversight and validation processes hint at 'System Integrity', though this concerns procedural adherence more than the systemic integrity of AI. There is no clear relevance to 'Social Impact'.


Sector:
Government Agencies and Public Services (see reasoning)

The text covers requirements that relate mainly to data management and digitization processes, which could potentially involve AI technologies in a government context. However, there is little direct reference to applications in specific sectors such as healthcare or the judiciary. The mention of records management aligns somewhat with 'Government Agencies and Public Services', but the text does not directly address AI applications in that sector. Thus, the relevance is only moderate.


Keywords (occurrence): algorithm (3)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text does not discuss any issues related to the social impact, data governance, system integrity, or robustness of AI systems, nor does it engage substantively with AI technologies such as machine learning or automated decision-making. It primarily focuses on regulations concerning dual trading of security futures products and related market behavior, which are unrelated to the impact of artificial intelligence or the governance of AI technologies.


Sector: None (see reasoning)

The text does not specifically mention or address AI applications within politics, government functions, legal proceedings, healthcare, business environments, academic contexts, international standards, or NGO involvement. It focuses solely on trading regulations and does not revolve around AI technology or its impacts on any of these sectors.


Keywords (occurrence): algorithm (3)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The provided text primarily deals with the regulation and management of Electronic Logging Devices (ELDs) used by motor carriers to track drivers' duty status. There is no substantive engagement with artificial intelligence, machine learning, or other AI-specific technology that would place this legislation within the social impact of AI, data governance pertaining to AI, system integrity relating to AI, or the robustness of AI systems. Therefore, the relevance of all the categories is minimal.


Sector: None (see reasoning)

The text focuses on the regulatory framework for electronic logging and data management within the transportation sector, particularly concerning drivers and motor carriers. It does not address AI applications, nor does it specify the use of AI in enhancing governmental operations or the involvement of AI in political contexts. Thus, it is not relevant to any of the specified sectors.


Keywords (occurrence): algorithm (2)

Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The provided text focuses on technical specifications and procedures for measurement instruments used in emission testing, emphasizing engineering standards and practices. There is no specific engagement with artificial intelligence or related technologies in the text. Therefore, it is unlikely to impact or relate to the categories of Social Impact, Data Governance, System Integrity, or Robustness, as these categories deal with the broader implications of and regulatory concerns about AI. Because the text does not engage with the defined AI-related concepts, I will assign low relevance scores to all categories.


Sector: None (see reasoning)

The text does not discuss applications related to any particular sector, including politics, healthcare, or public services. It is strictly focused on procedures and standards for measurement instruments in the context of emissions, without intersecting with the defined sectors such as Government Agencies, Healthcare, or Private Enterprises. Thus, it is devoid of direct relevance to the sectors listed.


Keywords (occurrence): algorithm (1)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
Data Governance
Robustness (see reasoning)

The text addresses a specific retinal diagnostic software device that incorporates an adaptive algorithm for evaluating ophthalmic images. This directly relates to the category of Robustness due to the emphasis on algorithm development and performance benchmarks. It also fits into Data Governance because it discusses software verification, validation documentation, and the management of clinical performance data, ensuring the data used in the software is accurate and reliable. Social Impact is relevant to some extent, as the text touches on the application of technology to improve health outcomes, indirectly addressing the societal benefit of advancing healthcare through AI-driven diagnostics. System Integrity is less relevant since the focus is more on software performance rather than on security or transparency. Overall, the text primarily emphasizes the technical specifications and requirements for the AI-driven software device in a healthcare context, which aligns it more closely with Robustness and Data Governance.


Sector:
Healthcare (see reasoning)

This legislative text directly pertains to the healthcare sector, as it describes a retinal diagnostic software device intended for use in medical diagnostics. It addresses clinical performance metrics, user training, and the necessary evaluations for the software's effectiveness and safety, all of which are central to healthcare. While the implications of AI in business and regulations may hold some relevance, the primary focus on a healthcare application firmly categorizes this text under Healthcare. Other sectors are less applicable as they involve different contexts unrelated to medical diagnostics or AI applications in healthcare settings.


Keywords (occurrence): algorithm (2)

Collection: Code of Federal Regulations
Status date: Jan. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text provided does not pertain to AI technology or its applications in any direct manner. It focuses on the structure of data files related to the FDIC's operations, specifically about account identifiers, holds on funds, and data transmission aspects. There are no references to AI-driven technologies, algorithms, or data management practices that would link the text to any of the predefined categories concerning Social Impact, Data Governance, System Integrity, or Robustness. Hence, all categories are scored as not relevant.


Sector: None (see reasoning)

The text does not address any specific sectors related to politics, government services, healthcare, private enterprises, academic research, international cooperation, non-profits, or emerging sectors in relation to AI. Instead, it deals with banking data structures and processes that are largely unrelated to these sectors, so all sector scores reflect a complete lack of relevance.


Keywords (occurrence): automated (4) algorithm (3)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text pertains to donor testing regulations under the FDA, specifically addressing procedures for testing human specimens for communicable diseases. It does not engage with AI or related technologies. As such, the categories of Social Impact, Data Governance, System Integrity, and Robustness are not applicable, since AI's influence on society, data management, system security, and performance benchmarks is not considered. The text may inform medical practices but does not involve any considerations about AI systems or their governance. Therefore, all categories are scored as 1: not relevant.


Sector: None (see reasoning)

The text outlines regulations for donor testing related to communicable diseases within healthcare. While this directly impacts healthcare practices, there is no mention of AI technologies influencing these regulations or their enforcement in any sector related to politics, government, or judicial systems. The text does not discuss AI implications for healthcare or any impact from AI technologies, leading to a score of 1 across the relevant sectors.


Keywords (occurrence): algorithm (2)

Collection: Code of Federal Regulations
Status date: Jan. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily focuses on the structure of the deposit-customer join file used by the FDIC, detailing the required fields and formats for customer and account information. It lacks explicit connections to the social impact considerations of AI, data governance specific to AI systems beyond general data management, system integrity in terms of AI applications or oversight, and robustness in terms of AI performance benchmarks. Therefore, while the text touches on general data management practices, it does not specifically address AI implications or governance and has minimal relevance to the AI categories.


Sector: None (see reasoning)

The text discusses file structures involving customer data management for the FDIC. However, it does not explicitly relate to the sectors outlined, as it does not address the application of AI in politics, government services, healthcare, or other specified sectors. A slight relevance exists under 'Government Agencies and Public Services' due to its association with the FDIC, although AI-specific applications are not present within this context.


Keywords (occurrence): algorithm (2)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity (see reasoning)

The text primarily discusses definitions relating to electronic prescriptions and controlled substances, emphasizing the framework around administering medications electronically. It does not explicitly address social impacts, governance of data, integrity of systems, or robustness benchmarks specific to AI. However, aspects such as biometric authentication and audit trails suggest minimal relevance to system integrity. Thus, while there are elements that could relate to AI, they do not sufficiently meet the broader legislative concerns of the categories, leading to lower scores across the board.


Sector:
Healthcare (see reasoning)

The text is largely regulatory, covering definitions and applications related to electronic prescriptions and controlled substances, with very limited direct relation to the defined sector categories. While it mentions technology systems involved in the prescription process, its clearest connection is to Healthcare through prescribing practice, and it does not comprehensively target any one sector such as Politics, Healthcare, or Government Services. Consequently, it receives scores suggesting minimal relevance.


Keywords (occurrence): automated (1) algorithm (2)

Collection: Code of Federal Regulations
Status date: Jan. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text does not contain any explicit references to artificial intelligence or related technologies. It primarily focuses on energy consumption testing procedures for freezers and dishwashers without addressing any implications or regulations connected to AI systems, their impact on society, data governance, integrity, or benchmarks related to their robustness. Therefore, the relevance of this text to the defined categories of Social Impact, Data Governance, System Integrity, and Robustness is minimal.


Sector: None (see reasoning)

The text similarly lacks references to AI applications or regulations within specific sectors. It does not pertain to political activities, government services, judicial elements, healthcare, or any other sector characterized in the predefined categories. The focus is strictly on energy efficiency measures for household appliances. Consequently, all sector-related scores are similarly very low.


Keywords (occurrence): algorithm (3)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
Data Governance
System Integrity
Robustness (see reasoning)

The text discusses a coronary vascular physiologic simulation software device that uses algorithms to simulate blood flow and other physiological metrics. Given the focus on the predictive capability of algorithms and the evaluation of their performance against clinical data, it is relevant to all four categories. Specifically, it addresses social impact in terms of potential user harm, data governance due to the reliance on accurate data, system integrity as it requires software validation and verification, and robustness in terms of establishing performance benchmarks for AI systems used in healthcare applications.


Sector:
Government Agencies and Public Services
Healthcare
Academic and Research Institutions (see reasoning)

The legislation outlined in the text is particularly focused on the use of AI within healthcare for coronary vascular simulations. It details how AI algorithms affect clinical assessments and diagnoses, underlining its importance to the healthcare sector. The strong connection to clinical decision-making and the intended use of this technology in medical settings correlates directly with healthcare legislation, while aspects of user evaluation and regulatory safety oversight could apply to government agencies and public services, and the reliance on evaluation against clinical data gives it some relevance to academic and research institutions.


Keywords (occurrence): algorithm (3)

Collection: Code of Federal Regulations
Status date: Jan. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance (see reasoning)

The text primarily focuses on the file structure for automated credit accounts linked to investment vehicles. While it mentions automated processes in the context of financial transactions, it does not explicitly address the social impacts of AI, AI-related data governance, the system integrity of AI operations, or robustness in the sense of AI performance metrics. The references to automated processes are administrative: the text emphasizes efficiency and data handling rather than the ethical or societal implications of AI systems, so its relevance to Data Governance rests only on that data-handling focus.


Sector:
Private Enterprises, Labor, and Employment (see reasoning)

The text relates to the financial sector, particularly in dealing with automated processes relevant to credit accounts and investment. However, it doesn't specifically deal with legislative aspects or the influence of AI on business practices. While data governance is implicated in the secure handling of financial data, the lack of explicit references to AI application or regulatory oversight leads to a lower score than might typically be expected in a more direct discussion about AI in finance.


Keywords (occurrence): automated (11) algorithm (2)

Collection: Code of Federal Regulations
Status date: July 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text focuses on general recordkeeping provisions primarily related to emissions data and operational parameters for affected sources. It does not specifically discuss AI technologies or their social impact, data governance practices, system integrity, or robustness. Therefore, I consider it not relevant to any of the defined AI categories.


Sector: None (see reasoning)

The content of the text is focused on environmental regulations, monitoring procedures, and data management for emissions. There are no references made to AI applications or sectors that directly involve AI. Consequently, the relevance across all sectors is negligible.


Keywords (occurrence): automated (1) algorithm (3)

Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses regulations related to encryption standards in communication systems as mandated by the Federal Communications Commission (FCC). It does not address the societal impacts or ethical implications of artificial intelligence, nor does it elaborate on the secure management of AI-related data, the integrity, or performance benchmarking of AI systems. Therefore, the categories of Social Impact, Data Governance, System Integrity, and Robustness are not relevant to the content presented in this text as it doesn't pertain to AI specifically.


Sector: None (see reasoning)

The content of the text does not specifically address the use or regulation of AI within any particular sector, including Politics and Elections, Government Agencies, Judiciary, Healthcare, Private Enterprises, Academic Institutions, International standards, Nonprofits, or any emerging sectors. It focuses solely on communication system standards and encryption directives. Thus, all sector categories receive the lowest score.


Keywords (occurrence): algorithm (1)

Collection: Code of Federal Regulations
Status date: Jan. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily concerns data structuring and management for customer accounts and does not discuss any elements directly related to AI or its implications on society, data management, system integrity, or robustness. Therefore, it does not directly pertain to any of the provided categories.


Sector: None (see reasoning)

The text focuses on the structure of customer data for the FDIC without mentioning the role or impact of AI in financial services, government services, or any other related sector. While it touches upon data management practices, it does not have enough relevance to be classified under any of the defined sectors.


Keywords (occurrence): algorithm (4)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text provided is a legal and regulatory document associated with the listing of impairments, primarily emphasizing the medical criteria used by the Social Security Administration. However, it does not specifically address or integrate aspects of AI technologies. There are no mentions of keywords such as 'artificial intelligence' or 'algorithm', or other AI-specific terms. Therefore, none of the AI-related categories has clear relevance, since the text focuses on traditional medical evaluation and legislative procedures for managing disability claims. As a result, each category will receive the lowest possible score, reflecting no relevance to AI-related issues.


Sector: None (see reasoning)

The document focuses heavily on medical impairments and disability adjudication, indicating it pertains to the healthcare sector. Even though it holds legal significance, it does not involve the regulation or use of AI within any sector, including healthcare, politics, or public services. Consequently, all sector scores will also be at the lowest level, as the document does not engage with AI applications or their implications.


Keywords (occurrence): automated (24)

Collection: Code of Federal Regulations
Status date: Jan. 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity (see reasoning)

The text focuses on requirements and protocols for the use of therapy-related computer systems in a medical context, and is particularly concerned with the acceptability and accuracy of treatment planning systems. The explicit mention of computer systems could indicate a relation to both System Integrity and Data Governance, as the accuracy of algorithmic processes and the protocols affecting medical outcomes are central to this context. However, the lack of references to broader societal impacts limits relevance to Social Impact. Additionally, while the focus on protocols implies concerns with system adherence and reliability, it does not extend to performance benchmarks, which limits its relevance to Robustness.


Sector:
Healthcare (see reasoning)

The text is primarily concerned with medical applications of computer systems used in therapy, suggesting a strong relevance to the healthcare sector. The guidelines for treatment planning systems and requirements for qualification of medical personnel are crucial in a healthcare context. While there could be a slight nod to the role of governmental regulation in overseeing these standards, it doesn't explicitly connect to government agencies and public services beyond regulatory concerns for medical practice. Other sectors, such as politics, the judicial system, private enterprises, and academic institutions, are not directly referenced, limiting their relevance.


Keywords (occurrence): algorithm (1)