4162 results:
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily concerns tariff regulations in the context of maritime transportation, covering procedural details about the publication of tariffs and interactions between the Federal Maritime Commission and carriers. The language does not discuss AI technologies or their impacts directly. As a result, relevance to the categories is minimal, and scores are low.
Sector: None (see reasoning)
The text does not address specific sectors such as politics, healthcare, or education, as it is focused on maritime commerce regulations. While the mention of automated systems hints at technological processes, it does not pertain specifically to AI applications within any of the given sectors, leading to very low relevance scores.
Keywords (occurrence): automated (2)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Data Governance
System Integrity
Data Robustness (see reasoning)
The text discusses requirements for periodically unattended machinery plants, emphasizing automation and self-regulating systems. This relates to AI systems insofar as they often embody automation principles, suggesting some relevance to all categories. However, the text lacks direct references to AI technologies or ethical considerations surrounding their impact, data management, or system integrity, limiting its applicability to broader discussions of AI governance. While automation plays a role in societal structures, the primary focus here is the safety and efficiency of machinery operation, leading to lower relevance scores for Social Impact and Robustness but a higher score for Data Governance due to its references to control and monitoring requirements.
Sector:
Government Agencies and Public Services (see reasoning)
The text does not directly reference specific sectors despite its details on operational requirements for machinery, which include automated controls. The focus remains technical rather than applied within a specific sector context such as healthcare or public safety. Thus, it scores lower in sector relevance, particularly as it does not specifically address the use of AI across defined economic or social sectors, but it shows some alignment with Government Agencies and Public Services due to the context of marine operations and safety regulations.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity (see reasoning)
The text primarily details requirements for machinery plants, particularly with respect to control, monitoring, and safety systems. While it mentions 'automated' systems, these references do not delve into the ethical or societal implications of AI technologies, which is critical for the Social Impact category. Regulatory measures find a stronger representation in System Integrity due to the focus on automation, monitoring, and emergency handling systems, but they do not explicitly address transparency or human oversight essential for this category. For Data Governance, there's no focus on data management, privacy, or biases in datasets, thus making it irrelevant. The text aligns moderately with the Robustness category due to its technical specifications that may imply performance benchmarks, but it lacks explicit mention of auditing or certifications, placing it at a lower relevance level.
Sector:
Government Agencies and Public Services (see reasoning)
The legislation describes the operational requirements for automated machinery plants, particularly in maritime contexts. This ties into Government Agencies and Public Services because it concerns regulations that govern safety and operational procedures, but it does not address other public service innovations involving AI. While it may touch on job safety and technical operational standards relevant to Private Enterprises, Labor, and Employment, the primary focus on machinery does not sufficiently connect to the employment impacts of AI technologies. Other sectors like Politics and Elections, Healthcare, and International Cooperation and Standards bear no explicit links to AI in this context.
Keywords (occurrence): automated (2)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Societal Impact
System Integrity
Data Robustness (see reasoning)
The text pertains to the requirements for automated systems in a maritime context, specifically looking at safety controls and the capabilities of these systems to replace personnel or reduce crew size. Due to the focus on safety, reliability, and operational considerations in automated environments, the following assessments can be made regarding social impact, data governance, system integrity, and robustness. The systematic oversight of automated vessels aligns closely with both the social impact and system integrity categories because it involves considerations of safety and accountability for decision-making systems that directly affect human life and vessel operations. However, it doesn’t explicitly address data governance or the need for new benchmarks for performance, which affects the robustness category.
Sector:
Government Agencies and Public Services
Private Enterprises, Labor, and Employment (see reasoning)
The text discusses regulations and standards for automated systems in a nautical context, which directly affect government operations and safety measures. This applies primarily to the 'Government Agencies and Public Services' sector since it deals with U.S. Coast Guard regulations. The relevance to 'Private Enterprises, Labor, and Employment' is also noted because the text speaks to the implications of automation for crew requirements in commercial shipping. However, as it does not touch on healthcare, the judiciary, or international cooperation, those sectors are rated as less relevant. Other sectors such as Politics and Elections, Academic and Research Institutions, Nonprofits and NGOs, and Hybrid, Emerging, and Unclassified also do not pertain to the content of this text.
Keywords (occurrence): automated (4)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
This text outlines regulations related to automated control systems, focusing on safety and operational integrity. However, there is no direct reference to AI technologies such as machine learning, deep learning, or algorithms that learn from data, which would indicate a relevance to the AI landscape. The text primarily discusses automation and the structural requirements for control systems without delving into AI or its implications. Given that terms like 'algorithm' or 'AI' are missing, the connection to the categories feels limited. Therefore, all category scores will be low.
Sector: None (see reasoning)
The content mentions automated systems in the context of control systems for vital operations, particularly in marine applications. It does not specifically identify AI's role in these systems but instead emphasizes safety protocols and manual overrides. Consequently, the connection to sectors focusing on AI's application in politics, public services, healthcare, etc., is minimal. Therefore, all sector scores reflect a lack of direct relevance to the text's content.
Keywords (occurrence): automated (3)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity
Data Robustness (see reasoning)
The text primarily deals with standards related to automation and alarm systems within marine environments. It touches on the design, operation, and verification of automated systems, which could relate to the robustness and integrity of such systems. However, it lacks specific references to AI or its associated technologies. The absence of terms like 'Artificial Intelligence', 'Machine Learning', or other specified keywords indicates that the legislation may not directly address the implications of AI, focusing instead on operational standards for automated systems whose technology does not explicitly involve AI.
Sector: None (see reasoning)
The text predominantly pertains to standards for automated vital systems and does not reference specific sectors such as healthcare, government services, or education directly. While it may apply to broader sectors related to marine operations and safety protocols, there is no explicit focus on the predefined sectors typically associated with AI legislation. Thus, it does not align strongly with any specific sector description.
Keywords (occurrence): automated (5)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity (see reasoning)
The text primarily discusses safety control systems and automatic alarms in the context of operational safety for maritime machinery. The relevance to AI is not direct, as the text does not specifically mention concepts or terminology related to Artificial Intelligence or machine learning, though it does cover automation and automatic systems, which are conceptually similar. Since it does not delve into AI specifics like algorithms or machine learning techniques, the focus appears to be on safety regulations rather than the implications of AI. Therefore, the relevance of this text to the broader implications of AI systems for social impact, data governance, system integrity, or robustness is limited. As a result, I rate the categories lower.
Sector: None (see reasoning)
The text operates within maritime safety regulations related to automation and alarm systems. While it does touch on control systems and their operation, it lacks direct relevance to any of the sectors like Politics and Elections or Healthcare, among others, as it doesn’t mention AI applications in these areas. The closest alignment might be with Government Agencies and Public Services due to its regulatory nature, but the mention of AI is tenuous at best. Hence, these scores reflect the limited applicability of the text to the sectors defined.
Keywords (occurrence): automated (1)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Societal Impact
System Integrity (see reasoning)
The text primarily discusses safety control systems in automated environments, particularly in maritime settings. It touches on aspects of self-certification, safety control, manual alternates, and the overall requirements for automated systems. The direct references to automation systems could suggest relevance to the Social Impact category, as there are implications for safety and potential impacts on individuals involved in these systems. However, the text does not focus explicitly on issues such as fairness, bias, or the social consequences of AI. Data Governance could be moderately relevant given the context of data management required for safety and compliance, but it lacks explicit references to data accuracy or privacy regulations. System Integrity is quite relevant as the text details the need for safe operational standards and control systems that ensure safety from failures, which speaks to the inherent security of AI systems. Lastly, the section on performance reaching new benchmarks could be linked to Robustness, but there’s insufficient emphasis on developing benchmarks in the context of AI. Thus, System Integrity scores the highest relevance among categories.
Sector:
Government Agencies and Public Services (see reasoning)
While the text itself doesn’t specifically address sectors such as politics or healthcare, the emphasis on automated safety systems may hint at implications for public services particularly in emergency management and safety recommendations. However, the details don't provide strong enough context to score highly in any of the provided sectors as they seem to be too generalized. Government Agencies and Public Services might be the closest relevant sector since it deals with regulations in the maritime context, but without more specific policies, it remains at a moderate relevance score. The text has little to no direct connection with Judicial System, Healthcare, or the other defined sectors. Thus, most sectors receive low scores due to the lack of specificity or strong relevance.
Keywords (occurrence): automated (4)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily contains information regarding automated control systems relevant to maritime safety. However, it does not directly address social implications of AI, data governance, system integrity, or robustness in the context of AI. While automation is discussed, the focus on manual control systems, environmental design standards, and safety does not explicitly link to social impact or the governance of data in AI systems. Therefore, the relevance across the categories is low.
Sector: None (see reasoning)
The text outlines requirements for control systems and automation within maritime operations but does not primarily focus on the defined sectors. It mentions aspects that could relate to government agencies and public services, particularly with respect to safety regulations, but the overall context is not specific to the use of AI or its regulation in key sectors such as politics, healthcare, or academic institutions. The connection to sectors like Private Enterprises or Nonprofits is also tenuous. Therefore, the relevance across sectors is fairly low.
Keywords (occurrence): automated (3)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily deals with safety control systems and automated requirements for specific machinery and systems in maritime applications, specifically the control of boilers and propulsion systems. The focus is more on operational safety and regulations rather than the social implications, data privacy, or performance benchmarks typically associated with AI discussions. The keywords associated with AI are absent, indicating that the text does not directly engage with AI systems or their societal impacts. As such, the categories do not align closely with the content presented in the text.
Sector: None (see reasoning)
The text references automated systems in the context of marine operations. However, it does not delve into the implications of AI usage in government operations or specific agencies, nor does it indicate a direct impact on public services. The primary focus is on safety requirements and procedural actions rather than legislation specifically directed towards AI in any sector. Thus, while 'Government Agencies and Public Services' may have some tangential relevance due to the mention of regulatory requirements, it remains minimal.
Keywords (occurrence): automated (4)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity (see reasoning)
The text primarily delineates technical requirements for 'Automated Vital Systems' concerning safety, reliability, and independence in operations. It discusses conditions under which these systems must operate, emphasizing failsafe mechanisms and the necessity for independent power sources. The critical aspects of system reliability discussed, pertaining to failures and operational independence, suggest relevance to the 'System Integrity' category, as they address the inherent security and oversight measures mandatory for these automated systems. However, the text does not explicitly relate to social impact, data governance, or robustness in AI performance benchmarks. Thus, less relevance is attributed to those categories.
Sector: None (see reasoning)
The text does not address specific sectors such as Politics, Government, or Healthcare. It indirectly touches on aspects relevant to 'Government Agencies and Public Services,' given that the context may apply to systems involved in public safety, yet it lacks direct references to these sectors. Overall, there is minimal information about the application of AI in the sectors distinctly outlined.
Keywords (occurrence): automated (5)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity
Data Robustness (see reasoning)
The text primarily focuses on the technical specifications and safety criteria related to automated vital systems, particularly within the context of vessel management. Although it references automated systems, it does not directly address their social impact, data governance, integrity, or challenges in performance assessment. It does, however, mention elements such as failsafe design, independent operation, and testing protocols, which point toward a concern for system integrity and the robustness of automated systems, albeit indirectly. As a result, I would regard the relevance of the categories as low overall, given the specific focus on technical operational standards rather than broader societal implications or governance issues.
Sector:
Government Agencies and Public Services (see reasoning)
The text mainly pertains to automated systems used in a specific context, which seems most relevant to government agencies and public services due to its focus on regulations surrounding the operation of automated systems in vessels. Although it lacks a direct connection to political systems or judicial applications, the emphasis on public safety and operational reliability connects it to public service delivery. The healthcare sector does not apply here, nor does the academic sector, as there’s no mention of educational institutions or healthcare research. Overall, the text is clearly linked to operational standards valuable for government agencies but less so for other sectors.
Keywords (occurrence): automated (5)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Data Governance
System Integrity
Data Robustness (see reasoning)
The text discusses automated systems, primarily focusing on safety, reliability, and testing protocols. However, it does not explicitly cover any societal impacts of AI or any direct influences these automated systems may have on individuals or broader societal dynamics. Therefore, the relevance to Social Impact is low. For Data Governance, while there are mentions of control systems and potential failures, it lacks specifics on data management practices or regulations that ensure data accuracy or privacy, leading to a moderate relevance. System Integrity is more relevant as the text outlines safety and operational standards for automated systems. It emphasizes necessary testing and independence of control systems, relevant to integrity in AI systems. Lastly, Robustness is also relevant as the text pertains to the reliability and failsafe features essential to assessing performance benchmarks of automated systems. Overall, the text aligns more closely with the System Integrity and Robustness categories, with lower relevance to others.
Sector:
Government Agencies and Public Services
Private Enterprises, Labor, and Employment (see reasoning)
The text relates primarily to safety and operational standards for automated systems, particularly in a nautical context under Coast Guard regulations. It does not directly address AI's use in Political processes, Government operations, the Judicial System, or Healthcare settings. However, it emphasizes the importance of safety protocols, which could be relevant to Government Agencies and Public Services where such automation may apply. Private Enterprises may similarly find relevance in these automated-system standards, as they may use such guidelines in their operational systems. The relevance to Academic Institutions could hinge on research emerging from the application of these standards. Overall, while it does not fit neatly into any single sector, it has potential intersectional relevance, primarily in Government and Private Enterprise contexts.
Keywords (occurrence): automated (5)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity (see reasoning)
The text primarily discusses the procedures for record management and training for railroad employees related to hours of service. There are references to automated systems and electronic records, which indicate a reliance on systematized processes but do not directly engage with AI technologies. Thus, while there is a mention of 'automated recordkeeping systems,' it does not invoke a strong connection to specific issues of social impact or the governance of AI as defined in the categories. Automation is addressed generally, and the text is more about compliance and data management than about contemporary AI challenges. Overall, the text lacks explicit relevance to AI in terms of its broader implications for society or governance, particularly regarding fairness, accountability, or integrity guidelines around AI-based systems.
Sector:
Government Agencies and Public Services (see reasoning)
The text does not mention the use of AI in any government services directly. It discusses the procedural aspects of maintaining records and training employees but does not explicitly involve AI applications within government operations or legislation affecting these sectors. The mention of an automated system for record-keeping could imply a primitive form of AI systems management, but it is not explicit or substantial enough to make a strong connection to the defined sectors. Therefore, it does not address sectors such as Politics and Elections, Government Agencies and Public Services, or other specified categories clearly related to AI.
Keywords (occurrence): automated (7)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
System Integrity (see reasoning)
The text outlines the procedures and requirements for automated recordkeeping systems in the rail industry, emphasizing system integrity, security measures, and proper access protocols. While there are mentions of automated systems, the primary focus is on the management and integrity of records rather than addressing the broader social implications of AI or establishing governance over data. Additionally, the discussion around access and auditing protocols is more aligned with the functionality and reliability of systems rather than performance benchmarks or robustness metrics. Given this context, the relevance of the categories is as follows: For Social Impact, there is minimal discussion about societal effects or consumer protections; for Data Governance, while records management is mentioned, the emphasis is primarily on access rather than governance over the data itself; for System Integrity, there is a significant focus on security and access control, indicating a high level of relevance; Robustness is somewhat less relevant, as it does not specifically outline benchmarks or standards for performance in AI systems. Overall, the System Integrity category is the most relevant based purely on the text's focus on secure and reliable procedures for recordkeeping which indirectly relates to the functioning of automated systems. Other categories hold less relevance, with Social Impact being the least relevant overall.
Sector:
Government Agencies and Public Services (see reasoning)
The text most directly relates to the sector of Government Agencies and Public Services through its regulations and requirements concerning automated recordkeeping in the rail industry, specifically outlining procedures that state inspectors must adhere to when accessing service records. Additionally, it could touch upon Judicial System elements through the requirement for audit trails and the integrity of records, although this is less explicit. There is limited relevance for other sectors, as the text does not address healthcare, private enterprises, or any educational contexts. Given this perspective, Government Agencies and Public Services receive the highest score, while others receive scores of 1 due to a lack of direct relation to their specific sector actions.
Keywords (occurrence): automated (4)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
Societal Impact
Data Governance
System Integrity (see reasoning)
The text primarily discusses automated recordkeeping in the context of railroads, which includes references to automation and data management systems. While it mentions criteria for an automated recordkeeping system, it lacks discussion of broader social issues, the integrity of AI systems, or the robust performance standards typically associated with AI legislation. Therefore, the category that best fits is Social Impact, which indirectly touches on the implications of automation for workers and recordkeeping practices. However, the core focus is on practical implementation rather than a socio-political critique of AI. Data Governance is also relevant due to the mention of secure recordkeeping and the management of employee data, but the text lacks emphasis on broader data governance principles. While there is some discussion of system security (related to System Integrity) and automated processes (reflecting Robustness), these topics are not elaborated in detail within the legislative context presented.
Sector:
Government Agencies and Public Services
Private Enterprises, Labor, and Employment (see reasoning)
The text is focused on railroad regulations and the implementation of automated systems for recordkeeping. It primarily concerns the operations of the railroad industry rather than broader applications of AI in sectors such as politics, healthcare, or judicial practices. However, it has implications for Government Agencies and Public Services by discussing requirements related to compliance and recordkeeping for a specific industry. The discussion on system security and data management slightly touches upon the usage of AI-like technologies in managing records, hinting at a relationship with Private Enterprises, Labor, and Employment, but this connection is not explicitly detailed in the text.
Keywords (occurrence): automated (6)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text does not explicitly discuss AI systems, algorithms, or related technologies. It focuses on procurement policies and general requirements for major systems, major real property systems, and research and development systems within the context of federal contracting. No sections indicate an impact of AI on society, data governance, system integrity, or performance benchmarks required for AI systems. Terms associated with AI, such as 'automated information systems,' are used, but without direct implications for AI technologies themselves. Therefore, the text is mostly focused on administrative and procedural aspects rather than the AI-related legislative concerns outlined in the categories.
Sector: None (see reasoning)
The text primarily deals with contracting policies and thresholds for major systems within the federal procurement process. There are no references to AI applications within specific sectors, such as politics, healthcare, or government services. It does not discuss any implications or regulatory frameworks concerning AI use or applications in these sectors. Hence, it is determined to be not relevant to any specific sector.
Keywords (occurrence): automated (2)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily discusses regulations surrounding the unattended operation of maritime communication systems, particularly those involving automatic transmitter operations and the technical specifications required for such operations. While it mentions automated functions, it does not explicitly discuss AI systems or their social implications, data governance, system integrity, or robustness in the context of AI. As a result, the relevance to AI categories is minimal. The focus remains on operational and technical standards rather than on the broader impacts or governance associated with AI technologies.
Sector: None (see reasoning)
The text addresses protocols related to maritime communication, indicating an automated use of transmitters and technical standards. However, it does not specifically pertain to any of the predefined sectors like politics, healthcare, or judicial systems. The mention of automated systems in maritime operations implies a degree of regulation but doesn't delve into how these systems might operate within the broader context of AI applications in sectors such as public services, government agencies, or enterprises. Hence, the relevance to specific sectors is very low.
Keywords (occurrence): automated (2)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The provided text focuses on standard operating procedures for the processing of special permits and approvals under the Hazardous Materials Regulations. It does not explicitly reference AI or related technologies such as algorithms or automation in significant ways. The mention of automated safety profile reviews suggests a connection to automated systems, but this is limited and primarily about improving efficiency in the permit evaluation process rather than establishing any concrete impact of AI technologies. Therefore, relevance within the context of the categories is minimal; overall, the legislation does not engage with substantive aspects of social impact, data governance, system integrity, or robustness related to AI.
Sector:
Government Agencies and Public Services (see reasoning)
The text is a procedural guideline for handling applications related to hazardous materials. It does not directly address any application of AI in specific sectors. While there might be a minimal link to government processes, the text focuses primarily on procedural matters without discussing the specific implications of AI technologies in sectors such as politics or healthcare. There is no mention of how AI may influence these processes or sectors, thus warranting low relevance across all assessed sectors.
Keywords (occurrence): automated (9)
Collection: Code of Federal Regulations
Status date: Oct. 1, 2023
Status: Issued
Source: Office of the Federal Register
The text primarily focuses on the administration and procedures of the Defense Acquisition Regulations, with no relevant references to AI technologies or applications. There are mentions of automated systems regarding financial and procurement processes, but these do not connect explicitly to AI concepts nor do they address the broad implications of AI as outlined in the provided categories. Therefore, all categories receive low relevance scores, as the text does not engage meaningfully with any aspect of AI legislation or its implications.
Sector: None (see reasoning)
The text does not touch on any specific sector that uses or regulates AI. It discusses procurement and regulatory processes but lacks any linkage to sectors such as healthcare, government services, or others where AI is actively regulated or applied. Therefore, all sectors are deemed not relevant to the contents of the text.
Keywords (occurrence): automated (1)