4162 results:


Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity (see reasoning)

The text primarily focuses on automated payment processes such as statement processing and the Automated Clearinghouse (ACH), with no explicit mention of or connection to the impact of AI on society as defined in the Social Impact category. The data governance aspect is also not directly touched upon, as the text deals mainly with payment processing and interactions with U.S. Customs and Border Protection. There are elements that mention the verification of payment information, but they do not directly address the governance of data within AI systems. System Integrity concerns are relevant due to the mention of electronic payment systems and the need for accurate information in payments; however, this aspect is specific to process integrity rather than holistic AI system security or transparency. Robustness is the least relevant here, as the legislation does not discuss benchmarks or performance standards for AI systems. Overall, the text does not deeply engage with any of these categories but touches lightly on System Integrity due to its focus on electronic transaction processes, which is why that category is selected.


Sector:
Government Agencies and Public Services (see reasoning)

The text relates to Government Agencies and Public Services as it deals with processes involving U.S. Customs and Border Protection and the financial transactions associated with them. While the payment mechanism could be argued to improve efficiency in public service delivery, the text does not highlight AI applications specifically in this context. Other sectors, such as Politics and Elections; Judicial System; Healthcare; Private Enterprises, Labor, and Employment; Academic and Research Institutions; International Cooperation and Standards; Nonprofits and NGOs; and Hybrid, Emerging, and Unclassified, are not mentioned or implied, making them irrelevant here. The text is primarily focused on electronic payment procedures without greater context on AI applications within these areas. Thus, Government Agencies and Public Services is selected based on the mention of CBP processes, though it is not strongly supported by explicit AI context.


Keywords (occurrence): automated (6)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses the regulatory requirements and classifications related to automated external defibrillators (AEDs) and other medical devices. It does mention automation in the context of AEDs, which interpret ECG data and automatically deliver shocks. However, the focus is more on regulatory compliance and device classification than on the broader social implications of AI, data management issues, system integrity concerns, or robustness assessment. The relevance to Social Impact is therefore minimal, as the text does not consider the societal effects of deploying such devices but instead focuses on operational specifications. In terms of Data Governance, while there is a reference to data interpretation, the text does not delve into the management or governance of data related to these devices. Regarding System Integrity and Robustness, the references to performance standards are not specific to certifications or audits relevant to AI systems, leading to limited relevance across all categories.


Sector:
Healthcare (see reasoning)

The text details regulations concerning the use of automated external defibrillators (AEDs) but does not explicitly discuss their role in politics, judicial processes, public service enhancement, healthcare management, workplace applications, academic institutions, or international cooperation. However, because it pertains to medical devices, it has limited relevance to the healthcare sector where AI applications might be increasingly integrated into emergency medical systems. The mentions of potential data processing point somewhat towards healthcare but do not strongly anchor the text in that space. Thus, it has some relevance to healthcare due to the medical nature of the devices but lacks a broader application to other sectors.


Keywords (occurrence): automated (4)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text discusses various automated medical devices used in transfusion and blood testing processes. While the mention of automation implies the integration of technology, it does not specifically address the ethical, social, or governance aspects that are critical to the categories of Social Impact, Data Governance, System Integrity, and Robustness. Therefore, no category captures the essence of this text particularly well. The relevance to these categories is low.


Sector:
Healthcare (see reasoning)

The text primarily relates to healthcare as it describes automated devices used in medical diagnostics and blood collection. The scope of the legislation involves medical technology that could potentially leverage AI for efficiency but does not explicitly mention or elaborate on AI technologies. Nevertheless, the applicability to healthcare practices is direct, making it the most relevant sector. Other sectors do not align with the content of the text.


Keywords (occurrence): automated (7)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
System Integrity (see reasoning)

The text primarily discusses medical devices and their classifications, specifically focusing on automated blood cell separators and related technologies. The relevance of each category can be analyzed as follows. Social Impact: Given that the legislation outlines automated blood cell separators, it indirectly relates to the social implications of healthcare technology, especially in terms of life-saving procedures. However, it does not directly address issues of societal impact, fairness, or psychological harm, leading to a score of 3. Data Governance: The text touches on the classification of medical devices, which requires accurate data management but lacks explicit details on data governance for the AI components involved in automation, yielding a relevance score of 2. System Integrity: The mention of a controlled classification for the automated blood cell separator device correlates with maintaining system integrity; however, the text does not provide direct measures for transparency or security, leading to a score of 3. Robustness: The text does not address the performance standards or benchmarks specific to AI in the context of automated medical devices, so the relevance is limited, scoring a 2.


Sector:
Healthcare (see reasoning)

The text discusses automated blood cell separators, which have direct applications in the healthcare field. This aligns with the Healthcare sector as it deals with devices used in diagnostics and blood component separation. Other sectors such as Politics and Elections, Government Agencies and Public Services, Judicial System, Private Enterprises, Labor, and Employment, Academic and Research Institutions, International Cooperation and Standards, Nonprofits and NGOs, and Hybrid, Emerging, and Unclassified do not apply directly or at all to this specific text since it primarily focuses on medical devices. Therefore, Healthcare receives a score of 5, while all others score 1, indicating no relevance.


Keywords (occurrence): automated (7)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses automated medical devices used for blood analysis. There is no explicit mention of AI technologies or their implications in the context of social impact, data governance, system integrity, or robustness. While the term 'automated' is present, it refers to mechanized processes rather than any intelligent system that uses AI. Thus, none of the categories are relevant in this context.


Sector: None (see reasoning)

The text details the classification and description of various automated medical devices rather than discussing the application of AI in a relevant sector. There is no indication that AI plays a role in these devices or their regulatory framework. Therefore, none of the sectors associated with AI development and application, including Politics and Elections as well as Healthcare, apply here, since the focus is solely on automated devices without any AI context.


Keywords (occurrence): automated (8)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance (see reasoning)

The text discusses automated platelet aggregation systems and other related machines used in medical diagnostics. Specifically, the mention of automated systems for platelet aggregation and sedimentation indicates the role of technology in enhancing healthcare. However, the text does not delve into broader implications, accountability, or systemic issues that AI might introduce. Thus, while it is relevant to healthcare, the focus is more on devices than on the social impact or governance of AI in healthcare contexts, and the relevance scores are somewhat mixed. The relevance to Social Impact is low because there is no mention of societal issues or regulations regarding these technologies. Data Governance is moderate due to the need for secure and accurate data collection in healthcare, but the text does not directly address these issues. System Integrity is slightly relevant because there is an implied need for control and security in automated devices, but specifics are lacking. Lastly, Robustness is slightly relevant, as the text confines itself to performance standards for devices and lacks a detailed discussion of benchmarks or regulatory compliance for AI technologies in this context.


Sector:
Healthcare (see reasoning)

The text primarily references automated devices used in medical diagnostics and treatment monitoring within the healthcare sector, indicating a significant relationship with AI applications in this area. Although it discusses automated processes, it lacks the direct mention of AI technologies such as machine learning or neural networks, which would strengthen its connection to healthcare AI regulations. However, it fits into the Healthcare sector due to its discussions of automated devices for blood tests and related diagnostics. Other sectors such as Politics and Elections, Government Agencies and Public Services, Judicial System, Private Enterprises, Labor and Employment, Academic and Research Institutions, International Cooperation and Standards, Nonprofits and NGOs, and Hybrid, Emerging, and Unclassified either do not apply directly or lack context in the given text.


Keywords (occurrence): automated (8)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
System Integrity (see reasoning)

The text mostly details automated medical devices, including automated processes for cell counting and analysis in the healthcare field. It specifically mentions several automated devices, such as automated cell counters and automated differential cell counters, which use various methods for blood analysis. Given this context, the relevance of each category can be evaluated as follows:
1. Social Impact: Moderately relevant (3). While the document discusses automated devices that have implications for healthcare quality, disparities, and potentially for patients' psychological and physical effects, it does not delve deeply into these aspects, which would be needed for higher relevance.
2. Data Governance: Slightly relevant (2). The document touches on devices that collect patient data but lacks an explicit focus on legislation related to data management, privacy concerns, or biases within AI datasets.
3. System Integrity: Moderately relevant (3). The text mentions performance standards and the classification of medical devices, suggesting a level of oversight and control over these systems. However, there is no strong emphasis on security measures or specific oversight frameworks, which lowers the overall relevance.
4. Robustness: Very low relevance (1). Although the devices must meet performance standards, there is no mention of new benchmarks, regulatory compliance, or auditing processes as outlined in the definition of robustness, so the text does not meet this category's criteria.


Sector:
Healthcare (see reasoning)

The text primarily pertains to automated devices used in healthcare, particularly in laboratory settings for blood analysis. The relevance of each sector can be evaluated as follows:
1. Politics and Elections: Not relevant (1). The text does not address political applications or implications of AI.
2. Government Agencies and Public Services: Slightly relevant (2). While it discusses devices regulated by the FDA, it does not explore broader implications for government services beyond regulation and oversight of medical devices.
3. Judicial System: Not relevant (1). There is no mention of AI use in judicial contexts or legislation.
4. Healthcare: Extremely relevant (5). The text is centered on automated medical devices utilized in healthcare settings, directly aligning with this sector's focus.
5. Private Enterprises, Labor, and Employment: Slightly relevant (2). If the devices are considered in the context of medical device manufacturing and labor, there is a marginal link, but it is not explicitly discussed.
6. Academic and Research Institutions: Not relevant (1). There are no references to education, research, or academic studies.
7. International Cooperation and Standards: Not relevant (1). The document does not discuss international standards or cooperation relevant to AI.
8. Nonprofits and NGOs: Not relevant (1). There is no mention of AI applications within nonprofit contexts.
9. Hybrid, Emerging, and Unclassified: Not relevant (1). The text does not present innovative or hybrid solutions that would fit this category.


Keywords (occurrence): automated (25)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text focuses primarily on automated devices used in hematology, particularly in counting and classifying blood cells. However, while it mentions terms like 'automated differential cell counter,' it does not deeply delve into broader societal impacts, data governance challenges, system integrity needs, or robustness considerations specifically related to AI in the context of legislation. The use of the term 'automated' suggests a level of AI engagement, but without explicit discussion of accountability, ethical implications, or benchmarks for performance, its relevance to comprehensive AI governance categories is limited.


Sector:
Healthcare (see reasoning)

The text pertains to the automated and semi-automated devices used in medical settings, specifically in terms of blood analysis. Its relevance to the healthcare sector is strong, but it lacks specificity regarding governmental regulations, ethical considerations, or operational impact. While it may imply usage in healthcare settings, it does not elaborate on the implications of AI technologies in these settings beyond the mechanical functions of the devices themselves. Therefore, it holds a noteworthy connection to healthcare but does not address broader sector-specific issues.


Keywords (occurrence): automated (14)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
Data Governance
System Integrity
Robustness (see reasoning)

The text pertains primarily to the regulation and oversight of medical devices specifically designed for glycemic control, which may entail the use of AI technologies for managing insulin delivery and glucose monitoring. It emphasizes performance verification, secure data transmission, and safety, aligning with concerns around the social impact of AI in health settings, maintaining system integrity in critical applications, and the governance of data used by AI systems. Notably, it discusses measures that ensure devices operate safely and effectively, reflecting strong relevance to robustness and system integrity. However, the focus on technical specifications and performance oversight reveals a lesser emphasis on broader societal concerns, leading to a high but not absolute relevance for social impact.


Sector:
Healthcare
Academic and Research Institutions (see reasoning)

The text is highly relevant to the Healthcare sector since it concerns devices that monitor and control glycemic levels, which directly impact patient health outcomes. The discussion includes evaluation of clinical performance and usability studies, emphasizing the application's critical role in health management. The safety and validation aspects of AI simulations in medical settings also link the text to AI's role in healthcare. Other sectors are not as relevant because the text focuses narrowly on medical devices rather than broader implications for sectors like politics or labor.


Keywords (occurrence): automated (3)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
System Integrity
Robustness (see reasoning)

The text predominantly discusses the requirements for automated trade surveillance systems in the context of designated contract markets, focusing on compliance and monitoring responsibilities within a regulatory framework. It highlights the necessity of such systems to detect and investigate violations in trading practices, thereby connecting closely to the integrity, accountability, and responsibility concerning AI use in automated systems. The references to automated systems, monitoring capabilities, and compliance staff highlight the social impact in terms of accountability and the need for ethical AI use in financial transactions. However, the document focuses more on systems integration and compliance rather than broader social implications or ethical considerations, thus placing more weight on System Integrity and Robustness. AI automation's role is a core element, but the wider implications on society as a whole are less pronounced in this specific context.


Sector:
Government Agencies and Public Services (see reasoning)

The text's content is primarily concerned with the regulatory implications of AI-driven automated trade surveillance in financial markets. It outlines the compliance standards and operational frameworks that market regulators (designated contract markets) must adhere to, which relates directly to Government Agencies and Public Services as it affects how these agencies enforce regulations. The emphasis on compliance and the effectiveness of automated systems makes it particularly relevant to those working within government securities and market practices. The other sectors, while they may touch upon AI applications in wider contexts, are less relevant here, as financial regulation is the primary focus.


Keywords (occurrence): automated (5)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
System Integrity
Robustness (see reasoning)

The text discusses provisions surrounding automated vessels and the use of automated equipment in longshore work. It specifically mentions the operation of automated self-unloading conveyor-belt and vacuum-actuated systems, which relates directly to the automation aspect of AI. However, there is no mention of AI systems themselves, such as machine learning or deep learning technologies, being applied to these processes. The discussion is closely tied to employment regulations and practices rather than AI's broader societal implications or operational integrity needs. Thus, Social Impact and System Integrity are relevant but not prominently so, while Robustness aligns slightly because the text implies benchmarks or standards for practice but does not explicitly cover performance metrics for AI systems. Data Governance appears minimally relevant: attestation requirements imply some oversight, but no extensive data management framework is explicitly detailed. Overall, the relevance of each category is moderate, leaning towards Social Impact due to the implications for labor practices and employment legislation.


Sector:
Government Agencies and Public Services
Private Enterprises, Labor, and Employment (see reasoning)

The text is primarily focused on issues surrounding the use of automated vessels in longshore work. This has significant implications for the maritime sector within labor regulations and practices. The automated vessel sections are not directly tied to political campaigns, healthcare, or nonprofits but instead align with the intersection of employment practices with emerging technologies in public services. The use of AI technology in employment settings makes the text relevant in Private Enterprises, Labor, and Employment, while also marginally touching on Government Agencies and Public Services due to regulatory frameworks around these practices. Thus, those sectors are rated more prominently, while other sectors like Judicial System and Academic Institutions appear irrelevant in this context.


Keywords (occurrence): automated (13)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity
Robustness (see reasoning)

The text predominantly discusses automated devices used in the analysis and processing of blood samples, particularly those that may operate using AI technologies or algorithms for counting and classifying blood cells. This directly relates to the category of 'Robustness' as it may touch upon standards for device performance, which could involve AI benchmarks. The 'System Integrity' category is also relevant as these devices might require human oversight, security protocols, and assurance of performance integrity to ensure safety and accuracy in medical contexts. 'Data Governance' is relevant because the performance and reliability of these AI-driven devices depend heavily on the accurate handling of medical data, ensuring biases and inaccuracies are managed. However, there is no explicit mention of societal impacts or legislative measures to guard against AI-driven discrimination, hence 'Social Impact' is less relevant. Overall, 'Robustness' and 'System Integrity' received higher relevance scores than the others.


Sector:
Healthcare (see reasoning)

The text mainly involves AI-related devices used in healthcare settings for analyzing blood samples. The discussion of automated blood cell diluting apparatus and related laboratory devices is clearly within the sphere of healthcare technology. Therefore, the 'Healthcare' sector is very relevant (5), as these devices play a crucial role in diagnostics and medical procedures. There are no mentions directly related to politics, the judicial system, government operations, academic research, international standards, or nonprofits, hence the scores for those sectors are low or not applicable. The application of automation in healthcare is emphasized; some relevance to public services that deliver healthcare could be acknowledged as well, but the primary focus remains firmly within healthcare.


Keywords (occurrence): automated (14)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text refers to an 'automated medium dispensing and stacking device' which indicates a level of automation and potentially implies the use of algorithms in its operation. However, it does not explicitly discuss the social impact of such automation or offer commentary on issues like accountability, bias, or consumer protection, which are essential elements of the Social Impact category. Likewise, while the use of automation is mentioned, the focus seems more on device specifications and regulations rather than data governance issues such as data accuracy or privacy. There is no clear discussion on system integrity regarding human oversight or security measures that would fit the System Integrity category. The text does not mention benchmarks for performance or regulatory compliance; thus, it lacks the necessary factors to support Robustness. Overall, the legislation primarily deals with the classification and operational framework for medical devices rather than directly addressing the categories provided.


Sector: None (see reasoning)

While the text relates to medical devices used in healthcare, it does not specifically address legislation or discussion surrounding the use of AI in healthcare. The focus is primarily on definitions, classifications, and regulations of various medical devices rather than their integration with AI technologies, which would be required for scoring in the Healthcare sector. Furthermore, the text does not delve into AI's implications for labor, public services, election processes, judiciary, academia, or any emerging sectors. Therefore, it does not meet the criteria to be relevant to any of the specified sectors.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity (see reasoning)

The text explicitly details the management and security of records systems, which includes both manual and automated records. Although the terms 'automated' and 'automated systems' are used, there is no direct mention of AI technologies or related terminologies such as machine learning, algorithms, or neural networks. The focus is primarily on data security, privacy, and access control, which pertains to data governance rather than machine intelligence or AI's societal implications. Therefore, while some principles might overlap with the broader impacts of AI, the text does not specifically engage with AI as a technology.


Sector:
Government Agencies and Public Services (see reasoning)

The text discusses security measures regarding personal records that can be automated, implying some use of technology in handling data. However, it does not delve into sector-specific applications such as healthcare or education. Its relevance aligns more with the governance of data privacy and security, corresponding to the Data Governance category, since it encompasses the management of personal data within automated systems. The emphasis on governmental oversight of data security extends its relevance to government agencies, but its application is more general than specific to government operations.


Keywords (occurrence): automated (4)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses the specifics of medical device regulation, focusing on an intravascular administration set with automated air removal capabilities. While some references to automation (like 'automated air removal system') could imply aspects of AI in decision-making or operation, there is no explicit mention of AI technologies or methodologies such as machine learning or algorithms. The text does not directly address social impact, data governance, system integrity, or robustness as they pertain to AI, and it is therefore not considered relevant to these AI categories.


Sector: None (see reasoning)

The text pertains specifically to medical devices regulated by the FDA. It does not directly address AI applications in healthcare beyond mentioning automation in the context of an intravascular device. Because of the lack of focus on AI's role in healthcare systems or clinical decision-making processes, it does not fit within the predefined sectors covering politics, government, the judiciary, healthcare, private enterprises, academia, international cooperation, or nonprofits and NGOs. It is mostly technical and regulatory detail about a specific medical device, so low relevance scores are assigned.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The text primarily discusses the classification of various medical devices, including an automated cell-washing centrifuge for immuno-hematology. While automation is mentioned in terms of device functionality, it does not explore broader social implications, regulatory aspects surrounding fairness or bias in AI systems, robust governance of data, or integrity of systems as defined by the categories. Therefore, relevance is low for all categories.


Sector: None (see reasoning)

The text discusses the classification of medical devices, particularly automated devices, but lacks details on the regulation or application of AI specifically in health settings. The focus is rather on device approvals rather than AI implications in healthcare. Thus, relevance for each sector is limited.


Keywords (occurrence): automated (7)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Data Governance
System Integrity (see reasoning)

The text pertains primarily to the classification and regulation of an automated blood grouping and antibody test system. While it references software performance and verification, it does not explicitly discuss the social impact of AI, data governance, system integrity, or robustness in detail. The passage focuses more on technical classifications and specifications, which do not directly concern AI's societal implications, data handling, or performance benchmarking, leading to a moderate level of relevance in these categories.


Sector:
Healthcare (see reasoning)

The text mainly focuses on medical devices, specifically in the context of blood grouping and testing. Given that the automated system is applied in a healthcare context and involves aspects of software verification and validation related to patient safety, it shows a significantly relevant connection to the Healthcare sector. Other sectors, such as Politics and Elections, Nonprofits and NGOs, and others, do not find direct relevance in this context.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
Social Impact
Data Governance (see reasoning)

The text includes descriptions of automated fluorescence in situ hybridization (FISH) enumeration systems and other related automated diagnostic tools. It explicitly references automated processes and customized software specifically for FISH assays, which falls under AI's domain in healthcare diagnostics. This indicates a significant relevance to both the social impact of AI—specifically in clinical settings and its implications—and the governance of data within such systems. It does not, however, directly specify standards or benchmarks for robustness or system integrity in a detailed manner, limiting its relevance in those areas.


Sector:
Healthcare
Academic and Research Institutions (see reasoning)

The text specifically pertains to the use of AI systems within healthcare through the automation of diagnostic processes such as FISH assays. It concerns how the correct and safe use of these systems is ensured. Although some algorithms and software used for image analysis and result generation are mentioned, the legal, policy, or compliance implications of using these systems are not examined in detail. The application is therefore considered more clearly defined and relevant in the Healthcare sector than in other sectors.


Keywords (occurrence): automated (6)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category: None (see reasoning)

The provided text discusses a fully automated antimicrobial susceptibility system used in clinical diagnostics that incorporates technology to assess bacterial susceptibility. This document is quite specifically about medical technology rather than broader implications of AI systems. Each section outlines the performance characteristics and regulatory requirements of devices used for antimicrobial testing; however, it does not engage with legislation about the impacts of AI practices, data governance or security surrounding these systems' algorithms or decision-making. Therefore, the relevance of this text to the 'Social Impact,' 'Data Governance,' 'System Integrity,' and 'Robustness' categories is very low.


Sector:
Healthcare (see reasoning)

This text is highly relevant to the Healthcare sector since it deals with a medical device designed to detect antimicrobial susceptibility. It outlines regulatory requirements and performance metrics related to clinical settings, which is predominant in healthcare discussions. However, while it touches on aspects of technology aligned with AI or automated systems, it primarily does not discuss AI in a direct regulatory or legislative context. Thus, the scoring is skewed toward relevance to healthcare over AI-specific legislation.


Keywords (occurrence): automated (2)

Collection: Code of Federal Regulations
Status date: April 1, 2023
Status: Issued
Source: Office of the Federal Register

Category:
System Integrity (see reasoning)

The text primarily outlines requirements for automated, mechanical, or electronic equipment concerning the manufacturing, packaging, and handling of dietary supplements. It details procedures for the proper calibration, maintenance, and operational oversight of these systems, but does not explicitly address the social implications or regulatory aspects of AI technologies. Thus, while there is relevance to automation, this text does not address broader social impact issues, data governance, system integrity, or robustness in the context of AI as defined in the category descriptions. It talks about the operational and sanitary standards of equipment rather than data management or public concerns, leading to lower relevance scores in the categories.


Sector:
Healthcare (see reasoning)

The text pertains mainly to the operational protocols regarding automated equipment used in the manufacturing of dietary supplements. While it discusses control measures that could apply to AI systems (such as the need for quality control and regular checks), it does not explicitly address specific sectors like politics, healthcare, or any of the other defined sectors. System Integrity stands out because the text addresses the safety and functionality of these automated systems, which is vital for ensuring compliance with operational procedures in manufacturing. Other sectors such as academia, nonprofits, or international standards are not mentioned, leading to minimal relevance across most sectors.


Keywords (occurrence): automated (4)