In Short
USCIS evaluates O-1A and EB-1A petitions using a criteria-based framework that includes a threshold evaluation of evidence, followed by a final merits determination.
USCIS policy guidance in recent years has clarified that STEM professionals, including AI researchers and open-source contributors, may present non-traditional evidence such as published code, model cards, benchmark results, and peer-reviewed contributions.
Well-organized evidence portfolios structured around the regulatory criteria tend to reduce Requests for Evidence (RFEs) and streamline preparation timelines.
Artificial intelligence and open-source ecosystems have become central to how extraordinary ability is demonstrated in 2026. Candidates working on large language models, machine learning infrastructure, or widely adopted open-source projects often hold achievements that do not fit the traditional mold of journal publications or patents. The regulatory criteria, however, remain the same as when they were drafted: original contributions of major significance, scholarly articles, judging the work of others, and so on. The practical challenge is translating modern forms of technical output such as GitHub repositories, model releases, benchmark results, and conference workshops into a structured evidence portfolio that maps cleanly onto the existing framework.
This article outlines how USCIS typically evaluates such evidence in 2026, what documentation is commonly submitted, and how a technology-assisted evidence coordination process can help keep everything aligned. It is descriptive, not prescriptive: decisions about legal strategy and eligibility belong to independent licensed immigration attorneys.
The Regulatory Framework: What Has and Has Not Changed
The statutory basis for extraordinary ability petitions has not changed. For O-1A nonimmigrant petitions, the criteria are set out in 8 CFR 214.2(o)(3)(iii). For EB-1A immigrant petitions, the criteria appear in 8 CFR 204.5(h)(3).
The EB-1A regulation sets out ten categories of evidence, including awards, memberships, published material, judging, original contributions, authorship, critical employment, and high remuneration. USCIS then proceeds to a final merits determination after evaluating whether at least three criteria are met. This two-step framework applies to both O-1A and EB-1A petitions.
What has evolved is how USCIS guidance applies these categories to STEM fields. Recent USCIS Policy Manual guidance provides examples relevant to STEM professionals and clarifies how adjudicators evaluate modern forms of evidence in areas such as AI, machine learning, and open-source development. The underlying regulations remain unchanged, but adjudicators increasingly assess non-traditional outputs such as open-source contributions, model releases, and large-scale technical systems.
Two-Step Adjudication
Step 1 (categorical count): the officer determines whether the evidence satisfies at least three regulatory criteria.
Step 2 (final merits determination): the officer evaluates the evidence as a whole to decide whether the record demonstrates sustained national or international acclaim.
Source: Kazarian v. USCIS, 596 F.3d 1115 (9th Cir. 2010); adopted in USCIS Policy Manual.
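The categorical count in step 1 can be sketched as a simple tally. This is an illustrative sketch only: the criterion names and the `portfolio` mapping are hypothetical, and the real determination is a qualitative legal judgment made by an adjudicator, not a mechanical count.

```python
# Illustrative sketch: step 1 reduces to checking whether the record
# documents qualifying evidence under at least three regulatory criteria.
# Criterion keys and exhibit labels below are hypothetical examples.

REQUIRED_CRITERIA = 3  # threshold for both O-1A and EB-1A

def satisfied_criteria(evidence: dict) -> list:
    """Return the criteria for which at least one exhibit is present."""
    return [criterion for criterion, exhibits in evidence.items() if exhibits]

def passes_step_one(evidence: dict) -> bool:
    """True when the portfolio documents three or more criteria."""
    return len(satisfied_criteria(evidence)) >= REQUIRED_CRITERIA

portfolio = {
    "original_contribution": ["merged pull request", "benchmark report"],
    "authorship": ["arXiv preprint with citations"],
    "judging": ["conference reviewer invitation"],
    "awards": [],  # nothing collected yet, so this criterion does not count
}
```

Step 2, the final merits determination, has no such sketch: it weighs the whole record qualitatively.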
Why AI and Open-Source Profiles Require a Different Documentation Approach
A traditional research profile is documented through peer-reviewed journals, citations indexed in Scopus or Web of Science, grant awards, and conference proceedings. AI and open-source careers often look different. A senior machine learning engineer may have authored only a handful of arXiv preprints but released a model downloaded hundreds of thousands of times. A compiler maintainer may have no journal articles but thousands of merged pull requests across projects used inside major cloud platforms. These contributions are real and verifiable; they simply do not arrive in the format the regulations originally anticipated.
USCIS guidance addresses this distinction. USCIS Policy Manual guidance provides examples suggesting that in STEM fields, published material may include preprints on recognized repositories and that original contributions of major significance may be demonstrated through widely adopted tools, frameworks, or datasets. The practical implication is that an evidence portfolio must do the translation work: it must show the adjudicator both the artifact and the context that makes the artifact significant.
Evidentiary Criteria, Mapped to Modern Technical Output
The table below summarizes how evidence commonly generated by AI and open-source practitioners tends to align with the regulatory categories. It reflects categories of documentation observed in approved petitions and does not constitute legal advice.
| Regulatory Criterion | Traditional Evidence | AI / Open-Source Equivalents |
|---|---|---|
| Awards | Nobel, Turing, national medals | Best paper awards at NeurIPS, ICML, ICLR; hackathon prizes from recognized organizations; open-source foundation awards |
| Membership | Elected fellowships requiring outstanding achievement | Invited core maintainer status on major projects; program committee seats at top-tier venues |
| Published material about the candidate | Feature articles in major newspapers | Coverage of the candidate’s work in recognized technology press; interviews in established industry outlets |
| Judging | Journal peer review, PhD committees | Reviewer for NeurIPS, ICML, CVPR; PR review responsibility on flagship open-source repositories |
| Original contribution of major significance | Landmark patents, cited breakthroughs | Widely adopted models, datasets, frameworks; architectural contributions cited in follow-on research |
| Authorship of scholarly articles | Journal papers | Peer-reviewed conference papers, arXiv preprints with substantial citation footprints |
| Critical employment | Leadership at distinguished organizations | Tech lead or principal roles at organizations with documented distinguished reputation |
| High remuneration | Top-of-field compensation | Compensation documentation benchmarked against BLS or verifiable survey data |
Source: 8 CFR 214.2(o)(3)(iii) (O-1A); 8 CFR 204.5(h)(3) (EB-1A); USCIS Policy Manual (uscis.gov/policy-manual)
Building the Evidence Portfolio: A Structural View
An evidence portfolio is not simply a collection of documents. In approved petitions, the record is typically organized so that each criterion has its own section, each section opens with a short narrative explaining what is being shown, and every exhibit has a clear provenance and a reason for inclusion. The goal is to let an adjudicator move from claim to evidence to context without having to reconstruct the story themselves.
Layer 1: The Artifact
The artifact is the primary source: the paper, the model card, the GitHub repository, the patent, the reviewer invitation email. For digital artifacts, dated screenshots combined with archival links (for example, from web.archive.org) help establish that the artifact existed in its described form at a specific point in time. Commonly submitted artifacts include release tags, DOI records for preprints, and official download statistics provided directly by the hosting platform.
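Archival snapshots can be requested programmatically. The sketch below builds the Wayback Machine "save" URL for an artifact; the helper name is ours, and whether a snapshot is actually captured depends on the Internet Archive's service, so any snapshot submitted as evidence should still be verified by hand.

```python
from urllib.parse import quote

WAYBACK_SAVE = "https://web.archive.org/save/"

def wayback_save_url(artifact_url: str) -> str:
    """Return the Wayback Machine save endpoint for a digital artifact.

    Fetching the returned URL (for example with urllib.request.urlopen)
    asks the Internet Archive to capture a dated snapshot of the page.
    """
    # Keep ":" and "/" intact so the target URL stays readable in the path.
    return WAYBACK_SAVE + quote(artifact_url, safe=":/")

# Example: request a snapshot of a (hypothetical) model card page.
snapshot_request = wayback_save_url("https://example.com/models/my-model")
```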
Layer 2: The Context
Context translates the artifact into significance. For an AI model release, this may include independent benchmarks, adoption data, and references in follow-on research. For an open-source contribution, it may include the project’s position in its ecosystem, the candidate’s commit history relative to other contributors, and statements from project leadership. Context is where adjudicators typically look during the final merits step.
Layer 3: The Expert Letters
Expert letters (sometimes called recommendation letters) remain one of the most influential components of the record. In approved petitions, letters from independent experts (not current employers or close collaborators) tend to carry the most weight. Commonly, letters explain the writer’s qualifications, describe the candidate’s specific contribution, and place that contribution within the broader field. Form letters or letters that simply restate a résumé tend to receive less weight.
Illustrative Scenario (Composite)
A research engineer contributed the attention optimization used in a widely downloaded open-source language model. The portfolio included: the original pull request with commit metadata, the release notes crediting the contribution, independent benchmarks showing measurable speedups, three expert letters from unaffiliated researchers quantifying the impact, and two follow-on papers citing the technique. The evidence was organized under the original contribution of major significance criterion, with cross-references to authorship and judging.
This scenario is illustrative and does not describe a specific case.
Common Documentation Gaps Observed in 2026
In practice, many RFEs stem not from weak profiles but from documentation gaps that could have been caught earlier. The most common gaps observed:
- Undated digital artifacts. GitHub pages and model cards change over time. Without archival snapshots, an adjudicator may question when a claimed contribution actually existed.
- Citation counts without context. A raw number is less persuasive than a comparison to typical citation rates within the same subfield at the same career stage.
- Expert letters from affiliated writers. Letters from current managers or co-founders are often given less weight than letters from independent experts.
- Conflating code authorship with original contribution. Being listed as a contributor is not the same as authoring a contribution of major significance; the record needs to show which specific commits or design decisions are attributed to the candidate.
- Missing translation layer for technical concepts. Adjudicators are not expected to be machine learning specialists. Short, plain-language explanations of why a result matters are commonly included in approved petitions.
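The first gaps in the list above are mechanical enough to check automatically before filing. A minimal sketch, assuming each exhibit is described by a small record whose field names are our own invention, not any standard:

```python
from dataclasses import dataclass, field

@dataclass
class Exhibit:
    """Minimal exhibit record; field names are illustrative, not a standard."""
    title: str
    date: str = ""          # e.g. "2026-01-15"
    archive_link: str = ""  # e.g. a web.archive.org snapshot URL
    criteria: list = field(default_factory=list)  # regulatory criteria supported

def lint_exhibit(ex: Exhibit) -> list:
    """Flag the documentation gaps described in the list above."""
    problems = []
    if not ex.date:
        problems.append("undated artifact")
    if not ex.archive_link:
        problems.append("no archival snapshot")
    if not ex.criteria:
        problems.append("not mapped to any regulatory criterion")
    return problems
```

Running such a check over every exhibit before legal review surfaces the purely clerical gaps, leaving substantive questions of weight and significance to the attorney.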
Evidence Portfolio Preparation Checklist
The checklist below reflects documentation commonly observed in prepared petitions and is intended for informational purposes only.
| Item | Collected |
|---|---|
| Curriculum vitae with dated entries for every claimed achievement | |
| Complete publication list with DOIs, venue tier, and citation counts where available | |
| Archived snapshots of digital artifacts (repositories, model cards, release notes) | |
| Download, usage, or adoption statistics obtained directly from hosting platforms | |
| Independent benchmark results, with source documentation | |
| Peer review and program committee invitations, with dates and venue details | |
| Expert letters from unaffiliated independent reviewers | |
| Media coverage with publication dates and outlet descriptions | |
| Compensation documentation benchmarked against verifiable salary data | |
| A cross-reference index mapping each exhibit to one or more regulatory criteria | |
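The cross-reference index in the last checklist row can be generated rather than maintained by hand. A sketch, assuming each exhibit is tagged with the criteria it supports (the labels and criterion names are hypothetical):

```python
from collections import defaultdict

# Hypothetical tagging: (exhibit label, regulatory criteria it supports).
exhibits = [
    ("Exhibit A: merged pull request with commit metadata", ["original_contribution"]),
    ("Exhibit B: preprint with citation report", ["authorship", "original_contribution"]),
    ("Exhibit C: conference reviewer invitation", ["judging"]),
]

def cross_reference_index(tagged_exhibits):
    """Invert exhibit tags into a criterion -> supporting exhibits index."""
    index = defaultdict(list)
    for label, criteria in tagged_exhibits:
        for criterion in criteria:
            index[criterion].append(label)
    return dict(index)
```

Inverting the tags this way also makes it easy to spot criteria that are claimed in the narrative but backed by no exhibit at all.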
Assembling a portfolio of this scope is primarily a project management problem. Dozens of exhibits must be collected from multiple sources, organized, versioned, and handed to independent attorneys for legal review. PassRight provides administrative tools to organize documentation and coordinate evidence for review by independent licensed attorneys. This division of responsibilities (administrative coordination on one side, legal review on the other) tends to reduce duplicated work: candidates spend their time gathering the underlying evidence, while attorneys spend theirs on legal judgment rather than document logistics.
Conclusion
Extraordinary ability petitions for AI and open-source professionals in 2026 still follow the same O-1A and EB-1A rules. What has changed is the type of evidence USCIS now accepts and how important it is to present modern technical work in a way that fits those rules.
Most candidates handle this by clearly organizing their achievements and working with an immigration attorney to present them effectively.
Frequently Asked Questions
Does USCIS accept GitHub contributions as evidence of extraordinary ability?
USCIS Policy Manual guidance recognizes that contributions to widely used open-source projects may be submitted as evidence under the original contribution of major significance criterion, and potentially under authorship or critical employment. The record typically needs to show both the specific contribution and its significance within the field.

Are arXiv preprints treated the same as peer-reviewed journal articles?
USCIS guidance on STEM evidence notes that preprints on recognized repositories may qualify as scholarly articles, particularly when the work has been cited in peer-reviewed venues. The weight given to a preprint typically depends on factors such as citation footprint and the standing of the venue. A licensed immigration attorney can advise on how preprints fit within a specific record.

How many citations are typically needed?
There is no regulatory threshold. Adjudicators evaluate citation counts in context: the candidate’s subfield, career stage, and the typical citation patterns of comparable researchers. A smaller number of citations in a highly specialized area may carry more weight than a larger number in a broad field.

Can downloads or GitHub stars replace traditional publication metrics?
They can serve as supporting context under the original contribution criterion, especially when combined with independent benchmarks, adoption by recognized organizations, and references in follow-on research. Raw counts alone are rarely sufficient; the portfolio typically explains what the numbers mean relative to comparable projects.

What is the difference between O-1A and EB-1A evidence requirements?
EB-1A applies ten regulatory criteria and requires a higher standard of sustained national or international acclaim at the very top of the field. O-1A applies eight regulatory criteria and requires demonstrating extraordinary ability, but is a nonimmigrant classification with a lower evidentiary threshold than EB-1A.

How long does portfolio preparation typically take?
When working with PassRight, portfolio preparation often takes 6 to 8 weeks, depending on how quickly the underlying evidence can be collected.
Need help with your case? Schedule a call with our customer care team. They’ll be happy to discuss your needs and connect you with an immigration attorney.