Job Description:
Native iOS Engineer with Mandatory Agentic AI Experience
We're hiring an iOS Engineer to join us as we explore the frontier of what AI agents can do on mobile devices. Your immediate focus will be Phone Use: research into enabling AI agents to perceive and act on iOS, analogous to Computer Use on desktop. The role is intentionally broader, though. Mobile is a long-term surface for us, and you'll help shape what we build next as the technology and our research questions evolve.
Experience Level: 3-5 Years
About The Role
You'll work closely with researchers and engineers to prototype, evaluate, and productionize iOS capabilities that push beyond conventional app development. Some of what you build will be experimental harnesses for research; some will be polished components that ship. The common thread is strong iOS fundamentals combined with the curiosity and rigor to operate at the edges of the platform.
Key Responsibilities
- Develop and maintain native iOS applications and internal tooling across the full app lifecycle: development, testing, deployment, and distribution
- Partner with researchers on applied investigations into iOS capabilities, including screen perception, UI understanding, gesture synthesis, inter-app workflows, and agent-driven control of the device
- Explore and stress-test iOS frameworks relevant to agentic and assistive use cases: Accessibility (AX), XCTest / XCUITest, ReplayKit, ScreenCaptureKit, App Intents, Shortcuts, Vision, and Core ML
- Identify the boundaries of what's achievable on stock iOS versus what requires entitlements, developer-mode devices, MDM, or specialized setups, and document trade-offs clearly
- Build evaluation harnesses, data collection pipelines, and integrations between iOS components and backend systems (model APIs, agent infrastructure)
- Contribute to platform work that may extend beyond agents over time: performance, security, new device capabilities, and emerging Apple frameworks
- Track iOS releases, beta SDKs, and WWDC announcements; surface implications for current research and future product directions
- Collaborate with cross-functional teams (researchers, ML engineers, designers, and backend engineers) to turn ambiguous questions into working prototypes quickly
Required Skills
- Strong native iOS development experience in Swift (Objective-C familiarity a plus)
- Solid grasp of iOS app architecture, the app lifecycle, sandboxing, entitlements, and the iOS permission model
- Working knowledge of device and platform capabilities: screen recording / capture, gesture simulation, notifications, background execution, media frameworks
- Experience with app publishing and distribution via the App Store, TestFlight, or enterprise / ad-hoc channels
- Good analytical and research instincts: able to take an ambiguous question, design experiments, and report findings clearly
- Ability to read Apple documentation and header files, learn from WWDC sessions, and reason from first principles when documentation runs out
Immediately Useful
The work in front of us draws heavily on these areas. Strength in any of them is a meaningful plus:
- Accessibility APIs (UIAccessibility / AX), XCUITest automation, ReplayKit / ScreenCaptureKit, or App Intents
- UI understanding: view hierarchies, screen parsing, OCR, or visual grounding
- On-device ML (Core ML, Vision) and integrating LLM / VLM APIs into mobile apps
- Prior work on automation, RPA, mobile testing frameworks, assistive technology, or accessibility tooling
- Exposure to Computer Use, browser agents, or other agentic AI systems