
Petition for Inter Partes Review of U.S. Patent No. 10,403,051 B2

UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

NIANTIC, INC., Petitioner

v.

NANT HOLDINGS IP, LLC, Patent Owner

U.S. Patent No. 10,403,051 B2

Issue Date: September 3, 2019

Title: Interference Based Augmented Reality Hosting Platforms

PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 10,403,051 B2

TABLE OF CONTENTS

Page

I. MANDATORY NOTICES UNDER §42.8(A)(1) ...... 1
   A. Real Party-In-Interest under §42.8(b)(1) ...... 1
   B. Related Matters under §42.8(b)(2) ...... 1
   C. Lead and Back-Up Counsel under §42.8(b)(3) ...... 1
   D. Service Information ...... 3
   E. Power of Attorney ...... 3
II. FEE PAYMENT ...... 4
III. REQUIREMENTS UNDER §§ 42.104 AND 42.108 AND CONSIDERATIONS UNDER §§ 314(A) AND 325(D) ...... 4
   A. Grounds for Standing under 37 C.F.R. § 42.104(a) ...... 4
   B. Identification of Challenge Under 37 C.F.R. § 42.104(b) and Statement of Precise Relief Requested ...... 4
   C. Considerations under §§ 314 and 325(d) ...... 5
   D. 37 C.F.R. § 42.108(c) ...... 5
IV. OVERVIEW OF THE ’051 PATENT ...... 6
   A. Level of Ordinary Skill in the Art ...... 6
   B. Specification Overview ...... 6
V. CLAIM CONSTRUCTION ...... 10
VI. THE CHALLENGED CLAIMS ARE UNPATENTABLE ...... 11
   A. Summary and Date Qualification of the Prior Art ...... 12
      1. Yu [Ex. 1003] ...... 12
      2. Sanz-Pastor [Ex. 1004] ...... 14
      3. Mullen [Ex. 1005] ...... 16
   B. Ground 1: Single-Reference Obviousness of Claims 1, 5-9, 18, 22-23, 27-29, 34, 36, 38, 43 Over Yu ...... 18
      1. Claim 1 ...... 18



      2. Claim 5: “The system of claim 1, wherein the AR server is a remote server coupled with the AR capable device via a wireless network.” ...... 43
      3. Claim 6: “The system of claim 1, wherein the relevant AR object is caused to be rendered based on a position of the AR capable device relative to the environment.” ...... 45
      4. Claim 7: “The system of claim 1, wherein the relevant AR object is caused to be rendered based on an orientation of the AR capable device relative to the environment.” ...... 45
      5. Claim 8: “The system of claim 1, wherein the relevant AR object is caused to be rendered within an AR game.” ...... 46
      6. Claim 9: “The system of claim 1, wherein the relevant AR object is caused to be rendered by superimposing a visual image of at least one relevant AR object over an image of the environment.” ...... 47
      7. Claim 18: “The system of claim 1, wherein the presence of the relevant AR object is altered to include a non-visible presence.” ...... 48
      8. Claim 22: “The system of claim 1, wherein the determination of whether to alter presence of the relevant AR object depends on a time.” ...... 49
      9. Claim 23: “The system of claim 22, wherein the presence alteration of the relevant AR object changes with the time.” ...... 50
      10. Claim 27: “The system of claim 1, wherein the AR server is further configured to enable the AR capable device to populate the environment with at least one of the relevant AR objects.” ...... 51
      11. Claim 28: “The system of claim 1, wherein the environment comprises an AR environment.” ...... 52



      12. Claim 29: “The system of claim 28, wherein the AR environment includes real-world elements and virtual elements.” ...... 54
      13. Claim 34: “The system of claim 1, wherein the AR server is further configured to enable the AR capable device to have an interaction with the rendered relevant AR object.” ...... 54
      14. Claim 36: “The system of claim 1, wherein the mobile device is a cell phone.” ...... 54
      15. Claim 38: “The system of claim 1, wherein the mobile device is a tablet computer.” ...... 55
      16. Independent claim 43 ...... 55
   C. Ground 2: Obviousness of Claims 1, 5-11, 15, 18, 22-23, 27-29, 34, 36, 38, and 43 Over Yu in View of Sanz-Pastor ...... 57
      1. Independent Claim 1 ...... 58
      2. Claim 10: “The system of claim 1, wherein the relevant AR object is caused to be rendered within an overlapping augmented reality among multiple AR capable devices.” ...... 64
      3. Claim 11: “The system of claim 10, wherein the overlapping augmented reality comprises a team-based augmented reality.” ...... 65
      4. Claim 15: “The system of claim 1, wherein the relevant AR object is caused to be rendered according to a haptic format.” ...... 65
      5. Dependent Claims 5-9, 18, 22-23, 27-29, 34, 36, and 38, and Independent Claim 43 ...... 66
   D. Ground 3: Obviousness of Claims 1, 5-11, 15, 18, 22-29, 34-36, 38, 43 Over Yu in View of Sanz-Pastor, in Further View of Mullen ...... 67
      1. Claim 8: “The system of claim 1, wherein the relevant AR object is caused to be rendered within an AR game.” ...... 67



      2. Claim 24: “The system of claim 1, wherein the AR server is further configured to enable the AR capable device to conduct a commercial transaction with a commerce engine.” ...... 69
      3. Claim 25: “The system of claim 24, wherein the commercial transaction includes an exchange of virtual currency.” ...... 70
      4. Claim 26: “The system of claim 24, wherein the commercial transaction includes a transfer of real-world funds.” ...... 71
      5. Dependent claim 35 ...... 71
VII. CONCLUSION ...... 76


List of Exhibits

Ex. No. Description of Document

1001 U.S. Patent No. 10,403,051 (“’051 patent” or “’051”)

1002 Declaration of Dr. Michael Zyda (“Zyda”)

1003 U.S. Patent Application Publication No. 2010/0066750 A1 to Yu et al. (“Yu”)

1004 U.S. Patent Application Publication No. 2007/0242131 A1 to Sanz-Pastor et al. (“Sanz-Pastor”)

1005 U.S. Patent Application Publication No. 2006/0105838 A1 to Mullen (“Mullen”)

1006 Reserved

1007 Reserved

1008 Excerpts from Oliver Bimber and Ramesh Raskar, Chapter 1: A Brief Introduction to Augmented Reality, in Spatial Augmented Reality: Merging Real and Virtual Worlds (2005)

1009 Paul Milgram and Fumio Kishino, A Taxonomy of Mixed Reality Visual Displays, IEICE Transactions on Information Systems, E77-D(12):1321-29 (1994)

1010 Jean-Marie Normand et al., A new typology of augmented reality applications, AH ’12 Proceedings of the 3rd Augmented Human Int’l Conference (Mar. 2012)

1011 U.S. Patent No. 3,050,870 (“Heilig”)

1012 Ivan E. Sutherland, A head-mounted three dimensional display, AFIPS ’68 (Fall, Part I): Proceedings of the December 9-11, 1968 Fall Joint Comput. Conf., Part I, pp. 757-64 (1968)




1013 Grégory Maubon, A little bit of history from 2006: Nokia MARA project, https://www.augmented-reality.fr/2009/03/un-petit-peu-dhistoire-de-2006-projet-mara-de-nokia/

1014 U.S. Patent No. 6,553,310 (“Lopke”)

1015 U.S. Patent Application Publication No. 2002/0044152 (“Abbott”)

1016 Excerpts from Lester Madden, Professional Augmented Reality Browsers for Smartphones, Programming for junaio, Layar, and Wikitude (2011)

1017 Jonathan Raper et al., Applications of location-based services: a selected review, J. Location Based Servs., 1(2):89-111 (2007)

1018 Paul G. Savage, Blazing Gyros: The Evolution of Strapdown Inertial Navigation Technology for Aircraft, J. of Guidance, Control, and Dynamics, 36(3):637-55 (2013)

1019 Jeong Won Kim, A Step, Stride and Heading Determination for the Pedestrian Navigation System, J. Global Positioning Sys., 3(1-2):273-79 (2004)

1020 Press Release, Apple Reinvents the Phone with iPhone (Jan. 9, 2007), https://www.apple.com/newsroom/2007/01/09Apple-Reinvents-the- Phone-with-iPhone/

1021 U.S. Patent No. 8,745,494 (“Spivak”)

1022 U.S. Patent No. 5,848,373 (“DeLorme”)

1023 Michael R. Macedonia et al., Exploiting Reality with Multicast Groups (1995)

1024 U.S. Patent No. 6,882,933 (“Kondou”)

1025 Reserved




1026 U.S. Patent No. 9,317,133 (“Korah”)

1027 Duy-Nguyen Ta et al., SURFTrac: Efficient Tracking and Continuous Object Recognition using Local Feature Descriptors, 2009 IEEE Conference on Computer Vision and Pattern Recognition, pp. 2937-44 (2009)

1028 Carsten Magerkurth et al., Pervasive Games: Bringing Computer Entertainment Back to the Real World, ACM Comput. in Entm’t, 3(3):1-19 (July 2005)

1029 Bruce Thomas et al., ARQuake: An Outdoor/Indoor Augmented Reality First Person Application, IEEE Int’l Symposium on Wearable Comput., pp. 139-46 (2000)

1030 Bruce Thomas et al., First Person Indoor/Outdoor Augmented Reality Application: ARQuake, Personal and Ubiquitous Computing, 6:75-86 (2002)

1031 U.S. Patent Application Publication No. 2004/0164897 (“Treadwell”)

1032 Michael Zyda, From Visual Simulation to Virtual Reality to Games, IEEE Computer, 38(9):25-32 (Sept. 2005)

1033 Michael Zyda, Creating a Science of Games, Commc’ns of the ACM, 50(7):27-29 (July 2007)

1034 Reserved

1035 Reserved

1036 Reserved

1037 Reserved

1038 Excerpts from NantWorks’ Disclosure of Asserted Claims and Infringement Contentions and Ex. 1, dated 1/28/2021, served in Nantworks, LLC v. Niantic, Inc., Case No. 3:20-cv-06262-LB (N.D. Cal.)

1039 Excerpts from Microsoft Corporation, Microsoft Computer Dictionary (2002)

1040 Reserved

1041 Reserved

1042 Patent L.R. 4-3 Joint Claim Construction and Prehearing Statement, ECF No. 61, dated 5/18/2021, filed in Nantworks, LLC v. Niantic, Inc., Case No. 3:20-cv-06262-LB (N.D. Cal.)

1043 Brett Zarda, EA Games to Incorporate Real-Time Weather, WIRED (Aug. 17, 2007), https://www.wired.com/2007/08/madden/

1044 Michael Zyda et al., Designing a Massively Multiplayer Online Game/Research Testbed Featuring AI-Driven NPC Communities, Proceedings of the Sixth AAAI Conf. on Artificial Intel. & Digit. Ent., pp. 108-11 (2010)

1045 U.S. Patent Application Publication No. 2008/0262910 (“Altberg”)

1046 Proof of Service of Summons and Complaint, ECF No. 12, dated Sept. 8, 2020, filed in Nantworks, LLC v. Niantic, Inc., Case No. 3:20-cv-06262-LB (N.D. Cal.)

1047 Third Amended Complaint for Patent Infringement, ECF No. 50, dated Feb. 23, 2021, filed in Nantworks, LLC v. Niantic, Inc., Case No. 3:20-cv-06262-LB (N.D. Cal.)



This is a petition for inter partes review of claims 1, 5-11, 15, 18, 22-29, 34-36, 38, and 43 of U.S. Patent No. 10,403,051 (Ex. 1001) (“’051 Patent” or “’051”).

I. MANDATORY NOTICES UNDER §42.8(A)(1)

A. Real Party-In-Interest under §42.8(b)(1)

Niantic, Inc. (“Niantic” or “Petitioner”) is the real party-in-interest to this IPR Petition.

B. Related Matters under §42.8(b)(2)

The ’051 Patent is the subject of pending litigation involving Petitioner: NantWorks, LLC, and Nant Holdings IP, LLC v. Niantic, Inc., Case No. 3:20-cv-06262-LB (N.D. Cal.). No trial date has been set.

Petitioner was first served on September 8, 2020. (Ex. 1046.) The Third Amended Complaint in that action alleges that Niantic infringes the ’051 Patent. (Ex. 1047, ¶¶1, 20-34, 62-86.)

C. Lead and Back-Up Counsel under §42.8(b)(3)

Petitioner provides the following designation of counsel.

LEAD COUNSEL

Heidi L. Keefe (Reg. No. 40,673)
[email protected]
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Ave. NW, Suite 700
Washington, DC 20004
Tel: (650) 843-5001
Fax: (650) 849-7400

BACK-UP COUNSEL

Lauren J. Krickl (Reg. No. 70,261)
[email protected]
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Avenue NW, Suite 700
Washington, D.C. 20004
Tel: (650) 843-5065
Fax: (650) 849-7400

Dena Chen (Admission pro hac vice to be requested)
[email protected]
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Ave. NW, Suite 700
Washington, D.C. 20004
Tel: (650) 843-5135
Fax: (650) 849-7400

Alexandra D. Leeper (Admission pro hac vice to be requested)
[email protected]
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Avenue NW, Suite 700
Washington, D.C. 20004
Tel: (650) 843-5376
Fax: (650) 849-7400

Patrick W. Lauppe (Admission pro hac vice to be requested)
[email protected]
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Avenue NW, Suite 700
Washington, D.C. 20004
Tel: (650) 843-5833
Fax: (650) 849-7400



D. Service Information

This Petition is being served by Federal Express to the attorneys of record for the ’051 Patent, Mauriel Kapouytian Woods LLP, 15 W. 26th Street, Floor 7, New York, NY 10010. This Petition is also being served on litigation counsel for Patent Owner¹ identified in the Certificate of Service. Petitioner consents to electronic service at the addresses provided above for lead and back-up counsel.

E. Power of Attorney

Filed concurrently per 37 C.F.R. § 42.10(b).

¹ The assignment records for the ’051 Patent show Nant Holdings IP, LLC as the sole Patent Owner. This is consistent with NantWorks’ assertion in the underlying litigation that Nant Holdings IP “solely owns the Asserted Patents,” with NantWorks, LLC as “the exclusive licensee of the Asserted Patents.” (Ex. 1047, ¶19.) To the extent any ambiguity exists regarding whether NantWorks, LLC is an owner of the ’051 Patent, Petitioner is serving this Petition on counsel of record for Nant Holdings IP, LLC and NantWorks, LLC (collectively, “NantWorks”). Petitioner reserves the right to correct any clerical error.



II. FEE PAYMENT

This Petition requests review of claims 1, 5-11, 15, 18, 22-29, 34-36, 38, and 43 of the ’051 patent. A payment of $44,875 is submitted herewith. See 37 C.F.R. § 42.15(a). This Petition meets the fee requirements of 35 U.S.C. §312(a)(1).

III. REQUIREMENTS UNDER §§ 42.104 AND 42.108 AND CONSIDERATIONS UNDER §§ 314(A) AND 325(D)

A. Grounds for Standing under 37 C.F.R. § 42.104(a)

Petitioner certifies that the ’051 Patent is available for IPR and that Petitioner is not barred or otherwise estopped.

B. Identification of Challenge Under 37 C.F.R. § 42.104(b) and Statement of Precise Relief Requested

Petitioner requests institution of IPR based on the following grounds:

Ground | Claims | Basis for Challenge under §103(a)
1 | 1, 5-9, 18, 22-23, 27-29, 34, 36, 38, 43 | Yu (Ex. 1003)
2 | 1, 5-11, 15, 18, 22-23, 27-29, 34, 36, 38, 43 | Yu (Ex. 1003) and Sanz-Pastor (Ex. 1004)
3 | 1, 5-11, 15, 18, 22-29, 34-36, 38, 43 | Yu (Ex. 1003), Sanz-Pastor (Ex. 1004), and Mullen (Ex. 1005)

Submitted with this Petition is the Declaration of Dr. Michael Zyda (Ex. 1002) (“Zyda”), a qualified technical expert. (Zyda, ¶¶1-9, Ex. A.)



C. Considerations under §§ 314 and 325(d)

§314(a): Petitioner is aware of no prior petitions challenging the ’051 Patent. Petitioner filed this IPR petition shortly after the parties’ joint claim construction brief was filed on May 18, 2021, and before NantWorks’ opening claim construction brief is scheduled to be filed (July 2, 2021). The Board should exercise its discretion to institute this inter partes review under §314(a) because the litigation is in its relatively early stages, no trial date has been set, and there is no substantial overlap of the issues and arguments presented in this Petition with those presented at the district court. See Apple v. Fintiv, IPR2020-00019, Paper 15 (P.T.A.B. May 13, 2020). For example, this Petition challenges six claims not asserted in district court, and presents grounds for obviousness that rely on primary prior art references not asserted in district court.

§325(d): None of the prior art references relied upon in Petitioner’s Grounds were cited during prosecution of the ’051 Patent. Accordingly, the challenges presented in this Petition rely on new art and grounds of unpatentability different from any presented during prosecution.

D. 37 C.F.R. § 42.108(c)

The Board should institute inter partes review of claims 1, 5-11, 15, 18, 22-29, 34-36, 38, and 43 because this Petition establishes a reasonable likelihood of prevailing with respect to each challenged claim. See 35 U.S.C. § 314(a).



IV. OVERVIEW OF THE ’051 PATENT

A. Level of Ordinary Skill in the Art

A person of ordinary skill in the art (“POSA”) as of April 2011² would have possessed at least a Master of Science in the areas of electrical engineering or computer science (or equivalent degree), with some working knowledge of augmented reality, mobile gaming, and the associated technologies; or, alternatively, a Bachelor of Science in computer science (or equivalent degree) with at least two years of experience in the aforementioned areas. (Zyda, ¶¶12-16.)

B. Specification Overview

The ’051 Patent is directed to augmented reality (“AR”) systems that “present[] [] virtual objects along side [sic] real-world elements.” (’051, 1:32-33.) The claimed AR systems “adjust the presence of the [AR] objects within the [AR] experience” based on a user’s physical location. (Id., Abstract, 4:10-12, 16:9-16.)

The ’051 Patent acknowledges that systems and methods for modifying the appearance of virtual objects within augmented reality scenes were known. (Id., 1:37-41, 2:3-27.)

² The ’051 Patent was filed on November 9, 2018, claiming priority to a succession of continuation applications and one divisional application dating back to June 30, 2011. The ’051 Patent also claims priority to U.S. Provisional Application No. 61/473,324 (Ex. 1001), filed on April 8, 2011.

Figure 1 is a “schematic of an augmented reality ecosystem” according to the ’051 Patent (’051, 4:35):

(Id., Fig. 1.) As shown, the AR ecosystem “include[s] one or more databases including AR object repositories 140 storing AR objects 142[.]” (Id., 5:63-66.) The system “obtain[s] a digital representation of a scene” local to an “AR-capable device” such as a mobile device. (Id., 3:63-4:16.) The digital representation can include “[e]lement attributes,” which include “myriad types of attributes” such as “location information (e.g., relative location to elements 390 of a scene, triangulation, GPS coordinates, etc.)[.]” (Id., 16:4-19.)

The system can also “determine a context related to the scene based on the digital representation and pertaining to the target object.” (’051, 4:10-12.) The ’051 Patent uses the term “context” broadly, stating that “[c]ontexts can take on many different forms and can be defined as desired.” (Id., 8:23-24; see also id., 18:66-19:1 (referring to “other contexts pertaining to circumstances of scene 595”).)

Determining a “context” in the ’051 Patent refers to using information such as the device’s location, the user’s identity, and the time of day to inform the system’s determination of what AR objects to present to the user. (See, e.g., id., 5:10-13 (“[C]ontemplated infrastructures determine a relevant augmented reality context from environment data representing a real-world environment local to an AR-capable devices.”), 9:21-26 (“[O]ne could still have a distinct context from other local devices, possibly based on device use, user identity or preferences, authorization, authentication, interference among other elements in AR ecosystem 100, or other attributes.”), 17:27-30 (“As scene changes with time, context 332 can ebb or flow, or even shift focus…from a first context to a second context.”).)

Once the system has identified a “context,” the system “identif[ies] a set of relevant AR objects from available AR objects with respect to the context based on a derived interference among elements (e.g., real-world elements, virtual elements, etc.)[,]” and presents those relevant AR objects to the AR-capable device. (’051, 4:13-19.)

The presence of the relevant AR objects can be “enhanced” or “suppressed” depending on “the nature of the relevant AR objects[], the context, or other factors relating to the scene.” (’051, 18:18-21.) “At a most basic level, presence could simply mean relevant AR objects are present (enhanced) or not present (suppressed).” (Id., 18:21-23.) Figure 4 provides another example where AR object 446A with “enhanced presence” appears larger, while AR object 446B with “suppressed presence” appears smaller:



(Id., Fig. 4.)

V. CLAIM CONSTRUCTION

As of the Petition’s filing, the claim construction process in the underlying litigation is ongoing, and the district court has made no claim construction ruling.

The parties dispute the construction of several claim terms. (See Ex. 1042.) For purposes of the present Petition, Petitioner has identified these potential disputes in the following analysis so that the Board can be aware of them, but respectfully submits that they need not be expressly resolved by the Board for purposes of evaluating Petitioner’s proposed Grounds. As the detailed analysis below will demonstrate, the prior art would satisfy the constructions proposed by either party.

VI. THE CHALLENGED CLAIMS ARE UNPATENTABLE

This Petition challenges claims 1, 5-11, 15, 18, 22-29, 34-36, 38, and 43 (collectively, the “Challenged Claims”), based on three Grounds of obviousness.³ Ground 1 is a single-reference obviousness ground based on Yu alone, and addresses claims 1, 5-9, 18, 22-23, 27-29, 34, 36, 38, and 43. Ground 2 builds upon Ground 1 by adding Sanz-Pastor, and further addresses claims 10, 11, and 15. Ground 3 builds upon Ground 2 by adding Mullen, and further addresses claims 24-26 and 35.

³ Claims 10, 11, 15, 18, 34, and 35 are not asserted in the underlying district court litigation. (See Ex. 1038, NantWorks’ Disclosure of Asserted Claims and Infringement Contentions, at 1.)



A. Summary and Date Qualification of the Prior Art⁴

1. Yu [Ex. 1003]

Yu, titled “Mobile Virtual and Augmented Reality System,” describes an AR system where “[a] user can create ‘virtual graffiti’ [] that will be left for a particular device to view as part of an augmented-reality scene.” (Yu, Abstract.) The virtual graffiti is an object that overlays the real scene, thus augmenting reality. Yu qualifies as prior art under § 102(b) because it was published March 18, 2010, more than one year before the earliest date to which the ’051 Patent claims priority.

The virtual graffiti in Yu can be “assigned to a particular physical location” and “uploaded to a network server[.]” (Yu, Abstract.) “When a device that is allowed to view the graffiti is near the location [to which the virtual graffiti is assigned], the graffiti will be downloaded to the device and displayed as part of an augmented-reality scene.” (Id.) Figure 2 illustrates three scenes where the virtual graffiti is either not present (left scene) or present (middle and right scenes) atop the “real-world door”:

⁴ As explained by Dr. Zyda, each of the prior art references described herein is enabling. (Zyda, ¶¶511-513.)



(Id., Fig. 2, ¶0024.)

Yu further discloses that the presence of the virtual graffiti, again overlaid atop the real-world door, can be modified based on ambient-light conditions. (Yu, Abstract, ¶¶0004, 0015, 0028, 0063-0068.) Figure 3 provides an example where “the shadowing of a virtual object may be allowed to change based on, for example, the position of the sun” (id., ¶0026):



(Id., Fig. 3, ¶0027.) The presence of the virtual graffiti also changes based on “the physical position and orientation (i.e., viewing direction) of the device.” (Yu, ¶0032.)

2. Sanz-Pastor [Ex. 1004]

Sanz-Pastor describes an AR system where users can add “virtual sticky notes” (objects) to a particular physical location within an AR scene. (Sanz-Pastor, ¶0070, Abstract.) The point is for people to be able to communicate about, and on top of, real-life scenes using those virtual messages. Sanz-Pastor qualifies as prior art under § 102(b) because it was published October 18, 2007, more than one year before the earliest date to which the ’051 Patent claims priority.


Sanz-Pastor explains that users view the virtual sticky note messages “by panning the [user] device, revealing the message’s real world location as icons and text overlaid on top of the camera input on the display.” (Sanz-Pastor, Abstract.) Additionally, Sanz-Pastor provides implementation details for “simulat[ing] occlusion effects.” (Id.; see also, e.g., id., ¶0051 (“Message management module 202 determines a geometric intersection from a device’s location to the coordinate of each message, and by comparing the resulting range with the actual distance between the two points, determines whether the message is visible from the device’s position.”).) Figure 4 depicts an example where the virtual messages are overlaid on the real-world image for first responders in emergency situations:



(Id., Fig. 4, ¶¶0044-0045.)

Sanz-Pastor further discloses that the user device can present virtual messages “in an attenuated fashion by making them transparent when drawn using a different color coding.” (Sanz-Pastor, ¶0029.)

3. Mullen [Ex. 1005]

Mullen discloses AR systems providing “location-based games” where “a user’s physical location correlates to the virtual location of a virtual character on a virtual playfield.” (Mullen, Abstract, ¶0006.) Mullen qualifies as prior art under § 102(b) because it was published May 18, 2006, more than one year before the earliest date to which the ’051 Patent claims priority.

Figure 1 of Mullen illustrates a handheld location-based game system and accompanying playmat:

(Mullen, Fig. 1, ¶¶0010, 0025, 0045-0052.) Mullen renders “[v]irtual objects” within the “AR game system.” (Id., ¶¶0022, 0072, 0075-0076.) Mullen also discloses “computer-controlled character positioning and movement based on detected landscape objects[.]” (Id., ¶0030, Fig. 6.)

Mullen further discloses a “virtual currency system such that any user can exchange real money for the virtual currency system, with the game manufacturer taking a percentage cut, and the virtual currency can be used to buy/sell items.” (Mullen, ¶0056.) A user can then be “provided with the option to then convert the virtual currency back to real currency (e.g., the U.S. dollar) with the game manufacturer again taking a percentage of the conversion.” (Id.)

B. Ground 1: Single-Reference Obviousness of Claims 1, 5-9, 18, 22-23, 27-29, 34, 36, 38, and 43 Over Yu

1. Claim 1

(a) “An augmented reality (AR) platform system comprising:” (Preamble)

To the extent this preamble is limiting, it is satisfied by Yu. Yu discloses an AR platform system where “[a] user can create ‘virtual graffiti’ that will be left for a particular device to view as part of an augmented-reality scene.” (Yu, ¶0014; see also id., Title (“Mobile Virtual and Augmented Reality System”).)⁵ Yu’s “virtual graffiti” is an “AR object” because it is a virtual object that is “present[ed] . . . along[]side real-world elements” (’051, 1:32-33), such as a “real-world door” to a restaurant. (Yu, ¶0024; Zyda, ¶100.) Figure 2 depicts an AR scene where the AR object (virtual graffiti) is superimposed on a real-world door:

⁵ Unless noted otherwise, all emphasis and highlighting have been added by Petitioner.



(Yu, Fig. 2; see also id., Fig. 3, ¶0027.) Yu further discloses that the virtual graffiti can comprise “images, audio and video clips, etc.” (Yu, ¶0025; cf. ’051, 19:66-20:21.) Like the ’051 Patent (’051, 3:58-4:1), Yu describes “an apparatus for receiving and displaying virtual graffiti [AR objects] as part of an augmented-reality scene.” (Yu, ¶0020; Zyda, ¶106.)

(b) “an AR object repository storing available AR objects in a first non-transitory computer readable memory; and” (1[a])

Yu renders obvious claim 1[a]. Yu discloses a “database 403” that stores AR objects (virtual graffiti) at the server, as shown in Figure 4:



(Yu, Fig. 4, ¶0036.) Yu explains that a “global object manager 401” facilitates the storing of the AR objects at database 403. (Id., ¶¶0036, 0051-0052.) A POSA would have understood that Yu’s database 403 is an “AR object repository” because it “stor[es] available AR objects.” (’051, 7:59-60; cf. Yu, ¶0036 (“All of the above information [e.g., virtual graffiti] is stored in database 403.”); Zyda, ¶111.)

Yu further teaches that database 403 stores AR objects “in a first non-transitory computer readable memory” (CRM). A POSA would have found it obvious that such storage would require a non-transitory CRM so that AR objects could later be easily retrieved. (Zyda, ¶112.)

To the extent that NantWorks argues that “AR object repository” means “memory storing AR objects,” Yu satisfies this limitation. A POSA would have found it obvious that a memory is the hardware component by which database 403 stores AR objects, as such storage would be impossible without memory. (Zyda, ¶114.) A POSA would have further found it obvious that Yu’s virtual graffiti (AR object) is a “data structure of at least a virtual object image for integration into an AR experience,” consistent with NantWorks’ proposed construction in district court. (Zyda, ¶113; Yu, ¶¶0017, 0024-25.)

(c) “an AR server coupled with the AR object repository and, upon execution of software instructions stored in a second non-transitory computer readable memory by a processor, is configured to:” (1[b])

Yu renders obvious claim 1[b]. Yu discloses an “AR server,” namely, a “virtual graffiti server” 101 that provides virtual graffiti to user devices:

(Yu, Fig. 1, ¶¶0021, 0023, 0041.) A POSA would have understood that Yu’s virtual graffiti server is an “AR server” because, as explained above for claim 1[a], the virtual graffiti is an AR object which will be used alongside—and displayed on top of—the real-world objects (like the door in the figures). (Zyda, ¶118.)



It would have been obvious to a POSA that the server in Yu is “coupled with the AR object repository.” The ’051 Patent states that the term “coupled to” includes “both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements).” (’051, 5:28-34.) A POSA would have understood that Yu’s server 101 and database 403 are directly coupled because database 403 is physically part of server 101, as shown in Figure 4, and thus the two components “contact each other.” (Yu, Fig. 4, ¶0036; see also id., Fig. 7.) At minimum, a POSA would have understood that the server 101 and database 403 are “indirectly coupled,” with at least one additional element located between the two elements. (Id., ¶¶0051-0052; Zyda, ¶120.)

Yu further discloses that the AR platform system is “configured to” perform certain steps “upon execution of software instructions stored in a second non-transitory computer readable memory by a processor.” Regardless of whether the “second non-transitory [CRM]” is on the same or a separate device from the “first non-transitory [CRM],” Yu discloses this limitation. (Zyda, ¶¶121-129.)

First, to the extent that the first and second CRMs are on separate devices, Yu discloses a user device that comprises “logic circuitry 505”:


(Yu, Fig. 5, ¶¶0038, 0041.)6

The logic circuitry 505 performs a series of steps set forth in Figure 8:

6 For Claim 1[b], NantWorks’ infringement contentions accuse elements of Niantic’s products similar to Yu’s logic circuitry 505 on the user device. (Zyda, ¶126.)


(Yu, Fig. 8, ¶¶0056-0058.) A POSA would have understood that the steps set forth

in Figure 8 are “software instructions” as recited in claim 1[b], because steps 801,

803, 805, 807, 809, 811, and 813 provide a sequence of instructions that make the

system work. (Zyda, ¶124.) A POSA would have further understood that the

software instructions in Figure 8 of Yu are “stored in a second non-transitory

[CRM]” because the logic circuitry 505 resides in the user device (Yu, ¶0038)—


separate from the server where the AR objects are stored in database 403. (Zyda,

¶125.) Additionally, a POSA would have found it obvious that Yu’s software

instructions are stored in a non-transitory CRM, to allow for subsequent retrieval

and execution. (Id.)

Second, to the extent that the first and second CRMs are on the same device,

Yu discloses another example of software instructions at Figure 9, which shows the

operation of “ambient light modification circuitry 507”:

(Yu, Fig. 9, ¶¶0038, 0059-0068.) A POSA would have known that the ambient light

modification circuitry constitutes “software instructions” because steps 901, 902,

903, and 905 provide a sequence of instructions that make the system work. (Zyda,

¶128.) Yu states that the ambient light modification circuitry “may be centrally


located within server 101.” (Yu, ¶0059.) A POSA would have understood that these

software instructions are nonetheless “stored in a second non-transitory [CRM],” because servers in April 2011 commonly had multiple memory components for efficiency purposes. (Zyda, ¶129.)

Moreover, it would have been obvious to a POSA that the software

instructions in Yu’s Figures 8 (separate device) and 9 (same device) are “execut[ed]

. . . by a processor.” (Zyda, ¶130; Yu, ¶0034 (“These messages are then downloaded by the device and processed.”).) Without a hardware component such as a processor, the AR platform system could not carry out the recited functions.

(Zyda, ¶130; Ex. 1039 at 005-06, 007, 004.)

(d) “obtain digital data representative of an environment of an AR capable mobile device, the digital data including a device location of the AR capable device and a virtual element attribute;” (1[c])

Yu satisfies claim 1[c]. Yu discloses an AR platform system that, upon

execution of the software instructions in claim 1[b], “obtain[s] digital data

representative of an environment of an AR capable mobile device.” The logic

circuitry “provides a location to transmitter 511 to be transmitted to server 101,” as

shown at step 801 of Figure 8. (Yu, ¶0056.) Next, “[i]n response to the step of

providing the location, at step 803, receiver 513 receives information necessary to

view the virtual graffiti”:


(Yu, Fig. 8, ¶0056.) The information obtained at step 803 may include a “digital

representation of a physical object” attached to the AR object (virtual graffiti), or “a

precise location of the virtual graffiti.” (Yu, ¶¶0056, 0029.) The “physical object,”

such as the restaurant door shown in Figure 2, is part of the real-world environment


of an AR capable mobile device. (Id., ¶0016; see also id., ¶0030 (disclosing that the

user device is a “mobile device”).)

Yu further describes that, beyond obtaining location information, the Yu

system also obtains data representative of the ambient light conditions of an

environment, as shown at steps 901-903:

(Yu, Fig. 9.) For example, Yu obtains digital data including “whether or not the

device is indoors or outdoors,” “position data for the sun,” and “local weather data,

such as amount of cloud cover.” (Id., ¶¶0060-0062.) A POSA would have readily

understood that such digital data is “representative of an environment of the AR

capable mobile device.” (Id., ¶0060; Zyda, ¶138.)

Yu also discloses that “the digital data includ[es] a device location of the

AR capable device and a virtual element attribute.” As noted, Yu states that the


digital data obtained pursuant to step 803 may include “a precise location of the

virtual graffiti” attached to a physical object in the real-world environment of the

mobile device. (Yu, ¶0056.) Figure 6 confirms that “personal object manager 405”

obtains digital data including a device location:

(Id., Fig. 6, ¶¶0015, 0037, 0052.)

Yu further teaches that the digital data includes a “virtual element attribute.”

While the ’051 Patent does not expressly define this claim term, it describes

“[e]lement attributes, AR object attributes, or other attributes” as “includ[ing]


myriad types of attributes” such as “location information (e.g., relative location to elements 390 of a scene, triangulation, GPS coordinates, etc.)[.]” (’051, 16:4-19.)

As explained above, Yu discloses that the AR platform system obtains digital data including ambient-light information in a three-step process at steps 901-903. (Yu,

¶¶0060-0062.) A POSA would have understood that this ambient light information is a “virtual element attribute” as described in the ’051 Patent, because the indoor/outdoor data, position data for the sun, and local weather data all relate to the location of the mobile device on which the virtual element will be placed. (Zyda,

¶¶140-141.) Additionally, as explained below for Claim 1[f], Yu discloses determining whether to alter the presence of the virtual graffiti (AR object) based on ambient light information (virtual element attribute).7

7 When the steps of a claim do not “otherwise recite an order,” as here, generally no order is required unless the specification directly or implicitly requires it. E.g.,

Altiris v. Symantec, 318 F.3d 1363, 1369 (Fed. Cir. 2003). Here, Claim 1 does not recite an order that would require “obtain[ing]” the virtual element attribute before performing the “identify[ing]” step in claim 1[e]. (Zyda, ¶142.) Therefore, even if

Yu performs the recited steps in an order different from that claimed in the ’051

Patent, Yu still renders the Challenged Claims obvious.


(e) “determine at least one context related to the AR capable device and pertinent to the environment based at least on the device location;” (1[d])

Yu satisfies claim 1[d]. Yu discloses a “context-aware messaging system”

where a “context-aware circuitry” “generat[es] a current context for the user device.”

(Yu, ¶¶0021, 0038, Fig. 5.) Yu provides a series of steps for “determin[ing] at least

one context related to the AR capable device,” such as the device’s location

relative to (i.e., proximity to) the AR object within the environment. (Yu, Abstract,

¶¶0023, 0037, 0052.) Specifically, Yu discloses that “[s]erver 101 periodically

monitors the locations of all devices” to determine whether “a particular device is

near a location where it is to be provided with virtual graffiti.” (Id., ¶0023.) The device’s proximity to the AR object (a context) is “pertinent to the environment based at least on the device location” because it relates to the device’s location within the environment. (Zyda, ¶147.)

Yu provides various other examples of “context,” including a device’s

viewpoint (Yu, ¶0032) and identity (id., ¶¶0024, 0037, 0053). “After obtaining the virtual coordinates of the virtual graffiti,” Yu “establishe[s]” a “viewpoint” that “is completely dependent upon the physical position and orientation (i.e., viewing direction) of the device.” (Id., ¶0032; see also id., ¶0015 (“[I]n an outdoor environment, the context available to the mobile device (time, location, and orientation) can be acquired[.]”).) This viewpoint (a context) is “pertinent to the


environment based at least on the device location” because it relates to the

device’s position and orientation (i.e., location) within the environment. (Zyda,

¶149.) Yu also determines the user device’s “identity” (Yu, ¶0037), because the virtual graffiti “is only available to those designated able to view it[.]” (Id., ¶0024,

Fig. 2.) The device’s identity (another context) is likewise “pertinent to the environment based at least on the device location” because the user device itself exists within the environment at a certain location. (Zyda, ¶151.)

The contexts disclosed in Yu comport with the disclosures of the ’051 Patent, which broadly states that “[c]ontexts can take on many different forms and can be defined as desired,” and can “pertain[] to circumstances of a scene[.]” (’051, 8:23-

24, 18:66-19:1, 11:4-6, 12:13-16, 14:16.) A POSA would have understood that the contexts disclosed in Yu (i.e., the device’s viewpoint, identity, and proximity to

virtual graffiti) each “pertain[] to circumstances of a scene” and thus are “contexts”

as described by the ’051 Patent. (Zyda, ¶153.) Additionally, Yu’s disclosure of

determining context based on the user device’s identity (Yu, ¶0037) parallels the

’051 Patent’s example of determining context based on a “user identity or

preferences” (’051, 9:21-26). (Zyda, ¶153.)

In district court, NantWorks asserts that Claim 1[d] means “determine at least

one augmented reality experience on the AR capable device based on real world

element attributes related to the local environment of the AR capable device.” (Ex.


1042 at 003.) This phrase requires no express construction, but Yu nevertheless

satisfies Claim 1[d] under NantWorks’ proposed construction, because Yu teaches

determining the user device’s relative location, viewpoint, and identity—which are

related to the device’s local environment. (Zyda, ¶154.)

(f) “identify relevant AR objects from the AR object repository representing available AR objects corresponding to the at least one context;” (1[e])

Yu satisfies claim 1[e]. Yu discloses “determin[ing] if the location of a device

is near any stored virtual graffiti.” Specifically, as shown at step 607 of Figure 6,

Yu asks the question—is there any virtual graffiti stored that relates to the location

of the device?:


(Yu, Fig. 6, ¶0052.) Yu explains that its system “filters all available virtual graffiti

in order to determine only the virtual graffiti relevant to the particular device and the

device’s location.” (Id., ¶0037.) Yu therefore teaches “identify[ing] relevant AR

objects from the AR object repository representing available AR objects.”

(Zyda, ¶158.) A POSA would have understood that the available virtual graffiti (AR

objects) “correspond to the at least one context,” because “the relevant virtual

graffiti changes based on the identity and location of the device” (Yu, ¶0037), and context is “based at least on the device location,” as recited in Claim 1[d]. (Zyda,


¶158.) Yu also discloses identifying relevant AR objects from the available AR

objects corresponding to the user device’s identity (another context). (Yu, ¶¶0024,

0037, Fig. 6; Zyda, ¶159.)

In district court, NantWorks asserts that Claim 1[e] means “identify a set of

AR objects from the AR repository based on context and identify relevant AR

objects that are within that set.” (Ex. 1042 at 005.) This phrase requires no express construction, but Yu nevertheless satisfies Claim 1[e] under NantWorks’ proposed construction, because Yu teaches identifying relevant virtual graffiti from a set of stored virtual graffiti based on the device’s location. (Zyda, ¶160; Yu, Fig. 6.)

(g) “determine whether to alter presence of a relevant AR object based on at least the device location and the virtual element attribute; and” (1[f])

Yu renders obvious claim 1[f]. Yu discloses determining whether to modify

the virtual graffiti based on ambient light conditions (a virtual element attribute), as

illustrated in steps 805, 807, and 809 of Figure 8:


(Yu, Fig. 8, ¶0057; see also id., Fig. 9, ¶0063.)

Yu states that “[s]ome of the possible modifications to the graffiti are:

• casting a virtual shadow for the graffiti when it is determined that the sun is shining. The determination that the sun is shining may be made via local-weather data, an ambient light source, and/or whether or not the device is indoors or outdoors. The intensity of virtual shadow can also be adjusted based on the ambient light[;]

• brightening the virtual graffiti if an ambient-light sensor determines that the device is in a dark place[;]

• adjusting the color of the virtual graffiti to increase or decrease its visibility based on the ambient light[;]

• changing a texture map to alter the appearance of the virtual graffiti based on the ambient light[; and]

• adding a specular highlight at a particular location on the virtual graffiti based on the relative position of the sun to the virtual graffiti[.]”

(Yu, ¶¶0063-0068.) Yu further explains that “the modification may comprise

modifying any combination of shadow, brightness, contrast, color, specular

highlights, or texture maps in response to the ambient light.” (Id., ¶0063.) A POSA

would have found it obvious that the presence of the virtual graffiti (AR object) is

altered “based on at least the device location and the virtual element attribute”

because the modifications described in Yu are based on at least the device location


and the ambient light conditions (the virtual element attribute). (Zyda, ¶¶165-166.)8

A POSA also would have understood that the modifications in Yu “alter [the] presence” of the virtual graffiti. (Zyda, ¶¶167-172.) The ’051 Patent states that

“[a]t a most basic level, presence could simply mean relevant AR objects are present

(enhanced) or not present (suppressed). Still, presence can cover a full spectrum of

experiences,” such as where “the visual image of [the] relevant AR object 446B

might have shades of transparency to indicate a suppressed presence.” (’051, 18:21-

29.) Figure 4 of the ’051 Patent depicts an example of what the ’051 Patent describes

as an AR object with an “enhanced presence” that appears larger; the ’051 Patent

contrasts this with an AR object with a “suppressed presence” that appears smaller:

8 Although the ’051 Patent does not expressly state whether or not the “context” can serve as the “virtual element attribute,” the specification suggests that it can, because the ’051 Patent teaches that the context can be used to determine whether to alter the presence of the AR objects. (’051, 18:18-21 (determination to enhance or suppress presence may depend on “the nature of the relevant AR objects[], the context, or other factors relating to the scene.”).) But even if “context” and “virtual element attribute” are distinct, this claim limitation is satisfied by Yu, for example, where the context is the user’s identity and the virtual element attribute is the ambient light conditions. (Zyda, ¶166 & n.3.)


(’051, Fig. 4, 17:63-66, 18:4-6.)

Yu’s virtual graffiti modifications are consistent with the teachings of the ’051

Patent. First, Yu discloses that the virtual graffiti is either “present” or “not present”

(’051, 18:21-23), depending on the location of the device and the location of the

virtual graffiti. (Yu, Abstract (“When a device that is allowed to view the graffiti is

near the location, the graffiti will be downloaded to the device and displayed as part


of an augmented-reality scene.”).) For example, Figure 2 of Yu illustrates three

scenes where the virtual graffiti is not present (scene at left) versus present (scenes

at middle and right):

(Id., Fig. 2, ¶¶0024, 0032.) 9

Second, Yu states that, “[t]o further enhance the user experience, the virtual graffiti can be dynamic, changing based on ambient-light conditions.” (Yu,

Abstract, ¶¶0015, 0063-0068.) Figure 3 illustrates an example of this, where “the

9 For Claim 1[f], NantWorks’ infringement contentions accuse functionality in

Niantic’s products similar to displaying or not displaying the AR object based on the user device’s location. (Zyda, ¶¶173-175.)


shadowing of a virtual object may be allowed to change based on [] the position of

the sun” (id., ¶0026):

(Id., Fig. 3, ¶¶0027-0028; Zyda, ¶170.) Similarly, Yu discloses altering the presence of the virtual graffiti based on the device’s position and orientation (i.e., viewpoint).

(Yu, ¶0032.)

In district court, NantWorks asserts that “alter presence” means “the degree

of at least a relevant AR object’s visual presentation based on a spectrum.” (Ex.

1042 at 005.) NantWorks also asserts that the full claim limitation recited in Claim

1[f] means “[d]etermine the degree of at least a relevant AR object’s visual

presentation based on a spectrum based on at least the device location, AR object

attributes, and the virtual element attribute.” (Id. at 006.) These phrases require no express construction, but Yu nevertheless satisfies Claim 1[f] under NantWorks’

proposed constructions, because Yu states that the virtual graffiti “may change any

combination of shadow, brightness, contrast, color, specular highlights, or texture

maps in response to the ambient light.” (Yu, ¶0028; Zyda, ¶¶176-177.)

(h) “cause the AR capable device to render the relevant AR object according to its altered presence.” (1[g])

Yu renders obvious claim 1[g]. Yu teaches that the virtual graffiti (AR object)

is rendered at the user device according to its altered presence—for example, with a

“shadow 303 that changes with the time of day” (Yu, ¶0027):

(Yu, Fig. 3, ¶¶0015, 0063-0068.)

Yu further discloses that the virtual graffiti is either displayed or not displayed

based on the user device’s location and the virtual graffiti’s location, as shown in

Figure 2:

(Yu, Fig. 2, ¶0024.)

A POSA therefore would have understood that Yu renders the virtual graffiti

(AR object) according to its altered presence, by either displaying or not displaying

it, or by displaying it with a shadow that changes with the time of day. (Zyda, ¶¶180-

184.)

2. Claim 5: “The system of claim 1, wherein the AR server is a remote server coupled with the AR capable device via a wireless network.”

Yu satisfies claim 5. Figure 1 of Yu depicts a context-aware messaging

system comprising a virtual graffiti server 101, network 103, and user devices 105-

107:

(Yu, Fig. 1, ¶0021.)

A POSA would have understood that Yu’s “virtual graffiti server 101” is an

“AR server” because the virtual graffiti is an AR object. (Zyda, ¶189.) A POSA

also would have understood that the virtual graffiti server is a “remote server”

because it is “remote from the location of the AR-capable device[s]” (see ’051, 6:47),

as shown in Figure 1 above. (Id.) A POSA would have further found it obvious that

Yu’s remote AR server is “coupled with the AR capable device” because it is in

“contact” with the user devices via the network, or at least has one additional element

between the AR server and user devices. (Zyda, ¶191; ’051, 5:28-34.)

Yu further discloses that the network coupling the AR server with the AR

capable device is “wireless.” (Yu, ¶0036 (“This information is preferably received -44-

Petition for Inter Partes Review of U.S. Patent No. 10,403,051 B2

wirelessly through receiver 407.”); see also id., Fig. 6, ¶¶0037, 0052.)

3. Claim 6: “The system of claim 1, wherein the relevant AR object is caused to be rendered based on a position of the AR capable device relative to the environment.”

Yu satisfies claim 6. Yu discloses that “a viewpoint must be established for

the 3D rendering system to be able to render the virtual scene.” (Yu, ¶0032.) The

viewpoint is “completely dependent upon the physical position and orientation (i.e., viewing direction) of the device.” (Id.; see also id., ¶0034.) Yu discloses that the virtual graffiti is “caused to be rendered” based on the position of the AR capable device relative to the environment. (Yu, ¶¶0032, 0034-0035.) A POSA would have understood that the position of the device is “relative to the environment,” because

the position is based on the device’s location within the environment. (Zyda, ¶195;

Yu, ¶0061 (describing “determin[ing] position data for the sun . . . by determining a local time and date, and calculating the position for the sun based on the local time and date.”).)

4. Claim 7: “The system of claim 1, wherein the relevant AR object is caused to be rendered based on an orientation of the AR capable device relative to the environment.”

Yu satisfies claim 7. The viewpoint required to render the virtual scene in Yu is “completely dependent upon the physical position and orientation (i.e., viewing direction) of the device.” (Yu, ¶¶0032, 0034.) Yu discloses that the virtual graffiti is “caused to be rendered” based on the orientation of the AR capable device


relative to the environment. (Yu, ¶¶0032, 0034-0035.) A POSA would have

understood that the orientation of the device is “relative to the environment,”

because the orientation is based on the device’s location within the environment.

(Zyda, ¶199; Yu, ¶0061 (describing “determin[ing] position data for the sun” that

“preferably comprises an apparent geocentric position such as a right ascension and

declination for the sun”).)

5. Claim 8: “The system of claim 1, wherein the relevant AR object is caused to be rendered within an AR game.”

Yu renders obvious claim 8. Yu does not expressly disclose that the virtual

graffiti is caused to be rendered within an AR game, but teaches that the virtual

object “can be a primitive polygon or a complex polyhedron representing an

avatar[.]” (Yu, ¶0039.) A POSA would have understood that avatars were

commonly used in games by April 2011, and that Yu’s system thus could have been

used in a game. (Zyda, ¶203.) Further, the use of mobile games in the field of AR

was pervasive by April 2011. (Zyda, ¶¶52-59, 203.) As early as 2000, a mobile AR

game called “ARQuake” allowed players to shoot at virtual monsters overlaid on a

real-world environment. (Id., ¶203.) Therefore, a POSA would have been motivated

to apply the teachings of Yu to render the virtual graffiti within an AR game such as

ARQuake. (Id.)

Yu already discloses various social settings and interactions among friends

using virtual graffiti on mobile phones. (Yu, Fig. 6, ¶¶0003, 0017, 0024.) As Dr.

Zyda explains, a POSA would have found it obvious to apply the teachings of Yu to

provide users with an AR game, such as a scavenger hunt or communicating with

team members in a known game like ARQuake. (Zyda, ¶204.) Yu already describes

associating virtual objects with real-world elements (e.g., Yu, ¶0022), which the

’051 Patent contemplates using as “a game goal” (such as the object to find in a

scavenger hunt or the monsters to shoot at, as in ARQuake). (’051, 12:6-13.)

6. Claim 9: “The system of claim 1, wherein the relevant AR object is caused to be rendered by superimposing a visual image of at least one relevant AR object over an image of the environment.”

Yu satisfies claim 9. Yu discloses that the virtual graffiti (AR object) is

attached to the real-world location of a physical object. (E.g., Yu, ¶0024.) Figure 2

depicts a visual image of a virtual graffiti message stating, “Joe, try the porter,”

attached to a restaurant door in the real-world environment:


(Id., Fig. 2, ¶0024.) As shown, the virtual graffiti is “superimposed onto the real images captured by the phone camera when they are displayed on the screen.” (Id.,

¶0003; see also id., ¶¶0033-0034.)

7. Claim 18: “The system of claim 1, wherein the presence of the relevant AR object is altered to include a non-visible presence.”

Yu satisfies claim 18. Yu discloses that the virtual graffiti (AR object) can

include audio. (Yu, ¶¶0023 (“When a particular device is near a location where

virtual graffiti is available for viewing, the device will notify the user, for example,

by beeping.”), 0025 (“Additionally, although the virtual graffiti of FIG. 2 comprises text, the virtual graffiti may also comprise a ‘virtual object’ such as images, audio and video clips, etc.”), 0039 (“The virtual graffiti preferably, but not necessarily,


comprises at least two parts, a virtual object and content. . . . The content is

preferably either text, pre-stored images such as clip art, pictures, photos, audio or video clips, . . . , etc.”).) A POSA would have understood that audio is a “non-visible presence.” (’051, 9:48-60.)

8. Claim 22: “The system of claim 1, wherein the determination of whether to alter presence of the relevant AR object depends on a time.”

Yu satisfies claim 22. Yu discloses determining whether to alter the presence of the virtual graffiti (AR object) based on “the time of day.” (Yu, ¶0027; see also id., ¶¶0015, 0017.) For example, Yu’s context-aware circuitry determines position data for the sun “by determining a local time and date, and calculating the position for the sun based on the local time and date.” (Id., ¶0061.) Figure 3 illustrates virtual graffiti displayed with a shadow that “changes with the time of day” (id.,

¶0027):


(Yu, Fig. 3.)

9. Claim 23: “The system of claim 22, wherein the presence alteration of the relevant AR object changes with the time.”

Yu satisfies claim 23. As explained for Claim 22, Yu discloses that the

presence alteration of the virtual graffiti “changes with the time of day.” (Yu,

¶0027.) For example, the shadow (an AR object) is displayed to the right of the

virtual graffiti at a “first time of day” and to the left of the virtual graffiti at a “second

time of day” (id.):


(Id., Fig. 3, ¶¶0015, 0017, 0026.)

10. Claim 27: “The system of claim 1, wherein the AR server is further configured to enable the AR capable device to populate the environment with at least one of the relevant AR objects.”

Yu satisfies claim 27. Yu discloses that the virtual graffiti server (“AR

server”) is configured to enable the user devices to populate the environment with

AR objects like the virtual graffiti. (Yu, ¶0023.) Figure 4 is a block diagram of Yu’s

server, which includes a “personal object manager 405”:


(Id., Fig. 4.) Yu states that “each user device will have its own personal object

manager 405.” (Id., ¶0037.) The personal object manager “provides the device with the relevant information needed to display the virtual graffiti based on the location of the device, wherein the relevant virtual graffiti changes based on the identity and location of the device.” (Id.) Thus, the virtual graffiti server (AR server) “enable[s] the AR capable device to populate the environment” with AR objects. (See ’051,

21:15-26.)

11. Claim 28: “The system of claim 1, wherein the environment comprises an AR environment.”

Yu satisfies claim 28. Yu discloses that the environment of the user device is an “AR environment.” (E.g., Yu, ¶0019 (describing “displaying the modified virtual graffiti as part of an augmented-reality scene”).) The ’051 Patent describes

AR as “represent[ing] a presentation of virtual objects along side [sic] real-world


elements.” (’051, 1:32-33.) Likewise, Yu “merges any virtual graffiti messages

with the user’s view of the real world.” (Yu, ¶0016.)

The ’051 Patent also states that the “AR ecosystem” of Figure 1 “provides an

overview of an AR environment in which AR-capable device 110 can have

interactions with one or more AR objects 142.” (’051, 5:57-59.) To the extent that

the claimed “AR environment” requires an environment in which the user device

can interact with AR objects, Yu discloses this limitation. (Zyda, ¶¶232-235; Yu,

¶0016 (describing “enhanc[ing] the user’s interactions with, or perceptions of the

environment”).) For example, Yu allows a user to interact with virtual graffiti by

creating it and attaching it to a real-world location. (Yu, Fig. 2, ¶0024; cf. ’051,

21:16-18.) Yu further states that another user with privileges can view the virtual graffiti in the AR environment, and thus interact with the virtual graffiti. (Yu,

¶0024.) Yu also provides that “the digital representation [of the virtual graffiti] could include URLs to 3D models and content (e.g., photos, music files, etc.).” (Id.,

¶0054; Zyda, ¶234.) A POSA would have understood that a user device can “have interactions” with a URL in an AR environment, for example by clicking on it.

(Zyda, ¶235.)


12. Claim 29: “The system of claim 28, wherein the AR environment includes real-world elements and virtual elements.”

Yu satisfies claim 29. Yu discloses that the AR environment includes real-

world objects, such as a “real-world door,” and virtual graffiti (virtual elements).

(Yu, ¶¶0014, 0017, 0024, 0027, Figs. 2-3; cf. ’051, 1:32-33; see also Zyda, ¶¶28-

63.)

13. Claim 34: “The system of claim 1, wherein the AR server is further configured to enable the AR capable device to have an interaction with the rendered relevant AR object.”

Yu renders obvious claim 34. As explained above for Claim 28, Yu teaches

that the virtual graffiti server (AR server) enables the user device to interact with the

virtual graffiti (AR objects): a user can create or view the virtual graffiti, and the

virtual graffiti may include a URL, which is intended to be clicked (i.e.,

interacted with). (Yu, ¶¶0024 (disclosing that a user can “creat[e]” or “view” virtual graffiti), 0054 (disclosing that the virtual graffiti “could include URLs”); id., ¶0016; cf. ’051, 21:16-18; Zyda, ¶¶241-242.)

14. Claim 36: “The system of claim 1, wherein the mobile device is a cell phone.”

Yu satisfies Claim 36. Yu describes the user devices as “mobile device[s].”

(Yu, ¶¶0015, 0030, 0033, 0034.) Yu also describes its invention as relating to

“messaging with a mobile virtual and [AR] system.” (Id., ¶0002.) Yu further states


that “one of the simplest ways to send a message to another individual is to send a

text message to the individual’s cellular phone.” (Id., ¶0003.)

15. Claim 38: “The system of claim 1, wherein the mobile device is a tablet computer.”

Yu satisfies claim 38, disclosing that the user interface in one embodiment

can specifically comprise “an electronic tablet capable of obtaining virtual objects from graffiti database 508 and creating handwritten messages and/or picture.” (Yu,

¶0039.)

16. Independent claim 43

(a) “An Augmented Reality (AR) server coupled with an AR object repository and configured to:” (Preamble)

The preamble of claim 43 is nearly identical to Claim 1[b], with immaterial

differences that do not affect the obviousness analysis. (Zyda, ¶254.) Thus, Yu

renders obvious the preamble of Claim 43 for the same reasons explained above for

Claim 1[b]. (Id., ¶256.)

(b) “obtain digital data representative of an environment of an AR capable mobile device, the digital data including a device location of the AR capable device and a virtual element attribute;” (43[a])

Claim 43[a] is identical to claim 1[c], and therefore is rendered obvious by

Yu for the same reasons explained above for claim 1[c]. (Zyda, ¶256.)

(c) “determine at least one context related to the AR capable device and pertinent to the environment based at least on the device location;” (43[b])


Claim 43[b] is identical to claim 1[d], and therefore is rendered obvious by

Yu for the same reasons explained above for claim 1[d]. (Zyda, ¶257.)

(d) “identify relevant AR objects from the AR object repository representing available AR objects corresponding to the at least one context;” (43[c])

Claim 43[c] is identical to claim 1[e], and therefore is rendered obvious by

Yu for the same reasons explained above for claim 1[e]. (Zyda, ¶258.)

(e) “determine whether to make present relevant AR objects based on at least the device location and the virtual element attribute; and” (Claim 43[d])

Claim 43[d] is nearly identical to claim 1[f], with the difference that claim

43[d] recites determining whether to “make present” relevant AR objects, while

claim 1[f] recites determining whether to “alter presence” of a relevant AR object.

(Zyda, ¶259.) A POSA would have understood that “mak[ing] present” is one way

of “alter[ing] [the] presence” of an AR object. (Id., ¶261.) Thus, Yu renders obvious

claim 43[d] for the same reasons explained above for claim 1[f]. (Id.)

To the extent there are any meaningful differences between “make present”

and “alter presence,” a POSA would have found that Yu teaches determining

whether to “make present” relevant AR objects. (Zyda, ¶¶261-262.) Yu discloses

that the virtual graffiti is either “present” or “not present” (see ’051, 18:21-23),

depending on the location of the user device and the location of the virtual graffiti.

(Yu, Abstract, ¶¶ 0015, 0024, 0032, Fig. 2.)


(f) “cause the AR capable device to render one or more of the relevant AR objects if they have been determined to be made present.” (43[e])

Claim 43[e] is nearly identical to claim 1[g], with the difference that claim

43[e] recites causing the AR capable device to render “one or more” of the relevant

AR objects “if they have been determined to be made present,” while claim 1[g]

recites causing the AR capable device to render the relevant AR object “according

to its altered presence.” (Zyda, ¶264.) A POSA would have understood that

rendering an AR object “if [it] [has] been determined to be made present” is one way

of rendering an AR object “according to its altered presence.” (Id., ¶265.) Thus, Yu

renders obvious Claim 43[e] for the same reasons explained above for claim 1[g].

(Id.)

To the extent there are any meaningful differences between claim 43[e] and

claim 1[g], a POSA would have found that Yu teaches rendering AR objects “if they

have been determined to be made present.” (Zyda, ¶266.) Yu discloses that the user

device renders the virtual graffiti if it has been determined to be made present. (Yu,

Figs. 2-3, Abstract, ¶¶0024, 0027.)

C. Ground 2: Obviousness of Claims 1, 5-11, 15, 18, 22-23, 27-29, 34, 36, 38, and 43 Over Yu in View of Sanz-Pastor

As discussed for Ground 1, Yu renders obvious each and every limitation of

claims 1, 5-9, 18, 22-23, 27-29, 34, 36, 38, and 43, including the requirement of

independent claim 1 that the system “determine whether to alter presence of a relevant


AR object.” (See supra Section VI.B.1(g).)

To the extent Yu alone does not satisfy the “alter presence” limitation, it

would have been obvious to a POSA to combine Yu with Sanz-Pastor, which would

result in the system altering the presence of a relevant AR object. Specifically, in

the underlying litigation, NantWorks proposes construing “altered presence” as

“[t]he degree of at least a relevant AR object’s visual presentation based on a

spectrum.” (Ex. 1042 at 005.) NantWorks has not explained what is meant by “the

degree” of an object’s “visual presentation,” but the ’051 Patent states that “the

visual image of relevant AR object 446A and 446B might have shades of

transparency to indicate a suppressed presence.” (’051, 18:27-29.) To the extent

NantWorks argues that “the degree” of “visual presentation” requires “shades of transparency to indicate a suppressed presence,” this would have been obvious in light of the combination with Sanz-Pastor, as described in further detail below.

(Sanz-Pastor, ¶0029 (“Device 300 can also determine occlusion information for each message, and not present occluded messages or present them in an attenuated fashion by making them transparent when drawn using a different color coding.”).)

The combination of Yu with Sanz-Pastor additionally renders obvious

dependent claims 10, 11, and 15.

1. Independent Claim 1

The combination of Yu and Sanz-Pastor renders claim 1 obvious. Like Yu,


Sanz-Pastor teaches an AR platform system where virtual objects (referred to in Sanz-

Pastor as “virtual messages”) are presented atop real-world elements. (See, e.g.,

Sanz-Pastor, Title, Abstract, ¶¶0015, 0070.) It would have been obvious to a POSA to combine Yu with Sanz-Pastor to implement Yu’s AR platform system with the ability, taught by Sanz-Pastor, to use transparency to visually indicate occlusion of the virtual objects. (Zyda, ¶¶269-408.)

Like Yu, Sanz-Pastor’s disclosures closely track claim 1 of the ’051 Patent.

(See Zyda, ¶¶270-332.) In particular, implementing Yu’s AR platform system to visually indicate occlusions, as taught by Sanz-Pastor, implicates claim elements

1[c], 1[f], and 1[g]—the claim elements pertaining to, respectively, obtaining a virtual element attribute, determining whether to alter presence of an AR object based on a virtual element attribute, and rendering the AR object based on its altered presence.

(a) “obtain digital data representative of an environment of an AR capable mobile device, the digital data including a device location of the AR capable device and a virtual element attribute;” (1[c])

The combination of Yu and Sanz-Pastor renders claim 1[c] obvious. Like Yu,

Sanz-Pastor teaches that the AR platform system “obtain[s] digital data

representative of an environment of an AR capable mobile device, including a

device location of the AR capable device.” (E.g., Sanz-Pastor, ¶0054 (the server

“receives 702 an update request from the client device 300 that includes the device’s


current geographic location coordinates and view direction information, as well as content filtering settings.”); Zyda, ¶¶300-307.)

Sanz-Pastor further discloses that the digital data obtained includes a “virtual element attribute” to facilitate rendering occlusions. Sanz-Pastor teaches that the

server determines a “line of sight” between the device and a virtual object in order

“to simulate occlusion effects.” (Sanz-Pastor, Abstract.) This determination

includes obtaining data relating to “terrain elevation” and “three-dimensional

models of structures and vegetation specific to that location.” (Id., ¶¶0037, 0056,

Fig. 7.) A POSA would have understood that the data obtained in order to determine the line of sight is a “virtual element attribute.” (Zyda, ¶304.)

Rationale and Motivation to Combine Yu with Sanz-Pastor: It would have been obvious to combine Yu with Sanz-Pastor. The combination would have predictably resulted in an augmented reality (AR) platform system that carries out the steps recited in the Challenged Claims. (Zyda, ¶¶279-282.)

Yu and Sanz-Pastor are analogous references in the field of augmented reality and are directed to the same goal of enhancing the user experience on mobile augmented reality devices. (Yu, ¶¶0003, 0016, 0018-0019, 0028, 0032, 0048,

Abstract; Sanz-Pastor, ¶¶0006, 0009, 0013, 0016, 0045, 0051, 0077.) Further, Yu and Sanz-Pastor teach analogous methods for implementing an AR platform system,

both of which track the claims of the ’051 Patent. For example, Yu and Sanz-Pastor


both rely on GPS data to determine the position and orientation of the user device.

(Yu, ¶¶0015, 0030, 0045-0047, 0060; Sanz-Pastor, ¶¶0005, 0016, 0042, 0045, 0076,

Fig. 3.) Yu and Sanz-Pastor also each maintain a repository that stores virtual

objects for presentation at the user devices, and both contemplate virtual objects that

take the form of location-based messages left by users. (Yu, ¶0036; Sanz-Pastor,

¶¶0036, 0050.)

A POSA looking to implement Yu’s location-based augmented reality system

would have naturally looked to Sanz-Pastor for implementation details helpful to

modifying or occluding the appearance of virtual objects, and would have had every

expectation that applying Sanz-Pastor’s teachings to Yu would be successful.

Indeed, Yu specifically states that “it will be understood by those skilled in the art

that various changes in form and details may be made therein without departing from

the spirit and scope of the invention.” (Yu, ¶0069.) Sanz-Pastor provides such

additional “details” that conform with the spirit and scope of the invention in Yu—

namely, enhancing the user experience on mobile augmented reality devices by

modifying the presence of virtual objects. (Zyda, ¶282.)

(b) “determine whether to alter presence of a relevant AR object based on at least the device location and the virtual element attribute; and” (1[f])

Like Yu, Sanz-Pastor teaches that the AR platform system “determine[s] whether to alter presence of a relevant AR object based on at least the device


location and the virtual element attribute.” Sanz-Pastor specifically teaches using

the virtual element attribute data (such as terrain elevation) to assess a line of sight range and subsequently determine whether to make the AR object transparent (i.e., alter its presence).

Specifically, Sanz-Pastor uses the line of sight range (which, as explained above, is a function of virtual element attributes) to “determine[] whether the

[virtual] message is visible from the [user] device’s location.” (Sanz-Pastor, ¶0051.)

In turn, “[t]his can be used to occlude or modify the appearance of the message when displayed.” (Id.) A POSA would have understood that Sanz-Pastor discloses determining whether to alter the presence of a relevant virtual message through the

line of sight range determination based on at least the user device location and the

virtual element attribute. (Zyda, ¶¶320-322.)

Moreover, to the extent NantWorks argues that the term “alter[] presence”

requires “shades of transparency to indicate a suppressed presence,” a POSA would

have found claim 1[f] obvious over Yu in combination with Sanz-Pastor. (Zyda,

¶323.) For example, Sanz-Pastor discloses that the AR device “can also determine

occlusion information for each message, and not present occluded messages or

present them in an attenuated fashion by making them transparent when drawn or using a different color coding.” (Sanz-Pastor, ¶0029.)

A POSA would have found it natural to modify Yu with the teachings of Sanz-


Pastor to display an AR object with “shades of transparency to indicate a suppressed

presence.” (Zyda, ¶324.) For example, Yu discloses displaying a virtual object (a

scroll saying, “try the chili”) on a real-world door, where the virtual shadow cast by

the scroll “changes with the time of day.” (Yu, Fig. 3; see also id., ¶0027.) A POSA

would have readily recognized the benefits of allowing the user to view items already on

the door, such as hours of operation or daily specials. A skilled artisan would thus

have turned to Sanz-Pastor for implementation details regarding how to “present [Yu’s

scroll] in an attenuated fashion by making [it] transparent[,]” thereby revealing any

relevant information on the real-world door. (Sanz-Pastor, ¶0029.)

(c) “cause the AR capable device to render the relevant AR object according to its altered presence.” (1[g])

Like Yu, Sanz-Pastor teaches that the AR platform system “cause[s] the AR

capable device to render the relevant AR object according to its altered

presence.” Sanz-Pastor discloses that the mobile device renders the virtual

messages according to their modified presence, determined based on the device’s

line of sight. (Sanz-Pastor, ¶0051 (describing “occlud[ing] or modify[ing] the

appearance of the message when displayed”); id., ¶¶0044-0045, 0070, 0072.) A POSA

therefore would have understood that Sanz-Pastor discloses rendering the relevant

virtual message according to its altered presence. (Zyda, ¶¶327-330.)

Moreover, to the extent NantWorks argues that the term “alter[] presence”

requires “shades of transparency to indicate a suppressed presence,” a POSA would


have found claim 1[g] obvious over Yu in combination with Sanz-Pastor. (Zyda,

¶331.) For example, Sanz-Pastor discloses that the AR device can present the virtual

messages “in an attenuated fashion by making them transparent when drawn using

a different color coding.” (Sanz-Pastor, ¶0029; see also id., Fig. 7.) As noted above, a person of ordinary skill would have found this combination natural because Yu already discloses displaying AR objects on a real-world door, and making those AR objects transparent using Sanz-Pastor would have allowed users to view relevant information on the door, such as hours of operation or daily specials. (Zyda, ¶331.)

2. Claim 10: “The system of claim 1, wherein the relevant AR object is caused to be rendered within an overlapping augmented reality among multiple AR capable devices.”

Sanz-Pastor discloses an AR system that “simplifies team collaboration on

mobile devices, by allowing users to access and create geographically tagged

information.” (Sanz-Pastor, ¶0016; see also id., ¶0065.) Specifically, Sanz-Pastor

discloses “combining the observations of multiple team members in real time.” (Id.,

¶0072.) A POSA would have understood that Sanz-Pastor therefore discloses rendering virtual messages within an “overlapping” augmented reality among multiple team members, as described by the ’051 Patent. (See ’051, 20:64-21:1;

Zyda, ¶¶354-356.)

The motivation to combine Sanz-Pastor with Yu is the same as for claim 1.

Further, a POSA would have found it obvious to modify Yu with the teachings of


Sanz-Pastor to render an AR object (virtual graffiti) within an overlapping

augmented reality among multiple AR capable devices. For example, one using Yu

to create virtual graffiti stating, “try the chili,” would have naturally sought a way to

share that message with multiple people at the same time, and would have turned to

Sanz-Pastor for implementation details, since Sanz-Pastor expressly contemplates

sharing messages among multiple team members. (Zyda, ¶356.)

3. Claim 11: “The system of claim 10, wherein the overlapping augmented reality comprises a team-based augmented reality.”

Sanz-Pastor discloses an AR system that “simplifies team collaboration on mobile devices, by allowing users to access and create geographically tagged

information.” (Sanz-Pastor, ¶0016; see also id., ¶¶0014, 0072.) Thus, a POSA would

have found it obvious to modify Yu with the teachings of Sanz-Pastor, resulting in

“a team-based augmented reality.” (Zyda, ¶360.) The motivation to combine

Sanz-Pastor with Yu is the same as for claims 1 and 10.

4. Claim 15: “The system of claim 1, wherein the relevant AR object is caused to be rendered according to a haptic format.”

A person of ordinary skill in the art would have understood that “haptic

format” refers to a format relating to human touch, such as a format created by

clicking a button on a user interface. (E.g., ’051, 9:48-51; Zyda, ¶364.) Sanz-Pastor

discloses that a user can click a button on the user interface to expand the virtual

message to view its full contents overlaid on the camera input. (Sanz-Pastor, ¶0046


(“In one embodiment, by centering a message on the crosshair at the center at the screen and clicking button 314 in the user interface 304, the user can expand it and see the message’s full contents overlaid on the camera input.”); see also id.,

¶0042 (“Device 300 is a computing device with a graphics-capable screen 302 and

a user interface 304, and preferably includes at least one button 314[.]”); id., ¶0030

(“At any time, a user can expand a particular message and see its contents in more detail overlaid in front of the scene, for example by centering the desired message icon on the screen 302 by pointing the camera 306 at it, and pushing a button 314 while the message is contained inside a selection target box located at the center of

the screen 302.”).) Because clicking a button is performed by the user’s touch, and

a POSA would have understood that “haptics” relates to human touch, Sanz-Pastor’s descriptions regarding button 314 disclose rendering the virtual messages according to a “haptic format.” (Zyda, ¶367.) The motivation to combine Sanz-Pastor with

Yu is the same as for claims 1 and 10.

5. Dependent Claims 5-9, 18, 22-23, 27-29, 34, 36, and 38, and Independent Claim 43

The additional challenged dependent claims all depend, directly or indirectly, from claim 1. The additional limitations of the dependent claims are satisfied by or

rendered obvious in light of Yu, and thus are likewise rendered obvious by the combination of Yu and Sanz-Pastor. (Zyda, ¶¶333-351, 369-395.) Likewise, independent claim 43, which as explained for Ground 1 is nearly identical to claim


1, is also rendered obvious in light of the combination of Yu and Sanz-Pastor. (Zyda,

¶¶396-408.) The motivation to combine Sanz-Pastor with Yu for these dependent

claims and independent claim 43 is the same as for claim 1 above.

D. Ground 3: Obviousness of Claims 1, 5-11, 15, 18, 22-29, 34-36, 38, 43 Over Yu in View of Sanz-Pastor, in Further View of Mullen

Ground 3 adds Mullen (Ex. 1005) with respect to dependent claims 8, 24-26,

and 35. The remaining claims are obvious over Yu in light of Sanz-Pastor as explained

for Grounds 1 and 2.

1. Claim 8: “The system of claim 1, wherein the relevant AR object is caused to be rendered within an AR game.”

Mullen discloses augmented reality systems providing “location based

games” where “a user’s physical location correlates to the virtual location of a virtual

character on a virtual playfield.” (Mullen, Abstract, ¶0006.) Mullen further

discloses rendering “[v]irtual objects” within the “AR game system.” (Id., ¶0022

(“Thus, the video game system may construct a virtual world similar to the scanned

physical world and generate computer-controlled characters (and interactive or

impenetrable objects) on the physical world.”).) A POSA would have found it

obvious to combine Yu and Sanz-Pastor with Mullen to use Mullen’s game objects

as the AR objects in order to allow users of the AR platform system to play games.

(Zyda, ¶¶423-426.)


Rationale and Motivation to Combine Yu and Sanz-Pastor with Mullen:

It would have been obvious to combine Yu and Sanz-Pastor with Mullen. The

combination would have predictably resulted in the AR platform system of Yu and

Sanz-Pastor having functionality to render virtual graffiti within an AR game. (See,

e.g., Yu, ¶0069; Mullen, Abstract.) The motivation to combine Yu and Sanz-Pastor is the same as described for Ground 2. (Zyda, ¶¶279-282.)

Yu and Mullen, like Sanz-Pastor, are analogous references in the field of

augmented reality. (Yu, Abstract; Mullen, ¶0006; see also Sanz-Pastor, ¶0016.)

Mullen, like Yu, discloses a mobile augmented reality system that modifies virtual

objects based on the user device’s position, as obtained from GPS data. (Mullen,

¶¶0006, 0011-0012; Yu, ¶0015.)

A POSA looking to implement Yu’s location-based augmented reality

systems would have naturally looked to Mullen for implementation details helpful

to altering the appearance of virtual objects within an AR game, and would have had

every expectation that applying Mullen’s teachings to Yu would be successful.

Indeed, Yu specifically states that “it will be understood by those skilled in the art

that various changes in form and details may be made therein without departing from

the spirit and scope of the invention.” (Yu, ¶0069.) Mullen provides additional

details that conform with the spirit, scope, and principles of the invention in Yu. For

example, as previously noted, Yu already discloses various social settings and


interactions among friends using virtual graffiti on mobile phones. (Yu, ¶¶0003,

0017, 0024.) As another example, Yu expressly teaches that the virtual object can

“represent[] an avatar” (Yu, ¶0039), and avatars were known to be common fixtures of

mobile games by April 2011. (Zyda, ¶429.) A POSA therefore would have naturally turned to Mullen for implementation details regarding how to use Yu’s AR platform system in an AR game. (Id.)

This motivation is likewise compatible with Sanz-Pastor’s virtual messages in an AR system. (Sanz-Pastor, ¶¶0014, 0016, 0026.) It would have been obvious to a POSA to apply the teachings of Mullen to adapt the AR platform systems of Yu alone, and of Yu in combination with Sanz-Pastor, into an AR game where friends can interact in a social and/or collaborative setting. (Zyda, ¶429.)

2. Claim 24: “The system of claim 1, wherein the AR server is further configured to enable the AR capable device to conduct a commercial transaction with a commerce engine.”

Mullen discloses that the location-based AR game can include the exchange of virtual currency to buy or sell items. (Mullen, ¶0056 (“A location-based game

(e.g., an AR game) may have its own virtual currency system such that any user can exchange real money for the virtual currency system, with the game manufacturer taking a percentage cut, and the virtual currency can be used to buy/sell items.”).)

A POSA would have understood that Mullen’s AR game where users can

“buy/sell items” renders obvious the limitation of “conduct[ing] a commercial


transaction.” A POSA would have further understood and found obvious that the

claimed “commerce engine” is the hardware component by which Mullen’s AR

game enables users to conduct commercial transactions. The motivation to combine

Mullen with Yu and Sanz-Pastor is the same as for claim 8. Additionally, a POSA

looking to conduct a commercial transaction in the AR system of Yu would have

naturally turned to Mullen for implementation details. (Zyda, ¶441; see also id.,

¶¶60-63.) This would have been a natural extension of Yu’s AR system given Yu’s

description of embodiments for use within the setting of a restaurant, where it is

desirable to facilitate commercial transactions. (Yu, ¶¶0017, 0024, 0027; Zyda,

¶442.)

3. Claim 25: “The system of claim 24, wherein the commercial transaction includes an exchange of virtual currency.”

As described for claim 24, Mullen discloses conducting a “commercial

transaction.” (See supra Section VI.D.2.) Mullen further discloses that the location-based AR game can include the exchange of “virtual currency” to buy or sell items. (Mullen, ¶0056 (“A location-based game (e.g., an AR game) may have its own virtual currency system such that any user can exchange real money for the virtual currency system, with the game manufacturer taking a percentage cut, and the virtual currency can be used to buy/sell items.”).) (Zyda, ¶¶444-448.) The motivation to combine Mullen with Yu and Sanz-Pastor is the same as for claims 8 and 24.


4. Claim 26: “The system of claim 24, wherein the commercial transaction includes a transfer of real-world funds.”

As described for claim 24, Mullen discloses conducting a “commercial

transaction.” (See supra Section VI.D.2.) Mullen discloses that, as part of the

virtual currency system, a user is given the option to “convert the virtual currency

back into real currency (e.g., the U.S. dollar)[.]” (Mullen, ¶0056.) A POSA would have appreciated that this “real currency” conversion is a transfer of real-world funds. (Zyda, ¶¶449-453.) The motivation to combine Mullen with Yu and Sanz-

Pastor is the same as for claims 8 and 24.

5. Dependent claim 35

(a) “The system of claim 34, wherein the interaction includes at least one of the following:” (Claim 35 preamble)

To the extent it is limiting, the preamble of claim 35 would have been obvious over Yu alone or in combination with Sanz-Pastor, as described for Grounds 1 and

2.

(b) “executing instructions of the rendered relevant AR object,” (Claim 35[a])

As explained for claim 1[b] in Ground 1, Yu teaches the execution of software

instructions in the AR platform system, such that a POSA would understand that Yu

satisfies executing instructions of the rendered relevant AR object. (Zyda, ¶461.)


(c) “presenting AR content via at least one of a speaker or tactile interface depending on nature of AR content,” (Claim 35[b])

Yu discloses that the virtual graffiti can include audio. (Yu, ¶¶0023, 0025,

0039; see also Sanz-Pastor, ¶0032.) Additionally, Mullen teaches that the AR game includes audio content. (Mullen, ¶0020 (stating that an add-on module can “select[] images (and audio) from the game system to display on the AR display, and determine how to display the selected images (and audio) on the AR display (and

AR sound system)”).) A POSA would have understood that the audio content is presented via a speaker. (Zyda, ¶463.)

Furthermore, Sanz-Pastor teaches that the virtual message is presented by clicking a button, which as explained for Ground 2 is a tactile interface. (Sanz-

Pastor, ¶0046 (“In one embodiment, by centering a message on the crosshair at the center at the screen and clicking button 314 in the user interface 304, the user can expand it and see the message’s full contents overlaid on the camera input.”).)

Additionally, Mullen teaches the use of “control buttons” in the AR game. (Mullen,

¶0047.) A person of ordinary skill in the art would have understood that Sanz-Pastor and Mullen teach presenting AR content via a tactile interface. The motivation to combine Mullen with Yu and Sanz-Pastor is the same as for claims 8 and 24.


(d) “sharing an AR object with other devices,” (Claim 35[c])

Claim 35[c] is satisfied by Sanz-Pastor. Sanz-Pastor discloses a “location-based wireless collaborative environment with a visual user interface.” (Sanz-

Pastor, Title (capitalization removed).) The AR system of Sanz-Pastor “enabl[es] teams to share information while on location with the same flexibility and immediacy that e-mail and instant messaging might have brought to fixed location, office-based teams.” (Id., ¶0014.) Sanz-Pastor teaches that the same virtual message is displayed to multiple devices at the same location. (Id., ¶0072.) Sanz-Pastor further describes a “peer-to-peer network” where “clients share all the messages that they have in common, combining their local message databases.” (Id., ¶0053.)

(e) “presenting a purchasable product,” (Claim 35[d])

Claim 35[d] is satisfied by Mullen. As previously noted, Mullen discloses that the location-based AR game can include the exchange of virtual currency to buy or sell items. (Mullen, ¶0056 (“[T]he virtual currency can be used to buy/sell items”).) A POSA would have understood that Mullen’s AR game where users can

“buy/sell items” renders obvious the limitation of “presenting a purchasable product.” (Zyda, ¶472.) The motivation to combine Mullen with Yu and Sanz-

Pastor is the same as for claims 8 and 24.


(f) “incorporating a shopping context,” (Claim 35[e])

As previously noted, Mullen discloses that the location-based AR game can

include the exchange of virtual currency to buy or sell items. (Mullen, ¶0056 (“[T]he

virtual currency can be used to buy/sell items”).) A POSA would have understood

that Mullen’s AR game where users can “buy/sell items” renders obvious the

limitation of “incorporating a shopping context.” (Zyda, ¶476.) The motivation to

combine Mullen with Yu and Sanz-Pastor is the same as for claims 8 and 24.

(g) “allowing access to data records,” (Claim 35[f])

Yu discloses that the virtual graffiti can include “URLs to 3D models and

content (e.g., photos, music files, etc.).” (Yu, ¶0054.) Similarly, Sanz-Pastor

discloses that “geographically tagged messages can contain, for example, text, audio,

video, pictures or a hyperlinked URL to additional content, in addition to their coordinates and time and date information.” (Sanz-Pastor, ¶0030.) A POSA would

have understood that this functionality “allow[s] access to data records,” such as

the photos and music files. (Zyda, ¶480.) The motivation to combine Yu and Sanz-

Pastor is the same as for claims 1 and 10 in Ground 2.

(h) “allowing a user to link to object information via an AR object,” (Claim 35[g])

As noted, Yu discloses that the virtual graffiti can include “URLs to 3D models and content (e.g., photos, music files, etc.).” (Yu, ¶0054.) Similarly, as noted, Sanz-Pastor discloses that “geographically tagged messages can contain, for


example, text, audio, video, pictures or a hyperlinked URL to additional content, in addition to their coordinates and time and date information.” (Sanz-Pastor,

¶0030.) A POSA would have understood that this functionality “allow[s] a user to link to object information via an AR object.” (Zyda, ¶484.) The motivation to combine Yu and Sanz-Pastor is the same as for claims 1 and 10 in Ground 2.

(i) “interacting with a remote computing device, and” (Claim 35[h])

Yu discloses that the user device interacts with a remote computing device.

For example, Yu discloses that the user device can interact with “URLs to 3D models and content” (Yu, ¶0054), which a person of ordinary skill in the art would have understood to be located at a remote computing device. Similarly, Sanz-Pastor discloses that the client device interacts with a remote computing device. For example, Sanz-Pastor discloses that the client device can interact with hyperlinked content (Sanz-Pastor, ¶0030), which a person of ordinary skill in the art would have understood to be located at a remote computing device. (Zyda, ¶488.) The motivation to combine Yu and Sanz-Pastor is the same as for claims 1 and 10 in

Ground 2.

(j) “managing AR objects.” (Claim 35[i])

Yu discloses that the user device manages the virtual graffiti by creating the

graffiti, viewing the graffiti, and/or interacting with a URL included in the graffiti.

(Yu, ¶¶0024, 0053; see also id., Fig. 2.) Similarly, Sanz-Pastor discloses that the client device manages the virtual messages by creating the messages, viewing the messages, and/or interacting with a URL included in the messages. (Sanz-Pastor,

¶¶0015, 0016, 0029, 0030.) A POSA would have understood that Yu and Sanz-

Pastor each teach “managing AR objects” as described by the ’051 Patent. (Zyda,

¶492.) The motivation to combine Yu and Sanz-Pastor is the same as for claims 1 and 10 in Ground 2.10

VII. CONCLUSION

Petitioner respectfully requests institution of review on the challenged claims.

Dated: June 15, 2021 Respectfully submitted,

COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Avenue NW
Suite 700
Washington, DC 20004
Tel: (650) 843-5001
Fax: (650) 849-7400

By: / Heidi L. Keefe /
Heidi L. Keefe
Reg. No. 40,673
Counsel for Petitioner

10 As explained by Dr. Zyda, no secondary considerations of non-obviousness exist.

(Zyda, ¶¶498-410.)


CERTIFICATE OF COMPLIANCE WITH WORD COUNT

Pursuant to 37 C.F.R. § 42.24(d), I certify that this petition complies with the type-volume limits of 37 C.F.R. § 42.24(a)(1)(i) because it contains 12,494 words, according to the word-processing system used to prepare this petition, excluding the parts of the petition exempted by 37 C.F.R. § 42.24(a) (including the table of contents, table of authorities, mandatory notices, certificate of service, this certificate of word count, appendix of exhibits, and claim listings).

DATED: June 15, 2021

COOLEY LLP
ATTN: Patent Docketing
1299 Pennsylvania Avenue NW
Suite 700
Washington, D.C. 20004
Tel: (650) 843-5001
Fax: (650) 849-7400

/ Heidi L. Keefe /
Heidi L. Keefe
Reg. No. 40,673


CERTIFICATE OF SERVICE

I hereby certify, pursuant to 37 C.F.R. §§ 42.6 and 42.105, that a complete copy of the attached PETITION FOR INTER PARTES REVIEW OF U.S. PATENT NO. 10,403,051 B2, including all exhibits (Nos. 1001-1005, 1008-1024, 1026-1033, 1038-1039, 1042-1047) and related documents, is being served via Federal Express on the 15th day of June, 2021, the same day as the filing of the above-identified document in the United States Patent and Trademark Office/Patent Trial and Appeal Board, upon Patent Owner at the correspondence address of record with the USPTO as follows:

Mauriel Kapouytian Woods LLP
15 W. 26th Street, Floor 7
New York, NY 10010

And, via Federal Express upon counsel of record for both Patent Owner Nant Holdings IP, LLC and NantWorks, LLC in the litigation pending before the U.S. District Court for the Northern District of California entitled NantWorks, LLC, and Nant Holdings IP, LLC v. Niantic, Inc., Case No. 3:20-cv-06262-LB as follows:

Matthew K. Blackburn
Evan E. Boetticher
Diamond McCarthy LLP
150 California Street, Suite 2200
San Francisco, CA 94111

Allan B. Diamond
John B. Sample, IV
Diamond McCarthy LLP
Two Houston Center
909 Fannin Street, 37th Floor
Houston, TX 77010

DATED: June 15, 2021

/ Heidi L. Keefe /
Heidi L. Keefe
Reg. No. 40,673
COOLEY LLP
