US 9,164,506 B1

(12) United States Patent — Zang
(10) Patent No.: US 9,164,506 B1
(45) Date of Patent: Oct. 20, 2015

(54) SYSTEMS AND METHODS FOR TARGET TRACKING

(71) Applicant: SZ DJI TECHNOLOGY Co., Ltd, Shenzhen (CN)

(72) Inventor: Bo Zang, Shenzhen (CN)

(73) Assignee: SZ DJI TECHNOLOGY CO., LTD, Shenzhen (CN)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 14/471,954

(22) Filed: Aug. 28, 2014

Related U.S. Application Data
(63) Continuation of application No. PCT/CN2014/083315, filed on Jul. 30, 2014.

(51) Int. Cl.: G05D 1/12 (2006.01); G05D 1/00 (2006.01)

(52) U.S. Cl.: CPC G05D 1/0038 (2013.01); G05D 1/0094 (2013.01); G05D 1/12 (2013.01)

(58) Field of Classification Search: CPC G05D 1/0094; G05D 1/12. USPC 701/2, 11; 348/113, 114, 144, 169, 170, 171, 172, 211.99. See application file for complete search history.

(56) References Cited

U.S. PATENT DOCUMENTS
7,970,507 B2* 6/2011 Fregene et al. ............ 701/23
2008/0054158 A1* 3/2008 Ariyur et al. ............ 250/203.1
2009/0157233 A1* 6/2009 Kokkeby et al. ............ 701/3
2009/0187299 A1* 7/2009 Fregene et al. ............ 701/23
2010/0004802 A1 1/2010 Bodin et al.
2010/0250022 A1* 9/2010 Hines et al. ............ 701/2
2011/0141287 A1* 6/2011 Dunkel et al. ............ 348/169
2011/0304737 A1* 12/2011 Evans et al. ............ 348/169
2012/0143808 A1* 6/2012 Karins et al. ............ 706/46
2012/0154579 A1* 6/2012 Hampapur et al. ............ 348/143
2012/0200703 A1 8/2012 Nadir et al.
2012/0287274 A1* 11/2012 Bevirt ............ 348/144
2012/0307042 A1* 12/2012 Lee et al. ............ 348/114
2013/0085643 A1* 4/2013 Mathews ............ 701/49
2013/0176423 A1* 7/2013 Rischmuller et al. ............ 348/114
2014/0049643 A1* 2/2014 Segerstrom et al. ............ 348/144
* cited by examiner

FOREIGN PATENT DOCUMENTS
CN 102809969 A 12/2012
CN 103149939 A 6/2013
WO WO 2010/089738 A2 8/2010

OTHER PUBLICATIONS
Rafi, et al. Autonomous target following by unmanned aerial vehicles. In Proceedings of the SPIE, May 2006.
International search report and written opinion dated May 6, 2015 for PCT/CN2014/083315.

Primary Examiner — Thomas G Black
Assistant Examiner — Peter D Nolan
(74) Attorney, Agent, or Firm — Wilson Sonsini Goodrich & Rosati

(57) ABSTRACT

The present invention provides systems, methods, and devices related to target tracking by UAVs. The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain a predetermined position and/or size of the target within one or more images captured by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information.

30 Claims, 18 Drawing Sheets

[Drawing sheets 1-18 are not reproduced here; the text recoverable from each figure is summarized below.]

FIG. 1 (Sheet 1): exemplary target tracking system 100.
FIG. 2 (Sheet 2): exemplary flow of image-related data among components in a tracking system 200 (elements 204, 208, 210).
FIG. 3 (Sheet 3): exemplary flow of control data among components in a tracking system 300.
FIG. 4 (Sheet 4): exemplary target tracking process 400, with steps: obtain target information; identify target based on target information; detect deviation of target from predetermined position and/or size; generate commands for UAV, carrier and/or imaging device to substantially correct the deviation (408).
FIG. 5 (Sheet 5): exemplary configuration of a movable object, carrier, and payload.
FIG. 6 (Sheet 6): exemplary tracking method for maintaining an expected position of a target; labeled image point P0 (u0, v0) (elements 604, 606).
FIG. 7 (Sheet 7): exemplary tracking method for maintaining an expected size of a target 700 (elements 708, 712).
FIG. 8 (Sheet 8): exemplary tracking process 800, with steps: receive user navigation commands and target information; control movable object according to navigation commands; adjust movable object, carrier and/or imaging device to track target according to target information.
FIG. 9 (Sheet 9): exemplary process 900 for controlling a movable object to navigate and track, with steps: receive tracking input; generate navigation commands based on navigation input; generate target information based on tracking input; provide navigation commands and target information (910).
FIG. 10 (Sheet 10): exemplary process 1000 for selecting a target, with steps: display images captured by movable object; receive user selection of a target; generate target information based on user selection; provide target information to the movable object.
FIG. 11 (Sheet 11): exemplary process 1100 for viewing a tracked target, with steps: receive images captured by UAV; receive tracking data; display images with tracking data.
FIG. 12 (Sheet 12): exemplary control terminal 1200 (elements 1202, 1204, 1205, 1206, 1208, 1210).
FIGS. 13A-C (Sheet 13): exemplary methods for selecting a target using a user interface (element 1304).
FIG. 14 (Sheet 14): a UAV.
FIG. 15 (Sheet 15): movable object 1500 including a carrier and a payload (elements 1506, 1512).
FIG. 16 (Sheet 16): exemplary system 1600 for tracking a movable object.
FIG. 17 (Sheet 17): exemplary system 1700 for controlling a movable object.
FIG. 18 (Sheet 18): exemplary use case for the present invention.
SYSTEMS AND METHODS FOR TARGET TRACKING

CROSS-REFERENCE

This application is a continuation application of International Application No. PCT/CN2014/083315, filed on Jul. 30, 2014, the content of which is hereby incorporated by reference in its entirety.

BACKGROUND OF THE INVENTION

Aerial vehicles such as unmanned aerial vehicles (UAVs) can be used for performing surveillance, reconnaissance, and exploration tasks for military and civilian applications. Such aerial vehicles may carry a payload configured to perform a specific function such as capturing images of the surrounding environment.

In some instances, it may be desirable for aerial vehicles to track a specific target. For small-sized aerial vehicles, such tracking is traditionally achieved via control commands from a user-operated remote control terminal or device. Such manual tracking control may become difficult in certain circumstances, such as when the movable object or target is moving quickly or when the movable object is at least partially blocked from the view of the user. Furthermore, the attention necessary for such manual tracking typically requires a dedicated user that controls a camera onboard the aerial vehicle separate from a user that controls the navigation of the aerial vehicle, thereby increasing the cost for aerial photography and other applications of the aerial vehicles.

SUMMARY OF THE INVENTION

In some instances, it may be desirable for aerial vehicles to track a specific target. Thus, a need exists for improved UAV tracking methods and systems that provide automatic or semi-automatic tracking of a target, thereby relieving operators of the aerial vehicles of manually tracking the targets.

The present invention provides systems, methods, and devices related to target tracking by UAVs. The UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically track the target so as to maintain a predetermined position and/or size of the target within one or more images captured by the imaging device. Any description of tracking may include visual tracking by the imaging device. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information.

According to an aspect of the present invention, a method for controlling an unmanned aerial vehicle (UAV) is provided. The method comprises: receiving, from a remote user, one or more navigation commands to move the UAV along a flight path; receiving, from the remote user, target information of a target to be tracked by an imaging device on the UAV; and tracking the target according to the target information by automatically adjusting at least one of the UAV or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user.

According to another aspect of the present invention, an unmanned aerial vehicle (UAV) with tracking capabilities is provided. The UAV comprises: one or more receivers, individually or collectively, configured to receive from a remote user (1) one or more navigation commands to move the UAV along a flight path, and (2) target information of a target to be tracked by an imaging device on the UAV; and one or more processors, individually or collectively, configured to track the target according to the target information by automatically adjusting at least one of the UAV or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user.

According to another aspect of the present invention, a system for controlling an unmanned aerial vehicle (UAV) is provided. The system comprises: one or more receivers, individually or collectively, configured to receive from a remote user (1) one or more navigation commands to move the UAV along a flight path, and (2) target information of a target to be tracked by an imaging device on the UAV; and one or more processors, individually or collectively, configured to track the target according to the target information by automatically adjusting at least one of the UAV or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user.

In some embodiments, the imaging device includes a camera or a camcorder.

In some embodiments, the one or more navigation commands are adapted to control a speed, position, or attitude of the UAV.

In some embodiments, the target is substantially stationary relative to a reference object.

In some embodiments, the target is moving relative to a reference object.

In some embodiments, the target information includes initial target information.

In some embodiments, the initial target information includes an initial position or an initial size of the target within an image captured by the imaging device.

In some embodiments, the target information includes target type information.

In some embodiments, tracking the target according to the target information further includes identifying, based on the target type information, the target to track from within one or more images captured by the imaging device using an image recognition algorithm.

In some embodiments, the target type information includes color, texture, or pattern information.

In some embodiments, the target information includes expected target information.

In some embodiments, the expected target information includes an expected position or an expected size of the target within an image captured by the imaging device.

In some embodiments, the expected size of the target is the same as an initial size of the target.

In some embodiments, the expected position of the target is the same as an initial position of the target.

In some embodiments, tracking the target according to the target information includes maintaining, within a predetermined degree of tolerance, the expected position or the expected size of the target within one or more images captured by the imaging device.

In some embodiments, the imaging device is coupled to the UAV via a carrier configured to permit the imaging device to move relative to the UAV.

In some embodiments, the carrier is configured to permit the imaging device to rotate around at least two axes relative to the UAV.

In some embodiments, tracking the target according to the target information includes automatically adjusting at least one of the UAV, the carrier, or the imaging device while the UAV moves along the flight path according to the one or more navigation commands from the remote user.
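Purely as a non-limiting illustration of the target information described above, the following Python sketch shows one way such information might be represented in software. The structure, field names, and the use of normalized image coordinates are assumptions made for this example only; they are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TargetInfo:
    """Hypothetical container for the target information a control
    terminal might transmit to the UAV (all names are illustrative)."""
    # Initial target information: where the target appears in the image
    # when selected, in normalized image coordinates (0..1).
    initial_position: Optional[Tuple[float, float]] = None  # (u, v)
    initial_size: Optional[float] = None  # fraction of image area covered

    # Expected target information: where the target should be kept.
    expected_position: Optional[Tuple[float, float]] = None
    expected_size: Optional[float] = None

    # Target type information for recognition-based identification.
    color: Optional[str] = None
    texture: Optional[str] = None
    pattern: Optional[str] = None

    def resolved_expected(self):
        """Fall back to the initial position/size when the user did not
        specify expected values, mirroring the embodiments in which the
        expected values equal the initial values."""
        return (self.expected_position or self.initial_position,
                self.expected_size or self.initial_size)
```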
In some embodiments, the target information includes expected target information and tracking the target according to the target information comprises: determining current target information of the target based on one or more images captured by the imaging device; detecting a deviation of the current target information from the expected target information; and calculating an adjustment to the UAV, the carrier, or the imaging device so as to substantially correct the deviation.

In some embodiments, the deviation is related to a change in position of the target and the calculated adjustment is related to an angular velocity for the UAV.

In some embodiments, the angular velocity is relative to a yaw axis of the UAV.

In some embodiments, the angular velocity is relative to a pitch axis of the UAV.

In some embodiments, the deviation is related to a change in position of the target and the calculated adjustment is related to an angular velocity for the imaging device relative to the UAV.

In some embodiments, the calculated adjustment is used to generate control signals for the carrier so as to cause the imaging device to move relative to the UAV.

In some embodiments, the deviation is related to a change in size of the target and the adjustment is related to a linear velocity for the UAV.

In some embodiments, the deviation is related to a change in size of the target and the adjustment is related to one or more parameters of the imaging device.

In some embodiments, the one or more parameters of the imaging device include focal length, zoom, or focus.

In some embodiments, the calculated adjustment is limited to a predetermined range.

In some embodiments, the predetermined range corresponds to a predetermined range of control lever amount of a control system.

In some embodiments, the control system includes a flight control system for the UAV or a control system for the carrier.

In some embodiments, a warning signal is provided if the calculated adjustment falls outside the predetermined range.

In some embodiments, tracking the target comprises comparing the calculated adjustment to a predetermined maximum threshold value and providing the predetermined maximum threshold value if the calculated adjustment exceeds the predetermined maximum threshold value.

In some embodiments, the predetermined maximum threshold value includes a maximum angular velocity or a maximum linear velocity for the UAV or the imaging device.

In some embodiments, tracking the target comprises comparing the calculated adjustment to a predetermined minimum threshold value and providing the predetermined minimum threshold value if the calculated adjustment is less than the predetermined minimum threshold value.

In some embodiments, the predetermined minimum threshold value includes a minimum angular velocity or a minimum linear velocity for the UAV or the imaging device.

In some embodiments, the target information is received from a remote control device accessible to the remote user.

In some embodiments, the one or more navigation commands are received from the same remote control device.

In some embodiments, the one or more navigation commands are received from a different remote control device.

In some embodiments, the remote control device is configured to receive user input from a touchscreen, joystick, keyboard, mouse, or stylus.

In some embodiments, the remote control device is configured to receive user input from a wearable device.

In some embodiments, the remote control device is configured to: receive one or more images captured by the imaging device from the UAV; display the one or more images; receive a user selection of a target from within a displayed image; generate the target information of the target based on the user selection of the target; and transmit the target information to the UAV.

In some embodiments, the remote control device is further configured to generate the one or more navigation commands based on user input and to transmit the one or more navigation commands to the UAV.

In some embodiments, the remote control device is further configured to receive tracking information related to the target and to display the one or more images with the tracking information.

According to an aspect of the present invention, an unmanned aerial vehicle (UAV) with tracking capabilities is provided. The UAV comprises: one or more receivers, individually or collectively, configured to receive, from a remote user, user-specified target information of a target to be tracked by an imaging device on the UAV, the user-specified target information including a predetermined position or a predetermined size of the target within an image captured by the imaging device, the imaging device coupled to the UAV via a carrier configured to permit the imaging device to move relative to the UAV; and one or more processors, individually or collectively, configured to: detect a deviation from the predetermined position or the predetermined size of the target based on one or more images captured by the imaging device; and generate commands to automatically adjust the UAV, the carrier, or the imaging device so as to substantially correct the detected deviation from the predetermined position or the predetermined size of the target.

According to another aspect of the present invention, a system for controlling an unmanned aerial vehicle (UAV) is provided. The system comprises: one or more receivers, individually or collectively, configured to receive, from a remote user, user-specified target information of a target to be tracked by an imaging device on the UAV, the user-specified target information including a predetermined position or a predetermined size of the target within an image captured by the imaging device, the imaging device coupled to the UAV via a carrier configured to permit the imaging device to move relative to the UAV; and one or more processors, individually or collectively, configured to: detect a deviation from the predetermined position or the predetermined size of the target based on one or more images captured by the imaging device; and generate commands to automatically adjust the UAV, the carrier, or the imaging device so as to substantially correct the detected deviation from the predetermined position or the predetermined size of the target.

According to another aspect of the present invention, a method for controlling an unmanned aerial vehicle (UAV) is provided. The method comprises: receiving, from a remote user, user-specified target information of a target to be tracked by an imaging device on the UAV, the user-specified target information including a predetermined position or predetermined size of the target within an image captured by the imaging device, the imaging device coupled to the UAV via a carrier configured to permit the imaging device to move relative to the UAV; detecting, by a processor onboard the UAV, a deviation from the predetermined position or the predetermined size of the target based on one or more images captured by the imaging device; and automatically adjusting the UAV, the carrier, or the imaging device so as to substantially correct the detected deviation from the predetermined position or the predetermined size of the target.
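The mapping from a detected deviation to a calculated adjustment, together with the threshold handling recited above, can be sketched as follows. This is a minimal illustration assuming proportional control, normalized image coordinates, and symmetric thresholds; none of these choices is prescribed by the disclosure.

```python
# Hypothetical gains; real values depend on the aircraft, carrier,
# and imaging device actually used.
YAW_GAIN, PITCH_GAIN, SIZE_GAIN = 1.0, 1.0, 1.0

def compute_adjustment(current_pos, current_size, expected_pos,
                       expected_size, max_rate=1.0, min_rate=0.01):
    """Map a detected deviation to candidate adjustments.

    current_pos / expected_pos: (u, v) normalized image coordinates.
    current_size / expected_size: fraction of the image area covered.
    Returns yaw/pitch angular rates (for the UAV or the carrier) and a
    linear/zoom rate, each limited to a predetermined range."""
    # Positional deviation -> angular velocity around the yaw axis
    # (horizontal error) and the pitch axis (vertical error).
    yaw_rate = YAW_GAIN * (current_pos[0] - expected_pos[0])
    pitch_rate = PITCH_GAIN * (current_pos[1] - expected_pos[1])

    # Size deviation -> linear velocity toward/away from the target,
    # or equivalently a zoom adjustment of the imaging device.
    size_rate = SIZE_GAIN * (expected_size - current_size)

    def clamp(value):
        # Substitute the predetermined maximum threshold when the
        # calculated adjustment exceeds it, and the minimum threshold
        # when the calculated adjustment falls below it.
        sign = 1.0 if value >= 0 else -1.0
        magnitude = abs(value)
        if magnitude > max_rate:
            return sign * max_rate
        if 0 < magnitude < min_rate:
            return sign * min_rate
        return value

    return clamp(yaw_rate), clamp(pitch_rate), clamp(size_rate)
```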
In some embodiments, the imaging device includes a camera or a camcorder.

In some embodiments, the method further comprises receiving, from the remote user, one or more commands adapted to control a speed, position, orientation, or attitude of the UAV.

In some embodiments, the method further comprises receiving, from the remote user, one or more commands adapted to control a speed, position, orientation, or attitude of the carrier.

In some embodiments, the method further comprises receiving, from the remote user, one or more commands adapted to control one or more operational parameters of the imaging device.

In some embodiments, the one or more operational parameters of the imaging device include focal length, zoom level, imaging mode, image resolution, focus, depth of field, exposure, lens speed, or field of view.

In some embodiments, the carrier is configured to permit the imaging device to rotate around at least one axis relative to the UAV.

In some embodiments, the carrier is configured to permit the imaging device to rotate around at least two axes relative to the UAV.

In some embodiments, the target information of the target further includes target type information.

In some embodiments, the target type information includes a color or texture of the target.

In some embodiments, the predetermined position of the target includes an initial position or an expected position of the target.

In some embodiments, the predetermined size of the target includes an initial size or an expected size of the target.

In some embodiments, detecting the deviation from the predetermined position or the predetermined size of the target comprises comparing a position or size of the target within the one or more images captured by the imaging device with the predetermined position or predetermined size, respectively.

In some embodiments, adjusting the UAV, the carrier, or the imaging device comprises calculating an adjustment to the UAV, the carrier, or the imaging device so as to substantially correct the deviation.

In some embodiments, the deviation is related to a change in position of the target and the adjustment is related to an angular velocity for the UAV.

In some embodiments, the angular velocity is relative to a yaw axis of the UAV.

In some embodiments, the angular velocity is relative to a pitch axis of the UAV.

In some embodiments, the deviation is related to a change in position of the target and the adjustment is related to an angular velocity for the imaging device relative to the UAV.

In some embodiments, the adjustment is used to generate control signals for the carrier so as to cause the imaging device to move relative to the UAV.

In some embodiments, the angular velocity is relative to a yaw axis of the imaging device.

In some embodiments, the angular velocity is relative to a pitch axis of the imaging device.

In some embodiments, the deviation is related to a change in size of the target and the adjustment is related to a linear velocity for the UAV.

In some embodiments, the deviation is related to a change in size of the target and the adjustment is related to one or more parameters of the imaging device.

In some embodiments, the one or more parameters of the imaging device include focal length, zoom level, imaging mode, image resolution, focus, depth of field, exposure, lens speed, or field of view.

In some embodiments, the calculated adjustment is limited to a predetermined range.

In some embodiments, the predetermined range corresponds to a predetermined range of control lever amount of a control system.

In some embodiments, the control system includes a navigation control system for the UAV or a control system for the carrier.

In some embodiments, the method further comprises providing a warning signal if the adjustment falls outside the predetermined range.

In some embodiments, the warning signal is used to provide an audio or visual signal.

In some embodiments, the warning signal is used to provide a kinetic signal.

In some embodiments, the method further comprises transmitting, in substantially real-time, images captured by the imaging device to a remote user device accessible to the remote user.

In some embodiments, the remote user device comprises a display for displaying the images captured by the imaging device.

In some embodiments, the remote user device comprises an input device for providing the target information.

In some embodiments, the input device includes a touchscreen, joystick, keyboard, mouse, or stylus.

In some embodiments, the input device includes a wearable device.

In some embodiments, the target information is provided based on the transmitted images.

In some embodiments, the method further comprises providing, in substantially real-time, tracking information of the target to the remote user device.

In some embodiments, the remote user device is configured to: receive a user selection of the target from within one or more images displayed on the remote user device; and generate the target information of the target based on the user selection of the target.

According to another aspect of the present invention, a method for controlling an unmanned aerial vehicle (UAV) is provided. The method comprises: displaying, via a display, one or more images captured by an imaging device coupled to the UAV in substantially real-time; receiving, via an input device, a user selection of a target from within at least one of the one or more images being displayed in substantially real-time; generating target information of the target based at least in part on the user selection of the target; and providing the target information to the UAV so as to allow the UAV to autonomously track the target according to the target information.

According to another aspect of the present invention, a system for controlling an unmanned aerial vehicle (UAV) is provided. The system comprises: a display configured to display one or more images captured by an imaging device coupled to the UAV; an input device configured to receive a user selection of a target from within at least one of the one or more images being displayed on the display; one or more processors, individually or collectively, configured to generate target information of the target based at least in part on the user selection of the target; and a transmitter configured to provide the target information to the UAV so as to allow the UAV to autonomously track the target according to the target information.
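As a hypothetical illustration of how a control terminal might generate target information from a user selection, the sketch below converts a selected rectangle on the displayed image into normalized position and size values. The function name and coordinate convention are assumptions, not part of the disclosure.

```python
def target_info_from_selection(sel_x, sel_y, sel_w, sel_h,
                               image_width, image_height):
    """sel_* are pixel coordinates of the rectangle the user touched,
    swiped, or circled on the display; returns normalized values."""
    center_u = (sel_x + sel_w / 2.0) / image_width
    center_v = (sel_y + sel_h / 2.0) / image_height
    area_fraction = (sel_w * sel_h) / float(image_width * image_height)
    return {
        "initial_position": (center_u, center_v),   # where the target is now
        "expected_position": (center_u, center_v),  # keep it where selected
        "initial_size": area_fraction,
        "expected_size": area_fraction,             # keep the apparent size
    }

# Example: a 100x200-pixel selection near the center of a 1920x1080 frame.
info = target_info_from_selection(910, 440, 100, 200, 1920, 1080)
```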
According to another aspect of the present invention, an apparatus for controlling an unmanned aerial vehicle (UAV) is provided. The apparatus comprises: a display configured to display one or more images captured by an imaging device coupled to the UAV; an input device configured to receive a user selection of a target from within at least one of the one or more images being displayed on the display; one or more processors, individually or collectively, configured to generate target information of the target based at least in part on the user selection of the target; and a transmitter configured to provide the target information to the UAV so as to allow the UAV to autonomously track the target according to the target information.

In some embodiments, the target information includes initial target information.

In some embodiments, the initial target information includes an initial position or an initial size of the target within an image captured by the imaging device.

In some embodiments, the initial target information is generated based on the user selection of the target.

In some embodiments, the target information includes target type information.

In some embodiments, the target type information includes color, texture, or pattern information.

In some embodiments, the target type information is generated based on the user selection of the target.

In some embodiments, the target information includes expected target information.

In some embodiments, the expected target information is generated based on the user selection of the target.

In some embodiments, the expected target information includes an expected position or an expected size of the target within an image captured by the imaging device.

In some embodiments, the target information does not include expected target information.

In some embodiments, the input device includes a touchscreen, joystick, keyboard, mouse, stylus, or wearable device.

In some embodiments, the user selection of the target is achieved by a user selecting an area of the at least one of the one or more images being displayed on the display, the selected area corresponding to the target.

In some embodiments, the user selection of the target is achieved by a user directly touching an area of the at least one of the one or more images being displayed on the display, the touched area corresponding to the target.

In some embodiments, the user selects the area using a stylus, mouse, keyboard, or a wearable device.

In some embodiments, selecting the area includes touching, swiping, circling, or clicking in the area.

In some embodiments, the one or more processors, individually or collectively, are further configured to display, on the display, the selected target with a selection indicator in response to the user selection of the target, the selection indicator indicating that the target has been selected by the user.

In some embodiments, the one or more processors, individually or collectively, are further configured to receive tracking information related to the target and, based on the tracking information, display the selected target with a tracking indicator within one or more subsequent images captured by the imaging device, the tracking indicator indicating, in substantially real-time, that the target is being tracked by the UAV according to the target information.
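The selection and tracking indicators described above could be rendered with any graphics facility. The sketch below assumes OpenCV (the disclosure names no particular library) and draws a rectangular geometric-shape indicator on a received frame.

```python
import cv2  # OpenCV; an assumed choice, not specified by the disclosure

def draw_tracking_indicator(frame, bbox, tracked=True):
    """Overlay a geometric-shape indicator (here a rectangle) around the
    target. bbox = (x, y, w, h) in pixel coordinates; green indicates
    the target is being tracked, red that tracking was lost."""
    x, y, w, h = bbox
    color = (0, 255, 0) if tracked else (0, 0, 255)  # BGR
    cv2.rectangle(frame, (x, y), (x + w, y + h), color, thickness=2)
    return frame
```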
According to another aspect of the present invention, a method for controlling an unmanned aerial vehicle (UAV) is provided. The method comprises: receiving, in substantially real-time, one or more images captured by an imaging device coupled to the UAV; receiving user-specified target information of a target including a predetermined position or a predetermined size of the target within an image from the one or more images; providing the user-specified target information to the UAV; and displaying the one or more images and a tracking indicator associated with the target within the one or more images, the tracking indicator indicating that the target is being tracked by the UAV in substantially real-time according to the user-specified target information.

According to another aspect of the present invention, an apparatus for controlling an unmanned aerial vehicle (UAV) is provided. The apparatus comprises: a receiver configured to receive, in substantially real-time, one or more images captured by an imaging device coupled to the UAV; an input device configured to receive user-specified target information of a target including a predetermined position or a predetermined size of the target within an image from the one or more images; a transmitter for providing the user-specified target information to the UAV; and a display configured to display the one or more images and a tracking indicator associated with the target within the one or more images, the tracking indicator indicating that the target is being tracked by the UAV in substantially real-time according to the user-specified target information.

According to another aspect of the present invention, a system for controlling an unmanned aerial vehicle (UAV) is provided. The system comprises: a receiver configured to receive, in substantially real-time, one or more images captured by an imaging device coupled to the UAV; an input device configured to receive user-specified target information of a target including a predetermined position or a predetermined size of the target within an image from the one or more images; a transmitter for providing the user-specified target information to the UAV; and a display configured to display the one or more images and a tracking indicator associated with the target within the one or more images, the tracking indicator indicating that the target is being tracked by the UAV in substantially real-time according to the user-specified target information.

In some embodiments, the input device is further configured to receive one or more commands adapted to control a speed, position, orientation, or attitude of the UAV, or one or more operational parameters of the imaging device.

In some embodiments, the imaging device is coupled to the UAV via a carrier configured to permit the imaging device to rotate relative to the UAV along at least one axis, and the input device is further configured to receive one or more commands adapted to control a speed, position, orientation, or attitude of the carrier.

In some embodiments, the carrier is configured to permit the imaging device to rotate around at least two axes relative to the UAV.

In some embodiments, the target is tracked by the imaging device according to the target information via automatic adjustment to at least one of the UAV, the carrier, or the imaging device.

In some embodiments, a second input device is included and configured to receive one or more commands adapted to control a speed, position, orientation, or attitude of the UAV, or one or more operational parameters of the imaging device.

In some embodiments, the imaging device is coupled to the UAV via a carrier configured to permit the imaging device to rotate relative to the UAV along at least one axis, and the system further comprises a second input device configured to receive one or more commands adapted to control a speed, position, orientation, or attitude of the carrier.

In some embodiments, the one or more operational parameters of the imaging device include focal length, zoom level, imaging mode, image resolution, focus, depth of field, exposure, lens speed, or field of view.

In some embodiments, the target is tracked by the imaging device according to the target information via automatic adjustment to at least one of the UAV, the carrier, or the imaging device.

In some embodiments, the predetermined position includes an initial position of the target.

In some embodiments, the predetermined size includes an initial size of the target.

In some embodiments, the predetermined position includes an expected position of the target.

In some embodiments, the predetermined size includes an expected size of the target.

In some embodiments, the target information further includes target type information.

In some embodiments, the target information is generated based on a user selection of the target.

In some embodiments, the tracking indicator includes a geometric shape, a check mark, or an arrow.

In some embodiments, the geometric shape includes a circle, a rectangle, or a triangle.

In some embodiments, the target is tracked by the imaging device according to the target information via automatic adjustment to at least one of the UAV or the imaging device.

It shall be understood that different aspects of the invention can be appreciated individually, collectively, or in combination with each other. Various aspects of the invention described herein may be applied to any of the particular applications set forth below or for any other types of movable objects. Any description herein of aerial vehicles, such as unmanned aerial vehicles, may apply to and be used for any movable object, such as any vehicle. Additionally, the systems, devices, and methods disclosed herein in the context of aerial motion (e.g., flight) may also be applied in the context of other types of motion, such as movement on the ground or on water, underwater motion, or motion in space.

Additional aspects and advantages of the present disclosure will become readily apparent to those skilled in this art from the following detailed description, wherein only exemplary embodiments of the present disclosure are shown and described, simply by way of illustration of the best mode contemplated for carrying out the present disclosure. As will be realized, the present disclosure is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.

INCORPORATION BY REFERENCE

All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.

BRIEF DESCRIPTION OF THE DRAWINGS

The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:

FIG. 1 illustrates an exemplary target tracking system, in accordance with embodiments.

FIG. 2 illustrates an exemplary flow of image-related data among components in a tracking system, in accordance with embodiments.

FIG. 3 illustrates an exemplary flow of control data among components in a tracking system, in accordance with embodiments.

FIG. 4 illustrates an exemplary process for implementing target tracking, in accordance with embodiments.

FIG. 5 illustrates an exemplary configuration of a movable object, carrier, and payload, in accordance with embodiments.

FIG. 6 illustrates an exemplary tracking method for maintaining an expected position of a target, in accordance with embodiments.

FIG. 7 illustrates an exemplary tracking method for maintaining an expected size of a target, in accordance with embodiments.

FIG. 8 illustrates another exemplary process for implementing target tracking, in accordance with embodiments.

FIG. 9 illustrates an exemplary process for controlling a movable object to navigate and track, in accordance with embodiments.

FIG. 10 illustrates an exemplary process for selecting a target, in accordance with embodiments.

FIG. 11 illustrates an exemplary process for viewing a tracked target, in accordance with embodiments.

FIG. 12 illustrates an exemplary control terminal for controlling a movable object, in accordance with embodiments.

FIGS. 13A-C illustrate exemplary methods for selecting a target using a user interface, in accordance with some embodiments.

FIG. 14 illustrates a UAV, in accordance with embodiments.

FIG. 15 illustrates a movable object including a carrier and a payload, in accordance with embodiments.

FIG. 16 illustrates an exemplary system for tracking a movable object, in accordance with embodiments.

FIG. 17 illustrates an exemplary system for controlling a movable object, in accordance with embodiments.

FIG. 18 illustrates an exemplary use case for the present invention.

DETAILED DESCRIPTION OF THE INVENTION

The present invention provides systems, methods, and devices related to target tracking by unmanned aerial vehicles (UAVs). A UAV may be configured to receive target information from a control terminal related to a target to be tracked by an imaging device coupled to the UAV. The target information may be used by the UAV to automatically cause the imaging device to track the target so as to maintain a predetermined position and/or size of the target within one or more images captured by the imaging device. The tracking of the target may be performed while the UAV is controlled to navigate according to user commands and/or predetermined navigation paths. The control terminal may be configured to display images from the imaging device as well as allowing user input related to the target information.
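At the highest level, the tracking behavior described in this section can be pictured as a loop of capture, identify, compare, and adjust steps. The sketch below is a hypothetical skeleton only; each callable is a placeholder supplied by the platform, and no particular implementation is implied by the disclosure.

```python
def tracking_loop(capture_frame, identify_target, compute_deviation,
                  apply_adjustment, target_info):
    """Hypothetical onboard loop: capture, identify, compare, adjust.
    Runs while the UAV independently executes user navigation commands
    or a predetermined navigation path."""
    while True:
        frame = capture_frame()
        current = identify_target(frame, target_info)  # position + size
        if current is None:
            continue  # target not visible in this frame; keep flying
        deviation = compute_deviation(current, target_info)
        apply_adjustment(deviation)  # UAV, carrier, and/or camera zoom
```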
Using the tracking methods and systems provided herein, a single user can control both the navigation of a UAV and the tracking of a target substantially concurrently, without the help of an extra person. The user can utilize a user interface of a control terminal to specify the target to track and/or the type of target to track. Such user-specified target information may be transmitted to the UAV, which can autonomously track the target, for example, using an imaging device onboard the UAV. Images captured by the imaging device (e.g., pictures and/or videos) can be transmitted in real time to the control terminal for display, playback, storage, or other purposes. The user may also change or adjust the target to track in real time using the control terminal. Advantageously, such autonomous tracking can be performed by the UAV while the user engages in other activities such as controlling the navigation of the UAV, or other activities.

For instance, a user can configure the UAV to track herself as she engages in a variety of activities such as hiking or biking. She may specify herself as the tracking target for the UAV using the user interface provided by the control terminal. For example, she may select herself as the target from an image displayed on a user interface of the control terminal, for example, using a touchscreen. Once the target information is transmitted to the UAV, the user can be relieved from the low-level operations associated with manual tracking of a target, such as adjusting the UAV, carrier, or imaging device. Instead, she can focus on other activities such as biking while the UAV automatically tracks her in real time based on the provided target information using the methods provided herein. For instance, the attitude, position, velocity, zoom, and other aspects of the UAV and/or the imaging device can be automatically adjusted to ensure that the user maintains a designated position and/or size within the images captured by the imaging device. Images captured during the tracking process (e.g., videos or pictures) may be streamed to the control terminal in real time or substantially real time for display, playback, storage, or other purposes. All of the above can be achieved by one person in a relatively painless manner, making it easier for users to achieve previously difficult-to-achieve tasks.

Such tracking methods and systems as described herein advantageously facilitate the automation of the low-level control portion of the tracking process so as to reduce the effort required and the errors resulting from manual tracking. At the same time, the tracking methods and systems described herein still allow users to maintain, if desired, high-level control of the tracking process (e.g., by specifying the type of target to track).

FIG. 1 illustrates an exemplary target tracking system 100, in accordance with embodiments. The system 100 includes a movable object 101 and a control terminal 112. The system 100 may be used to track one or more targets 116. Although the movable object 101 is depicted as an unmanned aerial vehicle (UAV), this depiction is not intended to be limiting, and any suitable type of movable object can be used, as described herein. One of skill in the art would appreciate that any of the embodiments described herein in the context of aircraft systems can be applied to any suitable movable object.

In some embodiments, the movable object 101 can include a carrier 102 and a payload 104. The carrier 102 may permit the payload 104 to move relative to the movable object 101. For instance, the carrier 102 may permit the payload 104 to rotate around one, two, three, or more axes. Alternatively or additionally, the carrier 102 may permit the payload 104 to move linearly along one, two, three, or more axes. The axes for the rotational or translational movement may or may not be orthogonal to each other.

In some embodiments, the payload 104 may be rigidly coupled to or connected with the movable object 101 such that the payload 104 remains substantially stationary relative to the movable object 101. For example, the carrier 102 that connects the movable object 101 and the payload 104 may not permit the payload 104 to move relative to the movable object 101. Alternatively, the payload 104 may be coupled directly to the movable object 101 without requiring a carrier.

In some embodiments, the payload 104 can include one or more sensors for surveying or tracking one or more targets 116. Examples of such a payload may include an image capturing device or imaging device (e.g., camera or camcorder, infrared imaging device, ultraviolet imaging device, or the like), an audio capture device (e.g., a parabolic microphone), an infrared imaging device, or the like. Any suitable sensor(s) can be incorporated into the payload 104 to capture any visual, audio, electromagnetic, or any other desirable signals. The sensors can provide static sensing data (e.g., a photograph) or dynamic sensing data (e.g., a video). The sensors may capture sensing data continuously in real time or at high frequencies.

In various embodiments, the target 116 being tracked by the movable object 101 can include any natural or man-made objects or structures such as geographical landscapes (e.g., mountains, vegetation, valleys, lakes, or rivers), buildings, or vehicles (e.g., aircraft, ships, cars, trucks, buses, vans, or motorcycles). The target 116 can also include live subjects such as people or animals. The target 116 may be moving or stationary relative to any suitable reference frame. The reference frame can be a relatively fixed reference frame (e.g., the surrounding environment, or earth). Alternatively, the reference frame can be a moving reference frame (e.g., a moving vehicle). In various embodiments, the target 116 may include a passive target or an active target. An active target may be configured to transmit information about the target, such as the target's GPS location, to the movable object. Information may be transmitted to the movable object via wireless communication from a communication unit of the active target to a communication unit of the movable object. Examples of an active target can include a friendly vehicle, building, troop, or the like. A passive target is not configured to transmit information about the target. Examples of a passive target can include a neutral or hostile vehicle, building, troop, or the like.

The movable object 101 can be configured to receive, and the control terminal 112 can be configured to provide, control data. The control data can be used to control, directly or indirectly, aspects of the movable object 101. In some embodiments, the control data can include navigation commands for controlling navigational parameters of the movable object such as the position, speed, orientation, or attitude of the movable object 101. The control data can be used to control flight of a UAV. The control data may affect operation of one or more propulsion units 106 that may affect the flight of the UAV. In other cases, the control data can include commands for controlling individual components of the movable object 101. For instance, the control data may include information for controlling the operations of the carrier 102. For example, the control data may be used to control an actuation mechanism of the carrier 102 so as to cause angular and/or linear movement of the payload 104 relative to the movable object 101. As another example, the control data may be used to control the movement of the carrier 102 without the payload. As another example, the control data may be used to adjust one or more operational parameters for the payload 104 such as taking still or moving pictures, zooming in or out, turning on or off, switching imaging modes, changing image resolution, changing focus, changing depth of field, changing exposure time, changing speed of lens, changing viewing angle or field of view, or the like. In other embodiments, the control data may be used to control a sensing system (not shown), communication system (not shown), and the like, of the movable object 101.
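Purely for illustration, the different kinds of control data described above might be organized as tagged messages such as the following. All field names and values here are assumptions, not a protocol defined by the disclosure.

```python
# Illustrative control-data messages from a control terminal.
NAVIGATION = {                      # controls the movable object itself
    "type": "navigation",
    "position": (22.5431, 114.0579, 50.0),  # lat, lon, altitude (m)
    "speed": 5.0,                   # m/s
    "attitude": {"yaw": 90.0},      # degrees
}

CARRIER = {                         # actuates the carrier (e.g., gimbal)
    "type": "carrier",
    "angular_rate": {"pitch": -10.0, "yaw": 0.0},  # deg/s
}

PAYLOAD = {                         # operational parameters of the camera
    "type": "payload",
    "zoom": "in",
    "imaging_mode": "video",
    "resolution": (1920, 1080),
}
```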
In some embodiments, the control data from the control terminal 112 can include target information. In some cases, the target information can include characteristics of a specific target such as an initial position (e.g., coordinates) and/or size of a target within one or more images captured by an imaging device carried by the movable object 101. Additionally or alternatively, the target information can include target type information such as characteristics of a type or category of targets including color, texture, pattern, size, shape, dimension, and the like. Target information can include data representation of an image of the target. This may include an image of the target in a field of view. The field of view may be defined or encompassed by the images captured by the imaging device. Target information can also include expected target information. The expected target information specifies the characteristics that the target being tracked is expected to meet in the images captured by the imaging device. The expected target information may be used to adjust the movable object, carrier, and/or imaging device so that the target being tracked maintains an appearance in one or more images according to the expected target information. For example, the target may be tracked so as to maintain an expected position and/or size within one or more images captured by the imaging device. For example, the expected position of the tracked target may be near the center of the image or off-center. The expected size of the tracked target may be around a certain number of pixels. The expected target information may or may not be the same as the initial target information. In various embodiments, expected target information may or may not be provided by the control terminal. For example, expected target information may be hardcoded in the control logic executed by a processing unit onboard the movable object, stored in a data store local and/or remote to the movable object, or obtained from other suitable sources.

In some embodiments, the target information (including specific target information and target type information) may be generated based at least in part on user input at the control terminal 112. Additionally or alternatively, the target information may be generated based on data from other sources. For example, target type information may be derived based on previous images and/or data extracted from local or remote data stores. The images could have been previously captured by the imaging device 104 coupled to the movable object 101 or other devices. The images could be computer-generated. Such target type information may be selected by the user and/or provided automatically by default to the movable object.

The target information may be used by the movable object 101 to track one or more targets 116. The tracking and any other related data processing may be performed at least in part by one or more processors onboard the movable object 101. In some embodiments, the target information can be used to identify, by the movable object, the target 116 to be tracked. Such identification of the target may be performed based on the initial target information including the specific characteristics of a particular target (e.g., initial coordinates of the target within an image captured by the movable object), or general characteristics of a type of target (e.g., color and/or texture of the target(s) to be tracked). In some cases, target identification can involve any suitable image recognition and/or matching algorithms. In some embodiments, target identification includes comparing two or more images to determine, extract, and/or match features contained therein.
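As one deliberately simple illustration of identification based on target type information, the sketch below matches a target by color using OpenCV. The HSV bounds and the largest-contour heuristic are assumptions for this example; the disclosure permits any suitable image recognition and/or matching algorithm.

```python
import cv2
import numpy as np

def identify_by_color(frame_bgr, hsv_low, hsv_high):
    """Return the bounding box (x, y, w, h) of the largest region whose
    color falls inside [hsv_low, hsv_high], or None if nothing matches."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)  # (x, y, w, h) in pixels
```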
Once a target is identified, expected target information can be used to detect a deviation from expected characteristics of the target such as expected position and/or size. In some embodiments, current target characteristics or information can be determined based on one or more images captured by the movable object. The current target information can be compared with the expected target information provided by the control terminal to determine the deviation therefrom. A change in position of the target may be detected by comparing coordinates of the target (e.g., the coordinates of a center point of the target) within an image to the coordinates of the expected target position. A change in size of the target may be detected by comparing the size of the area (e.g., in pixels) covered by the target with the expected target size. In some embodiments, a change in size may be detected by detecting an orientation, boundaries, or other characteristics of the target.

Based at least in part on the detected deviation, control signals may be generated (e.g., by one or more processors onboard the movable object) that cause adjustment that substantially corrects the detected deviation. As such, the adjustment may be used to substantially maintain one or more expected target characteristics (e.g., target position and/or size) within the images captured by the movable object. In some embodiments, the adjustment may be performed in substantially real time as the movable object is executing user-provided navigation commands (e.g., hovering or moving) and/or predetermined navigation paths. The adjustment may also be performed in substantially real time as the imaging device is capturing one or more images. In some embodiments, the adjustment may be generated based on other information such as sensing data acquired by one or more sensors onboard the movable object (e.g., proximity sensor, or GPS sensor). For example, position information of the target being tracked may be obtained by a proximity sensor and/or provided by the target itself (e.g., GPS location). Such position information may be used, in addition to the detected deviation, to generate the adjustment.

The adjustment may pertain to the movable object, the carrier, and/or the payload (e.g., imaging device). For example, the adjustment may cause the movable object and/or the payload (e.g., imaging device) to change its position, attitude, orientation, angular and/or linear velocity, and the like. The adjustment may cause the carrier to move the payload (e.g., imaging device) relative to the movable object such as around or along one, two, three, or more axes. Furthermore, the adjustment may include adjustment to the zoom, focus, or other operational parameters of the payload (e.g., imaging device) itself (e.g., zoom in/out).
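The comparison of current and expected target characteristics described above can be illustrated as follows, assuming the current target is reported as a pixel bounding box and expectations are expressed in normalized units (an assumption carried over from the earlier sketches).

```python
def detect_deviation(bbox, expected_pos, expected_size, img_w, img_h):
    """Compare the target's current center coordinates and pixel-area
    fraction with the expected values; returns signed errors."""
    x, y, w, h = bbox
    center_u = (x + w / 2.0) / img_w
    center_v = (y + h / 2.0) / img_h
    size = (w * h) / float(img_w * img_h)
    return {
        "du": center_u - expected_pos[0],   # +: target right of expected
        "dv": center_v - expected_pos[1],   # +: target below expected
        "dsize": size - expected_size,      # +: target larger than expected
    }
```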
In some embodiments, the adjustment may be generated based at least in part on the type of detected deviation. For example, a deviation from the expected target position may require rotation of the movable object and/or the payload (e.g., via the carrier) around one, two, or three rotational axes. As another example, a deviation from the expected target size may require translational movement of the movable object along a suitable axis and/or changes to the zoom of the imaging device (e.g., zoom in or out). For example, if the current or actual target size is smaller than the expected target size, the movable object may need to be moved closer to the target and/or the imaging device may need to be zoomed in to the target. On the other hand, if the current or actual target size is larger than the expected target size, the movable object may need to be moved farther away from the target and/or the imaging device may need to be zoomed out from the target.

In various embodiments, the adjustment to substantially correct the deviation from expected target information may be achieved by controlling one or more controllable objects such as the movable object, the carrier, the imaging device, or any combination thereof via control signals. In some embodiments, the controllable objects may be selected to implement an adjustment and the corresponding control signals may be generated based at least in part on the configurations or settings of the controllable objects. For example, an adjustment that involves rotation around two axes (e.g., yaw and pitch) may be achieved solely by corresponding rotation of the movable object around the two axes if the imaging device is rigidly coupled to the movable object and hence not permitted to move relative to the movable object. Such may be the case when the imaging device is directly coupled to the movable object, or when the imaging device is coupled to the movable object via a carrier that does not permit relative movement between the imaging device and the movable object. The same two-axis adjustment may be achieved by combining adjustment to both the movable object and the carrier if the carrier permits the imaging device to rotate around at least one axis relative to the movable object. In this case, the carrier can be controlled to implement the rotation around one or two of the two axes required for the adjustment and the movable object can be controlled to implement the rotation around one or two of the two axes. For example, the carrier may include a one-axis gimbal that allows the imaging device to rotate around one of the two axes required for adjustment while the rotation around the remaining axis is achieved by the movable object. Alternatively, the same two-axis adjustment may be achieved by the carrier alone if the carrier permits the imaging device to rotate around two or more axes relative to the movable object. For instance, the carrier may include a two-axis or three-axis gimbal.

As another example, an adjustment to correct a change in size of the target may be achieved by controlling the zoom in/out of the imaging device (e.g., if the imaging device supports the zoom level required), by controlling the movement of the movable object (e.g., so as to get closer to or farther away from the target), or by a combination of zoom in/out of the imaging device and the movement of the movable object. A processor onboard the movable object may make the determination as to which object or combination of objects to adjust. For example, if the imaging device does not support a zoom level required to maintain the required size of the target within an image, the movable object may be controlled to move instead of or in addition to adjusting the zoom of the imaging device.
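The selection of controllable objects described above can be sketched as a simple allocation rule. The sketch assumes the carrier reports which rotational axes it supports and whether the imaging device supports zoom; the function names and conventions are hypothetical.

```python
def allocate_rotation(yaw_rate, pitch_rate, gimbal_axes):
    """Split a required two-axis rotation between the carrier (gimbal)
    and the movable object, based on which axes the carrier supports.
    gimbal_axes is a set such as set(), {"pitch"}, or {"yaw", "pitch"}."""
    carrier_cmd, uav_cmd = {}, {}
    for axis, rate in (("yaw", yaw_rate), ("pitch", pitch_rate)):
        if axis in gimbal_axes:
            carrier_cmd[axis] = rate   # carrier rotates the imaging device
        else:
            uav_cmd[axis] = rate       # the movable object rotates instead
    return carrier_cmd, uav_cmd

def allocate_size_correction(dsize, zoom_available):
    """Correct a size deviation by zooming when the imaging device
    supports it, otherwise by moving toward/away from the target.
    dsize < 0 (target too small) yields zoom-in or forward motion."""
    if zoom_available:
        return {"zoom_rate": -dsize}
    return {"forward_velocity": -dsize}
```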
Such may be the case be configurable (e.g., by a manufacturer, administrator, or when the imaging device is directly coupled to the movable 15 user). object, or when the imaging device is coupled to the movable In some embodiments, the movable object 101 can be object via a carrier that does not permit relative movement configured to provide and the control terminal 112 can be between the imaging device and the movable object. The configured to receive data such as sensing data acquired by same two-axis adjustment may be achieved by combining sensors onboard the movable object 101, and tracking data or adjustment to both the movable object and the carrier if the information used to indicate characteristics of one or more carrier permits the imaging device to rotate around at least target tracked by the movable object 101. Examples of sens one axis relative to the movable object. In this case, the carrier ing data may include image data acquired by an imaging can be controlled to implement the rotation around one or two device carried by the movable object 101 or other data of the two axes required for the adjustment and the movable acquired by other sensors. For example, real-time or nearly object can be controlled to implement the rotation around one 25 real-time video can be streamed from the movable object 101 or two of the two axes. For example, the carrier may include and/or the payload 104 (e.g., imaging device) to the control a one-axis gimbal that allows the imaging device to rotate terminal 112. The sensing data may also include data around one of the two axes required for adjustment while the acquired by global positioning system (GPS) sensors, motion rotationaround the remaining axis is achieved by the movable sensors, inertial sensors, proximity sensors, or other sensors. object. Alternatively, the same two-axis adjustment may be 30 Examples of tracking information may include relative or achieved by the carrier alone if the carrier permits the imaging absolute coordinates and/or size of the target within one or device to rotate around two or more axes relative to the more image frames received from the movable object, movable object. For instance, the carrier may include a two changes of the target between consecutive image frames, GPS axis or three-axis gimbal. coordinates, or other positional information of the target, and As another example, an adjustment to correct a change in 35 the like. In some embodiments, the tracking information may size of the target may be achieved by controlling the Zoom be used by the control terminal 112 to display the target as in/out of the imaging device (e.g., if the imaging device being tracked (e.g., via a graphical tracking indicator Such as Supports the Zoom level required), by controlling the move a box around the target). In various embodiments, the data ment of the movable object (e.g., so as to get closer to or received by the control terminal may include raw data (e.g., farther away from the target), or by a combination of Zoom 40 raw sensing data as acquired by the sensors) and/or processed in/out of the imaging device and the movement of the mov data (e.g., tracking information as processed by one or more able object. A processor onboard the movable object may processors on the movable object). make the determination as to which object or combination of In some embodiments, the control terminal 112 can be objects to adjust. 
For example, if the imaging device does not located at a location distant or remote from the movable Supporta Zoom level required to maintain the required size of 45 object 101, carrier 102, and/or payload 104. The control ter the target within an image, the movable object may be con minal 112 can be disposed on or affixed to a Support platform. trolled to move instead of or in addition to adjusting the Zoom Alternatively, the control terminal 112 can be a handheld or of the imaging device. wearable device. For example, the control terminal 112 can In Some embodiments, the adjustment may be imple include a Smartphone, tablet, laptop, computer, glasses, mented by taking into account other constraints. For example, 50 gloves, helmet, microphone, or Suitable combinations in cases where the navigation path of the movable object is thereof. predetermined, the adjustment may be implemented by the The control terminal 112 can be configured to display data carrier and/or imaging device without affecting the move received from the movable object 101 via a display. The ment of the movable object. The navigation path of the mov displayed data may include sensing data Such as images (e.g., able object may be predetermined, for example, if a remote 55 still images and videos) acquired by an imaging device car user is actively controlling the navigation of the movable ried by the movable object 101. The displayed data may also object via a control terminal or if the movable object is navi include tracking information that is displayed separately from gating (e.g., autonomously or semi-autonomously) according the image data or Superimposed on top of the image data. For to a pre-stored navigation path. example, the display may be configured to display the images Examples of other constraints may include maximum and/ 60 where the target is indicated or highlighted with a tracking or minimum limit for rotation angles, angular and/or linear indicator Such as a box, circle, or any other geometric shape speed, operational parameters, and the like for the movable Surrounding the target being tracked. In some embodiments, object, the carrier, and/or the payload (e.g., imaging device). the images and the tracking indicator are displayed in Sub Such maximum and/or minimum threshold values may be stantially real-time as the image data and tracking informa used to limit the range of the adjustment. For example, the 65 tion are received from the movable object and/or as the image angular speed of the movable object and/or the imaging data is acquired. In other embodiments, the display may be device around a certain axis may be capped by a maximum provided after some delay. US 9,164,506 B1 17 18 The control terminal 112 can be configured to receive user object, speed or altitude of the movable object, user prefer input via an input device. The input device may include a ences, and computing resources (e.g., CPU or memory) avail joystick, keyboard, mouse, stylus, microphone, image or able onboard and/or off-board the movable object, the like. motion sensor, inertial sensor, and the like. Any suitable user For example, relatively more control may be allocated to the input can be used to interact with the terminal. 
Such as manu user when the movable object is navigating in a relatively ally entered commands, Voice control, gesture control, or complex environment (e.g., with numerous buildings or position control (e.g., via a movement, location or tilt of the obstacles or indoor) than when the movable object is navigat terminal). For instance, the control terminal 112 may be con ing in a relatively simple environment (e.g., wide open space figured to allow a user to control a state of the movable object, or outdoor). As another example, more control may be allo carrier, payload, or any component thereof by manipulating a 10 cated to the user when the movable object is at a lower altitude joystick, changing an orientation or attitude of the control than when the movable object is at a higher altitude. As yet terminal, interacting with a graphical user interface using a another example, more control may be allocated to the mov keyboard, mouse, finger, or stylus, or by using any other able object if the movable object is equipped with a high suitable methods. speed processor adapted to perform complex computations The control terminal 112 may also be configured to allow a 15 relatively quickly. In some embodiments, the allocation of user to enter target information using any suitable method. In control over the tracking process between user and movable Some embodiments, the control terminal 112 may be config object may be dynamically adjusted based on the factors ured to enable a user to directly select a target from one or described herein. more images (e.g., video or Snapshot) that is being displayed. The user input may be used, at least in part, to generate For example, the user may select a target by directly touching control data such as described herein. The control data may be the screen using a finger or stylus or selection using a mouse generated by the control terminal, the movable object, a third or joystick. The user may draw around the target, touch the device, or any combination thereof. For instance, the user's target in the image, or otherwise select the target. Computer manipulation of a joystick or the control terminal or an inter vision or other techniques may be used to determine bound action with a graphical user interface may be translated into ary of target. Otherwise, user input may define the boundary 25 predetermined control commands for changing a state or of target. One or more targets may be selected at a time. In parameter of the movable object, carrier, or payload. As Some embodiments, the selected target is displayed with a another example, a user's selection of a target within an image selection indicator to indicate that the user has selected the being displayed by the control terminal may be used togen target for tracking. In some other embodiments, the control erate initial and/or expected target information for tracking terminal may be configured to allow a user to enter or select 30 purposes such as an initial and/or expected position and/or target type information Such as color, texture, shape, dimen size of the target. Alternatively or additionally, the control sion, or other characteristics associated with a desired target. data may be generated based on information obtained from For example, the user may type in the target type information, non-user Sources such as a remote or local data store, other select such information using a graphical user interface, or computing devices operatively connected to the control ter use any other Suitable methods. 
In some other embodiments, 35 minal, or the like. the target information may be obtained from Sources other FIG. 2 illustrates exemplary flow 200 of image-related data than the user Such as a remote or local data store, other among components in a tracking system, in accordance with computing devices operatively connected to or otherwise in embodiments. In some embodiments, image-related data communication with the control terminal, or the like. includes raw or processed image data as well as data extracted In some embodiments, the control terminal allows a user to 40 or derived from the image data Such as tracking information select between a manual tracking mode and an automatic of targets. As illustrated, raw image data can be captured by an tracking mode. When the manual tracking mode is selected, a image sensor 204. The image sensor 204 may be configured user can specify a specific target to track. For example, the to convert optical signals into electronic signals. The image user can manually selects a target from an image being dis sensor 204 may include semiconductor charge-coupled played by the control terminal. The specific target informa 45 devices (CCD), active pixel sensors using complementary tion associated with the selected target (e.g., coordinates and/ metal-oxide-semiconductor (CMOS) or N-type metal-oxide or size) is then provided to the movable object as initial target semiconductor (NMOS, Live MOS) technologies, or any information of the target. On the other hand, when the auto other types of sensors. The image sensor may be coupled to a matic tracking mode is selected, the user does not specify a movable object Such as a UAV. For example, the image sensor specific target to be tracked. Rather, the user can specify 50 may be part of an imaging device (e.g., camera) that is carried descriptive information about the type of target to be tracked, by a UAV with or without a carrier. The image sensor and/or for example, via a user interface provided by the control imaging device may be configured to capture pictures, videos, terminal. The movable object can then use the initial target or any other image data with any suitable parameters such as information of a specific target or target type information to width, height, aspect ratio, megapixel count, resolution or automatically identify the target to be tracked and Subse 55 quality, and the like. For example, the imaging device may be quently track the identified target. configured to capture high-definition or ultra-high-definition In general, providing specific target information (e.g., ini videos (e.g., 720p, 1080i, 1080p, 1440p, 2000p, 2160p. tial target information) requires more user control of the 2540p. 4000p, 4320p, and so on). tracking of the target and less automated processing or com The image data captured by the image sensor 204 can be putation (e.g., image or target recognition) by a processing 60 stored in a device 202. The data storage device system onboard the movable object. On the other hand, pro 202 may be based on semiconductor, magnetic, optical, or viding target type information requires less user control of the any Suitable technologies and may include , tracking process but more computation performed by the USB drives, memory cards, solid-state drives (SSDs), hard onboard processing system. The appropriate allocation of the disk drives (HDDS), floppy disks, optical disks, magnetic control over the tracking process between the user and the 65 tapes, and the like. 
For example, the data storage device 202 onboard processing system may be adjusted depending on a can include removable storage devices that are detachably variety of factors such as the Surroundings of the movable couplable to an imaging device Such as memory cards of any US 9,164,506 B1 19 20 suitable formats such as PC Card, CompactFlash, SmartMe stations, such as towers, satellites, or mobile stations, can be dia, , Memory Stick Duo, Memory Stick PRO used. Wireless communications can be proximity dependent Duo, , Multimedia Card (MMC), Reduced or proximity independent. In some embodiments, line-of Size Multimedia Card (RS-MMC), MMCmicro Card (MMC sight may or may not be required for communications. In micro), PS2 card, Secure Digital (SD) card, SXS, Universal addition to tracking information, the communication module Flash Storage (UFS), miniSD, microSD, XD-Picture Card, can also be configured to transmit and/or receive sensing data Intelligent Stick (iStick), Serial Flash Module (SFM), NT from other sensors onboard the movable object, positional Card, XQD card, and the like. The data storage device 202 can and/or motion information determined by processing the also include external hard disk drives, optical drives, tape sensing data, predetermined control data, user commands drives, floppy drives, and other Suitable storage devices that 10 from remote control terminals, and the like. may be operatively connected to the imaging device. In some embodiments, the image data, as provided by the The image data captured by the image sensor 204 can be image transmission module 206, may be augmented by or transmitted to the control terminal 212 by an image transmis otherwise combined with the tracking information, as pro sion module 206. In some embodiments, the image data may duced by the processing unit 210, to show a target with a be compressed or otherwise processed before being transmit 15 tracking indicator (e.g., a circle or box around the target). A ted by the image transmission module 206. In other cases, the user may view the augmented image data to see the target as image data may not be compressed or processed before being it is being tracked by the imaging device. The user may also transmitted. The transmitted image data may be displayed on interact with the control terminal based on the augmented the control terminal 212 so that a user operating the control image data. For example, the user may select a different target terminal 212 can view the image data and/or interact with the to track from the augmented image data (e.g., by touching an control terminal 212 based on the image data. area of the screen corresponding to the different target). The image data captured by the image sensor 204 can be In some embodiments, the tracking information can be pre-processed by a pre-processing unit 208. The pre-process provided to the control terminal 212 in response to a demand ing unit 208 can include any hardware, Software, or a combi by the control terminal 212. For example, the control terminal nation thereof. Examples of pre-processing unit 208 can 25 212 may demand Such tracking information only when a user include a field programmable gate array (FPGA). The pre elects to a certain viewing mode (e.g., a preview mode) where processing unit 208 can be operatively coupled to the image the target being tracked is highlighted or otherwise indicated. 
sensor 204 to pre-processing of the raw image data before the Alternatively or additionally, the tracking information may be image data is processed to extract specific piece of informa provided to the control terminal 212 without any demand by tion. Examples of tasks performed by the pre-processing unit 30 the control terminal 212. For example, the tracking informa 208 can include re-sampling to assure the correctness of the tion may be pushed to the control terminal 212 on a periodic image coordinate system, noise reduction, contrast enhance basis (e.g., every 0.1 second, 0.2 second, 0.5 second, 1 sec ment, Scale space representation, and the like. ond, or 2 second). The image data, as processed by the pre-processing unit FIG. 3 illustrates exemplary flow 300 of control data 208, can be further processed by a processing unit 210 that is 35 among components in a tracking system, in accordance with operatively coupled to the pre-processing unit 208 (e.g., via a embodiments. As discussed above, control data can include general purpose memory controller (GPMC) connection). target information used by a movable object (e.g., a UAV) to The processing unit 210 can include one or more ARM pro track a target, as well as other data for controlling various cessors. The processing unit 210 can be configured to perform aspects of the movable object or a component thereof. The any suitable embodiments of the methods described herein. 40 control data can be generated by a user 302 interacting with a Examples of tasks performed by the processing unit 210 may control terminal 304. The generated control data can include include feature extraction at any suitable level of complexity, specific target information (e.g., initial target information) image segmentation, data Verification, image recognition, 306 and target type information 308. image registration, image matching, and the like. In some Specific target information 306 can include characteristics embodiments, the processing unit 210 produces tracking 45 about a specific target Such as coordinates (e.g., pixel coor information related to a target that is being tracked by the dinates), size, and the like. In some embodiments, specific movable object. The tracking information may be generated target information 306 can be generated whenauser selects or based on image processing by a processing unit of the mov specifies a specific target to track via a user interface provided able object and/or based on target information as provided by by the control terminal 304. For example, the control terminal the control terminal. The tracking information may include, 50 may allow a user to select between a manual tracking mode for example, the location, size, or other characteristics of a and an automatic tracking mode. When the manual tracking target within one or more images. mode is selected, a user can specify a specific target to track The tracking information as determined by the processing (e.g., by selecting a target from one or more images being unit 210 can be provided to the control terminal 212 via a displayed). Based on the user selection, specific target infor communication module (not shown). In some embodiments, 55 mation may be generated. the communication module may be separate from the image Target type information 308 can include information transmission module 206 described above. In other embodi describing a type of targets to be tracked rather than informa ments, the communication module may include or be tion about a specific target. 
Such target type information may included in the image transmission module 206. Any Suitable include various target characteristics such as color, texture, means of communication can be used for the communication 60 pattern, size, shape, dimension, and the like. In some embodi module and/or the image transmission module 206. Such as ments, the target information (including specific target infor wired communication or wireless communication. For mation and target type information) may be generated at least example, the communication module and/or the image trans in part on user input at the control terminal 304. For example, mission module 206 can utilize one or more of local area the control terminal may allow a user to select between a networks (LAN), wide area networks (WAN), infrared, radio, 65 manual tracking mode and an automatic tracking mode. WiFi, point-to-point (P2P) networks, telecommunication net When the automatic tracking mode is selected, a user can works, cloud communication, and the like. Optionally, relay enter or select target type information. Additionally or alter US 9,164,506 B1 21 22 natively, the target information may be generated based on movable object while the processing system 310 generates data from other sources. For example, target type information the control commands for the movable object and/or the may be derived based on previous images and/or extracted imaging device. This may allow a user to focus on controlling from local or remote data stores. In some embodiments, pre the navigation of the movable object without having to worry defined target types or patterns may be presented to the user about tracking the target, which is performed automatically for selection. In some other embodiments, the predefined by the processing system 310. target types or patterns may be provided automatically by In various embodiments, the allocation of tracking control default to the movable object without user intervention. between the user and the automatic tracking system can vary Target information can optionally include expected target depending on a variety of factors such as the Surroundings of information Such as described herein. The expected target 10 the movable object, speed or altitude of the movable object, information may or may not overlap with the initial target user preferences, and computing resources (e.g., CPU or information. memory) available onboard and/or off-board the movable The target information (including specific target informa object, the like. For example, relatively more control may be tion 306 and target type information 308) can be provided to allocated to the user when the movable object is navigating in a processing system 310, for example, via a communication 15 a relatively complex environment (e.g., with numerous build module (not shown). The processing system 310 may be ings or obstacles or indoor) than when the movable object is onboard amovable object (e.g., UAV). The processing system navigating in a relatively simple environment (e.g., wide open 310 may include a data processing unit 312 and a command space or outdoor). As another example, more control may be control unit 314. The data processing unit 312 may be con allocated to the user when the movable object is at a lower figured to perform any embodiments of the methods altitude than when the movable object is at a higher altitude. described herein. 
For instance, the data processing unit 312 As yet another example, more control may be allocated to the can be configured to identify target based on target informa movable object if the movable object is equipped with a tion (e.g., including specific target information 306 and/or high-speed processor adapted to perform complex computa target type information 308), determine deviation from the tions relatively quickly. In some embodiments, the allocation target information, and the like. The data processing unit 312 25 of control over the tracking process between user and mov may include a pre-processing unit and/or a processing unit able object may be dynamically adjusted based on the factors Such as similar to the pre-processing unit 208 and processing described herein. unit 210 respectively described in FIG. 2. For example, the According to an aspect of the present invention, methods data processing unit 312 may include an FPGA and/or one or and systems are provided for tracking a target by an imaging more ARM processors. 30 device coupled to a movable object so as to Substantially The data processing unit 312 may be operatively coupled to maintain an expected position and/or size of the target within a command control module 314 configured to control a state one or more images captured by the imaging device. of the movable object. The command control module 314 FIG. 4 illustrates an exemplary process 400 for implement may be configured to performany embodiments of the meth ing target tracking, inaccordance with embodiments. Aspects ods described herein. For instance, the command control 35 of the process 400 may be performed by one or more proces module 314 can be configured to generate control commands sors onboard and/or off-board a movable object as described or signals 316 for controlling a component of the movable herein such as a UAV. Some or all aspects of the process 400 object so as to Substantially track the target based on the (or any other processes described herein, or variations and/or results from the data processing unit 312. combinations thereof) may be performed under the control of The control commands 316 can include commands for the 40 one or more computer/control systems configured with propulsion mechanisms of the movable object to adjust the executable instructions and may be implemented as code spatial disposition, Velocity, and/or acceleration of the mov (e.g., executable instructions, one or more computer pro able object with respect to up to six degrees of freedom in grams or one or more applications) executing collectively on order to correct a detected deviation of the target with respect one or more processors, by hardware or combinations to its position and/or size in one or more of the images. The 45 thereof. The code may be stored on a computer-readable control commands 316 can also include commands for storage medium, for example, in the form of a computer adjusting the state of a carrier so as to adjust the spatial program comprising a plurality of instructions executable by disposition, Velocity, and/or acceleration of a payload (e.g., one or more processors. The computer-readable storage imaging device) carried by the movable object via the carrier. medium may be non-transitory. 
The order in which the opera The control commands 316 can also include commands for 50 tions are described is not intended to be construed as a limi adjusting the state of a carrier so as to adjust the spatial tation, and any number of the described operations may be disposition, Velocity, and/or acceleration of a payload (e.g., combined in any order and/or in parallel to implement the imaging device) carried by the movable object via the carrier. processes. The control commands 316 can also include commands for The process 400 includes obtaining 402 target information adjusting one or more operating parameters of the payload 55 for one or more targets. The target information can be Such as taking still or moving pictures, Zooming in or out, received from a control terminal such as described herein. turning on or off, Switching imaging modes, change image Additionally or alternatively, the target information may be resolution, changing focus, changing depth offield, changing obtained from a component (e.g., memory) onboard a mov exposure time, changing speed of lens, changing viewing able object (e.g., UAV) or a device remote from the movable angle or field of view, or the like. 60 object Such as another movable object (e.g., another UAV), a Alternatively or additionally, any of the above control com server, or the like. In some cases, the target information about mands can be provided directly from the control terminal 304 a target may be provided by the target itself. to the processing system 310. For example, the user may use In various embodiments, the target information may the control terminal 304 to control the Zoom in/out of the include initial target information specific to a particular imaging device while the processing system 310 generates 65 known target or target type information about potentially the control commands for the movable object and/or the car unknown target(s). Specific target information include coor rier. As another example, the user may directly control the dinates (e.g., pixel coordinates), size, location, and other US 9,164,506 B1 23 24 information about a target within one or more images. Spe Once the target information is received, the process 400 cific target information may be generated based on user inter includes identifying 404 a target based on the target informa action with existing image data such as described herein. For tion, e.g., based on target type information. Any Suitable example, specific target information may be generated when image recognition or identification techniques may be used a user selects a particular object as target from one or more including approaches based on CAD-like object models (e.g., images displayed to the user. The specific target information edge detection, primal sketch, Man, Mohan and Nevatia, may include the initial position and/or size of the target as the Lowe, or Olivier Faugeras), appearance-based methods (e.g., target is selected by a remote user from within one or more using edge matching, divide-and-conquer search, greyscale images. 
matching, gradient matching, historgrams of receptive field 10 responses, or large model bases), feature-based methods Target information may also include target type informa (e.g., using interpretation trees, hypothesizing and testing, tion Such as color, texture, dimension, size, location, and/or pose consistency, pose clustering, invariance, geometric any other characteristics about a type or group of potentially hashing, Scale-invariant feature transform (SIFT), or unknown or unidentified targets. Target type information may Speeded Up Robust Features (SURF)), genetic algorithms, be specifically entered by a user. Alternatively, the user may 15 and the like. select a pre-existing target pattern or type (e.g., a black object After the target has been identified, Subsequent images or a round object with a radius greater or less than a certain captured by the imaging device may be monitored to detect value). In some embodiments, the user may provide target 406 a deviation of the target from certain expected character type information by selecting one or more targets from within istics that the target should maintain Such as expected position one or more images. Features or characteristics of the selected and/or size. As discussed above, the expected target informa targets may then be extracted and/or generalized to produce tion may be supplied by a control terminal (e.g., based on user the target type information, which may be used to identify input), a configuration file or memory associated with the other targets with similar features or characteristics. In vari movable object, or from some other sources. The expected ous embodiments, such feature extraction may be performed target information may or may not be the same as the initial by a control terminal, a processing unit on the movable object, 25 target information. The deviation of the target may be or third device. achieved by comparing the target's respective position, size Target information (including specific target information and/or any other suitable characteristics in one or more and target type information) may optionally include expected images with expected characteristics. Any Suitable image characteristics at which the target, if identified, should main recognition or identification techniques such as discussed tain while the target is tracked. For example, the target infor 30 herein may be used. mation may include an expected position of the target as In some embodiments, the expected target information is expressed by absolute or relative coordinates within an considered substantially maintained when the detected devia image. The tracking system may be configured to track the tion falls within certain predefined tolerance values. In such target Such that the target is kept, within a predefined degree cases, the deviation may be considered negligible and no of tolerance, at Substantially the same expected position over 35 corrective adjustment is required. Only when the deviation time. Alternatively or additionally, the target information may exceeds the predefined tolerance value is corrective adjust include an expected size of the target (e.g., as expressed by a ment required. For example, when the current position of the number of pixels occupied by the target). 
The tracking system target is within a predetermined number of pixels from the may be configured to track the target Such that the target is expected coordinates, the deviation may be considered neg kept, within a predefined degree of tolerance, at Substantially 40 ligible and hence no corrective adjustment is required. Simi the same expected size. Such expected target information larly, when the current size of the target is within a predeter may be the same or different from the initial target informa mined number of pixels from the expected size, the deviation tion for a specific target. The initial target information is may be considered negligible. The predefined degree of tol typically used to identify a target. Once the target has been erance may be defined by System parameters, configured by identified, the expected target information may be used to 45 users operating the control terminal, or otherwise defined. detect any deviation from the expected target information so In order to correct the deviation and maintain Substantially that such deviation can be corrected. In some cases, the target the expected characteristics of the target, control signals or information may also include other values such as a time commands can be generated 408 for adjusting the movable value or expiration time indicating a period of time during carrier, and/or imaging device. In some embodiments, devia which the target should be tracked, if identified, a flag indi 50 tions in the position of the target can be corrected via adjust cating whether the target information includes specific target ment to the attitude of the movable object and/or the imaging information or target type information, and the like. device (via the carrier), such as discussed in further detail in In some embodiments, the expected target information at FIG. 6. For example, such adjustment may involve changing which the target is to maintain may be provided by the control angular Velocity of the movable object and/or imaging device terminal. For example, the expected target information may 55 around one or more rotational axes. Deviations in the size of be generated based on user input to the control terminal or the target can be corrected via adjustment to the position of based on a configuration file or data store local or remote to the movable object and/or to the operational parameters of the the control terminal. In some other embodiments, the imagining device, such as discussed in further detail in FIG. 7. expected target information may be provided by the movable For example, such adjustment may involve changing linear object. For example, the movable object may be configured, 60 velocity of the movable object along an axis. Alternatively or by default, to keep a target at Substantially the center of an additionally, the adjustment may involve changing the Zoom, image, or at around particular coordinates of the image. As focus, or other characteristics associated with the imaging another example, the movable object may be configured, by device. default, to keep the target as captured by the imaging device, In some embodiments, the adjustment may be limited by at a particular size. In yet some other embodiments, the 65 constraints imposed by System configuration, by the user expected target information may be provided by Some other operating a control terminal, or by other entities. Examples of objects or device external to the movable object. 
Such constraints may include maximum and/or minimum US 9,164,506 B1 25 26 limit for rotation angles, angular and/or linear speed, and the the movable object 502 around the X,Y, and Z axes can be like for the movable object (e.g., the propulsion system expressed as (), (), and (02, respectively. The movable thereof), the carrier (e.g., an actuation member thereof), the object 502 may also be capable of translational movements imaging device, or the like. Such threshold values may be 528,526, and 530 along the X,Y, and Z axes, respectively. used to limit the range of the adjustment. For example, an 5 The linear velocities of the movable object 502 along the X, adjustment involving the angular speed of the movable object Y, and Z axes can be expressed as V, V, and V2, and/or the imaging device relative to the movable object (via respectively. a carrier) around a certain axis may be capped by a maximum In the exemplary configuration, the payload 506 is coupled angular speed that is allowed for the movable object and/or to the movable object 502 via a carrier 504. The carrier 504 the carrier. As another example, an adjustment involving the 10 may be capable of causing the payload 506 to move relative to linear speed of the movable object and/or the imaging device the movable object around and/or along up to three orthogo relative to the movable object (via a carrier) may be capped by nal axes, X (pitch) 516, Y. (yaw) 514 and Z (roll) 518. The a maximum linear speed that is allowed for the movable X, Y, and Z axes may be respectively parallel to the X,Y, object and/or the carrier. In some embodiments, such limits and Z axes. In some embodiments, where the payload is an may be predetermined and depend on the particular configu 15 imaging device including an optical module 507, the roll axis ration of the movable object, carrier, and/or the imaging Z. 518 may be substantially parallel to an optical path or device. In some embodiments, the limits may be configurable optical axis for the optical module 507. The optical module (e.g., by a manufacturer, administrator, or user). 507 may be optically coupled to an image sensor such as In some embodiments, warning indicators may be pro described herein to capture images. The carrier 504 may vided when the adjustment is modified according to the con cause the carrier 506 to rotate around up to three orthogonal straints described above (e.g., when the angular speed of the axes, X (pitch) 516, Y. (yaw) 514 and Z (roll) 518 based on movable object and/or the carrier around a certain axis is control signals provided to actuators associated with the car capped by a maximum angular speed that is allowed for the rier such as electric motors. The rotations around the three movable object and/or the carrier). Examples of such warning axes can be referred to as the pitch rotation 534, yaw rotation indicators may include textual, audio (e.g., Siren or beeping 25 532, and roll rotation 536, respectively. The angular velocities Sound), visual (e.g., certain color of light or flashing light), of the payload 506 around the X, Y, and Z axes can be kinetic (e.g., vibration or movement), any other Suitable types expressed as (), (), and (02, respectively. The carrier 504 of signals. Such warning indicators may be provided directly may also cause the payload 506 to engage in translational by the movable object or a component thereof. 
Alternatively movements 540, 538, and 542, along the X, Y, and Z axes or additionally, warning indicators may be provided by the 30 relative to the movable object 502. The linear velocity of the control terminal (e.g., via the display). In the latter case, the payload 506 along the X,Y, and Z axes can be expressed as control terminal may provide the warning indicators based on V,V2, and V2, respectively. signals from the movable object. In some embodiments, the carrier 504 may only permit the In some embodiments, the adjustment may be performed in payload 506 to move around and/or along a subset of the three Substantially real time as the movable object is executing 35 axes X, Y, and Z, relative to the movable object 502. For user-provided navigation commands or a predetermined instance, the carrier 504 may only permit the payload 506 flight path, and/or as the imaging device is capturing one or rotate around X,Y,Z or any combination thereof, without more images. In some embodiments, the adjustment may be allowing the payload 506 to move along any of the axes. For generated based on other information Such as sensing data example, the carrier 504 may permit the payload 506 to rotate acquired by one or more sensors onboard the movable object 40 only around one of the X, Y, and Z axes. The carrier 504 (e.g., proximity sensor, or GPS sensor). For example, position may permit the payload 506 to rotate only around two of the information of the target being tracked may be obtained by a X, Y, and Z axes. The carrier 504 may permit the payload proximity sensor and/or provided by the target itself (e.g., 506 to rotate around all three of the X, Y, and Z axes. GPS location). Such position information may be used, in In some other cases, the carrier 504 may only permit the addition to the detected deviation, to generate the adjustment 45 payload 506 move along X, Y, or Z axis, or any combina control signals to track the target such as described herein. tion thereof, without allowing the payload 506 to rotate In various embodiments, detection 406 of deviation from around any of the axes. For example, the carrier 504 may expected target information and/or generation 408 of control permit the payload 506 to move along only one of the X,Y, commands to correct the detected deviation may be repeated and Z axes. The carrier 504 may permit the payload 506 to for a predefined or indefinite period of time. In some embodi 50 move along only two of the X, Y, and Z axes. The carrier ments, such deviation detection and/or control command gen 504 may permit the payload 506 to move along only all three eration may be performed at certain intervals (e.g., every 0.01 of the X, Y, and Z axes. second, 0.1 second, 0.2 second, 0.5 second, or 1 second). In yet some other embodiments, the carrier 504 may allow FIG. 5 illustrates an exemplary configuration 500 of a the payload 506 perform both rotational and translational movable object, carrier, and payload, in accordance with 55 movement relative to the movable object. For example, the embodiments. The configuration 500 is used to illustrate carrier 504 may be configured to allow the payload 506 to exemplary types of adjustment to the movable object 502 move along and/or rotate around one, two, or three of the X and/or payload 506 that may be used to track a target. The Y2, and Z axes. movable object 502 and the payload 506 can include any In some other embodiments, the payload 506 may be embodiments discussed herein. 
For example, the movable 60 coupled to the movable object 502 directly without a carrier object 502 can include a UAV and the payload 506 can 504, or the carrier 504 may not permit the payload 506 to include an imaging device. move relative to the movable object 502. In such cases, the The movable object 502 may be capable of rotating around attitude, position and/or orientation of the payload 506 is up to three orthogonal axes, such as X (pitch) 510, Y (yaw) fixed relative to the movable object 502. 508 and Z (roll) 512 axes. The rotations around the three axes 65 In various embodiments, adjustment to attitude, orienta can be referred to as the pitch rotation 522, yaw rotation 520, tion, and/or position of the payload 506 may be achieved, and roll rotation 524, respectively. The angular velocities of collectively or individually, via suitable adjustment to the US 9,164,506 B1 27 28 movable object 502, the carrier 504, and/or the payload 506. no greater than Zero (C.s0). In some embodiments, C. can be For example, a rotation of 60 degrees around a given axis used to map a calculated pixel value to a corresponding con (e.g., yaw axis) for the payload may be achieved by a 60-de trol lever amount or sensitivity for controlling the angular gree rotation by the movable object alone, a 60-degree rota Velocity around a certain axis (e.g., yaw axis). In general, the tion by the payload relative to the movable object as effectu control lever may be used to control the angular or linear ated by the carrier, or a combination of 40-degree rotation by movement of a controllable object (e.g., UAV or carrier). the movable object and a 20-degree rotation by the payload Greater control lever amount corresponds to greater sensitiv relative to the movable object. ity and greater speed (for angular or linear movement). In Similarly, a translational movement for the payload may be Some embodiments, the control lever amount or a range achieved, collectively or individually, via adjustment to the 10 thereof may be determined by configuration parameters of the movable object 502 and the carrier 504. The desired adjust flight control system for a UAV or configuration parameters ment may, additionally or alternatively, be achieved by of a control system for a carrier. The upper and lower bounds adjustment to the operational parameters of the payload. Such of the range of the control lever amount may include any operational parameters of the payload may include, for arbitrary numbers. For example, the range of the control lever example, a Zoom in/out level or a focal length of an imaging 15 amount may be (1000, -1000) for one flight control system device. and (-1000, 1000) for another flight control system. FIG. 6 illustrates an exemplary tracking method for main For instance, assume that the images have a width of taining an expected position of a target, in accordance with W=1024 pixels and a height of H=768 pixels. Thus, the size embodiments. An exemplary image 600 is shown Such as of the images is 1024*768. Further assume that the expected captured by an imaging device carried by a movable object. position of the target has a uo 512. Thus, (u-uo)e(-512512). Assume that the image has a width of W pixels and a height of Assume that the range of the control lever amount around the H pixels (where W and Hare positive integers). A position yaw axis is (-1000, 1000), then the maximum control lever within the image can be defined by a pair of coordinates along amount or maximum sensitivity is 1000 and C=1000/512. 
a horizontal axis 601 (along the width of the image) and a Thus, the value of a can be affected by image resolution or vertical axis 603 (along the height of the image), where the 25 size provided by the imaging device, range of the control lever upper left corner of image has coordinates (0, 0) and the lower amount (e.g., around a certain rotation axis), the maximum right corner of the image has coordinates (W. H). control lever amount or maximum sensitivity, and/or other Assume that a target, as captured in the image 600, is factors. located at position P (u, v) 602, and the expected position of For instance, when the rotation is achieved by rotation of the target is Po (u, v) 604 that is different from P 602. In 30 the movable object, the Y axis 606 of FIG. 6 corresponds to Some embodiments, the expected position of the target Po (uo, the Y axis 508 for the movable object as illustrated in FIG.5 vo) may be near the center of the image, such that u=W/2. and the overall angular velocity of the field of view () is and/or vo-H/2. In other embodiment, the expected position of expressed as the angular velocity () for the movable object: the target may be located anywhere else within the image () (D-C*(u-tto), where ceR (2) (e.g., off-center). In various embodiments, the expected posi 35 tion of the target may or may not be the same as the initial In the equation (2), C. is a constant that is defined based on position of the target (e.g., as provided by the control termi the configuration of the movable object. In some embodi nal). Assuming that the current position P is deviated from the ments, C. is greater than Zero (CDO). The C. can be defined expected position Po Such that the deviation exceeds a prede similar to the a discussed above. For example, the value of C. termined threshold (such as expressed by a AX from up, and a 40 may be defined based on image resolution or size and/or range Ay from Vo), then an adjustment is required to bring the target of control lever amount for the movable object (e.g., around position from P to close to the expected position P. the yaw axis). In some embodiments, the deviation from the expected Similarly, when the rotation is achieved by the rotation of target position can be used to derive one or more angular the payload relative to the movable object (e.g., via the car velocities for rotating the field of view of the imaging device 45 rier), the Y axis 606 of FIG. 6 corresponds to the Y axis 514 around one or more axes. For example, deviation along the for the payload as illustrated in FIG. 5 and the overall angular horizontal axis 601 of the image (e.g., between u and u) may Velocity of the field of view () is expressed as the angular be used to derive an angular velocity () 610 for rotating the velocity () for the payload relative to the movable object: field of view of the imaging device around the Y (yaw) axis () (D-C*(u-tto), where ceR (3) 606, as follows: 50 In the equation (3), C is a constant that is defined based on co-C*(u-uo), where ceR (real numbers) (1) the configuration of the carrier and/or payload. In some The rotation around the Y axis for the field of view of an embodiments, C is greater than Zero (CO). The C can be imaging device may be achieved by a rotation of the movable defined similar to the a discussed above. For example, the object, a rotation of the payload (via a carrier) relative to the 55 value of C may be defined based on image resolution or size movable object, or a combination of both. 
In some embodi and/or range of control lever amount for the carrier (e.g., ments, adjustment to the payload may be selected when around the yaw axis). adjustment to the movable object is infeasible or otherwise In general, the angular velocity of the field of view around undesirable, for example, when the navigation path of the the Y (yaw) axis 606 can be expressed as a combination of the movable object is predetermined. In the equation (1), a is a 60 angular velocity () for the movable object and the angular constant that may be predefined and/or calibrated based on velocity co, for the payload relative to the movable object, the configuration of the movable object (e.g., when the rota Such as the following: tion is achieved by the movable object), the configuration of the carrier (e.g., when the rotation is achieved by the carrier), (4) or both (e.g., when the rotation is achieved by a combination 65 In the equation (4), either () or () may be Zero. of the movable object and the carrier). In some embodiments, As illustrated herein, the direction of the rotation around a is greater than Zero (CDO). In other embodiments, a may be theY (yaw) axis may depend on the sign of u-uo. For instance, US 9,164,506 B1 29 30 if the expected position is located to the right of the actual amount (e.g., around a certain rotation axis), the maximum position (as illustrated in FIG. 6), then u-up-0, and the field control lever amount or maximum sensitivity, and/or other of view needs to rotate in a counter-clockwise fashion around factors. the yaw axis 606 (e.g., pan left) in order to bring the target to For instance, when the rotation is achieved by rotation of the expected position. On the other hand, if the expected 5 the movable object, the X axis 608 of FIG. 6 corresponds to position is located to the left of the actual position, then the X axis 510 for the movable object as illustrated in FIG.5 u-up-0, and the field of view needs to rotate in a clockwise and the angular velocity of the field of view () is expressed as fashion around the yaw axis 606 (e.g., pan right) in order to the angular Velocity () for the movable object: bring the target to the expected position. 10 (ox-ox1-p, *(v-vo), where feR (6) As illustrated herein, the speed of rotation (e.g., absolute In the equation (6), f is a constant that is defined based on value of the angular Velocity) around a given axis (e.g., the Y the configuration of the movable object. In some embodi (yaw) axis) may depend on the distance between the expected ments, is greater than Zero (BDO). The can be defined similar and the actual position of the target along the axis (i.e., to the B discussed above. For example, the value of may be lu-uo). The further the distance is, the greater the speed of 15 defined based on image resolution or size and/or range of rotation. Likewise, the closer the distance is, the slower the control lever amount for the movable object (e.g., around the speed of rotation. When the expected position coincides with pitch axis). the position of the target along the axis (e.g., u ulo), then the Similarly, when the rotation is achieved by the rotation of speed of rotationaround the axis is Zero and the rotation stops. the payload relative to the movable object (e.g., via the car The method for adjusting the deviation from the expected rier), the X axis 608 of FIG. 6 corresponds to the X axis 516 target position and the actual target position along the hori for the payload as illustrated in FIG. 
5 and the angular veloc Zontal axis 601, as discussed above, can be applied in a similar ity of the field of view () is expressed as the angular Velocity fashion to correct the deviation of the target along a different co, for the payload relative to the movable object: axis 603. For example, deviation along the vertical axis 603 of (ox-oxo-p, *(v-vo), where BeR (6) the image (e.g., between V and Vo) may be used to derive an 25 angular velocity Co. 612 for the field of view of the imaging In the equation (6), B is a constant that is defined based on the configuration of the carrier and/or payload. In some device around the X (pitch) axis 608, as follows: embodiments, B is greater than Zero (B-0). The B can be co-B(v-vo), where BeR (5) defined similar to the B discussed above. For example, the 30 value off may be defined based on image resolution or size The rotation around the X axis for the field of view of an and/or range of control lever amount for the movable object imaging device may be achieved by a rotation of the movable (e.g., around the pitch axis). object, a rotation of the payload (via a carrier) relative to the In general, the angular velocity of the field of view around movable object, or a combination of both. Hence, in the the X (pitch) axis 608 can be expressed as a combination of equation (5), B is a constant that may be predefined and/or 35 the angular Velocity () for the movable object and the angu calibrated based on the configuration of the movable object lar Velocity () for the payload relative to the movable object, (e.g., when the rotation is achieved by the movable object), Such as the following: the configuration of the carrier (e.g., when the rotation is achieved by the carrier), or both (e.g., when the rotation is (7) achieved by a combination of the movable object and the 40 In the equation (7), either () or co, may be zero. carrier). In some embodiments, B is greater than Zero (BDO). As illustrated herein, the direction of the rotation around In other embodiments, B may be no greater than Zero (Bs()). In the X (yaw) axis may depend on the sign of v-vo. For Some embodiments, B can be used to map a calculated pixel instance, if the expected position is located above of the actual value to a corresponding control lever amount for controlling position (as illustrated in FIG. 6), then v.-vo-0, and the field the angular Velocity around a certain axis (e.g., pitch axis). In 45 of view needs to rotate in a clockwise fashion around the pitch general, the control lever may be used to control the angular axis 608 (e.g., pitch down) in order to bring the target to the or linear movement of a controllable object (e.g., UAV or expected position. On the other hand, if the expected position carrier). Greater control lever amount corresponds to greater is located to below the actual position, then v-vo-0, and the sensitivity and greater speed (for angular or linear move field of view needs to rotate in a counter-clockwise fashion ment). In some embodiments, the control lever amount or a 50 around the pitch axis 608 (e.g., pitch up) in order to bring the range thereofmay be determined by configuration parameters target to the expected position. of the flight control system for a UAV or configuration param As illustrated herein, the speed of rotation (e.g., absolute eters of a carrier control system for a carrier. 
For instance, when the rotation is achieved by rotation of the movable object, the X axis 608 of FIG. 6 corresponds to the X axis 510 for the movable object as illustrated in FIG. 5, and the angular velocity of the field of view ωX is expressed as the angular velocity ωX₁ for the movable object:

ωX = ωX₁ = β₁*(v−v₀), where β₁∈ℝ   (6)

In the equation (6), β₁ is a constant that is defined based on the configuration of the movable object. In some embodiments, β₁ is greater than zero (β₁>0). The β₁ can be defined similarly to the β discussed above. For example, the value of β₁ may be defined based on the image resolution or size and/or the range of control lever amount for the movable object (e.g., around the pitch axis).

Similarly, when the rotation is achieved by the rotation of the payload relative to the movable object (e.g., via the carrier), the X axis 608 of FIG. 6 corresponds to the X axis 516 for the payload as illustrated in FIG. 5, and the angular velocity of the field of view ωX is expressed as the angular velocity ωX₂ for the payload relative to the movable object:

ωX = ωX₂ = β₂*(v−v₀), where β₂∈ℝ   (6)

In the equation (6), β₂ is a constant that is defined based on the configuration of the carrier and/or the payload. In some embodiments, β₂ is greater than zero (β₂>0). The β₂ can be defined similarly to the β discussed above. For example, the value of β₂ may be defined based on the image resolution or size and/or the range of control lever amount for the carrier (e.g., around the pitch axis).

In general, the angular velocity of the field of view around the X (pitch) axis 608 can be expressed as a combination of the angular velocity ωX₁ for the movable object and the angular velocity ωX₂ for the payload relative to the movable object, such as the following:

ωX = ωX₁ + ωX₂   (7)

In the equation (7), either ωX₁ or ωX₂ may be zero.

As illustrated herein, the direction of the rotation around the X (pitch) axis may depend on the sign of v−v₀. For instance, if the expected position is located above the actual position (as illustrated in FIG. 6), then v−v₀>0, and the field of view needs to rotate in a clockwise fashion around the pitch axis 608 (e.g., pitch down) in order to bring the target to the expected position. On the other hand, if the expected position is located below the actual position, then v−v₀<0, and the field of view needs to rotate in a counter-clockwise fashion around the pitch axis 608 (e.g., pitch up) in order to bring the target to the expected position.

As illustrated herein, the speed of rotation (e.g., the absolute value of the angular velocity) depends on the distance between the expected and the actual position of the target (i.e., |v−v₀|) along a given axis (e.g., the X (pitch) axis). The further the distance is, the greater the speed of rotation. The closer the distance is, the slower the speed of rotation. When the expected position coincides with the position of the target (e.g., v=v₀), the speed of rotation is zero and the rotation stops.

In some embodiments, the values of the angular velocities as calculated above may be constrained or otherwise modified by various constraints of the system. Such constraints may include the maximum and/or minimum speed that may be achieved by the movable object and/or the imaging device, the range of control lever amount or the maximum control lever amount or maximum sensitivity of the control system for the movable object and/or the carrier, and the like. For example, the rotation speed may be the minimum of the calculated rotation speed and the maximum speed allowed.

In some embodiments, warning indicators may be provided when the calculated angular velocities need to be modified according to the constraints described herein. Examples of such warning indicators may include textual, audio (e.g., a siren or beeping sound), visual (e.g., a certain color of light or a flashing light), mechanical, and any other suitable types of signals. Such warning indicators may be provided directly by the movable object, carrier, payload, or a component thereof. Alternatively or additionally, warning indicators may be provided by the control terminal (e.g., via the display). In the latter case, the control terminal may provide the warning indicators based on signals from the movable object.
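One way to realize the combination ωX = ωX₁ + ωX₂ of equation (7), together with the clamping and warning behavior just described, is sketched below. The gimbal-first allocation policy, the function names, and the numeric limits are illustrative assumptions, not requirements of this disclosure:

    # Illustrative sketch: split a total pitch rate between the carrier
    # (gimbal) and the movable object (UAV), clamping each contribution per
    # "the rotation speed may be the minimum of the calculated rotation
    # speed and the maximum speed allowed".
    def split_pitch_rate(omega_total, gimbal_max, uav_max):
        def clamp(x, limit):
            return max(-limit, min(limit, x))
        omega_x2 = clamp(omega_total, gimbal_max)          # carrier share
        omega_x1 = clamp(omega_total - omega_x2, uav_max)  # UAV covers the rest
        saturated = abs(omega_x1 + omega_x2) < abs(omega_total) - 1e-9
        return omega_x1, omega_x2, saturated

    # Example: a request of 1.2 rad/s against limits of 1.0 (gimbal) and
    # 0.1 (UAV) yields 1.0 + 0.1 = 1.1 rad/s with saturated == True, which
    # could trigger a textual, audio, or visual warning indicator.
    omega_x1, omega_x2, warn = split_pitch_rate(1.2, gimbal_max=1.0, uav_max=0.1)

Consistent with equation (7), either contribution may be zero; for example, with gimbal_max=0 the carrier term vanishes and the movable object supplies whatever rate its own limit allows.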
FIG. 7 illustrates an exemplary tracking method for maintaining an expected size of a target, in accordance with embodiments. An exemplary image 700 is shown, such as one captured by an imaging device carried by a movable object. Assume that a target 702 is captured within the image 700. The actual size of the target within the image can be s pixels (such as calculated as the product of the width of the target and the height of the target). The expected target size S may be smaller (e.g., the expected target may be represented by 704 and S=s₀) or larger (e.g., the expected target may be represented by 705 and S=s₁) than the actual size s. The expected size of the target may or may not be the same as the initial size of the target (e.g., as provided by the control terminal). Assuming that the current size s deviates from the expected size s₀ or s₁ such that the deviation exceeds a predetermined threshold (such as a predefined Δs pixels), then an adjustment is required to bring the target size close to the expected size s₀ or s₁.

Although the display area of the image and target is shown as rectangles, this is for illustrative purposes only and is not intended to be limiting. Rather, the display area of the image and/or target may be of any suitable shape in various embodiments, such as circles, ovals, polygons, and the like. Likewise, although the areas discussed herein are expressed in pixels, these are for illustrative purposes only and are not intended to be limiting. In other embodiments, the areas may be expressed in any suitable units, such as megapixels, mm², cm², inch², and the like.

In some embodiments, the deviation from the expected target size can be used to derive one or more linear velocities for the movable object and/or the imaging device along one or more axes. For example, the deviation between the actual target size s and the expected target size S (e.g., s₀ or s₁) can be used to determine a linear velocity V for moving the movable object along a Z (roll) axis 710, as follows:

V = δ*(1−s/S), where δ∈ℝ   (8)

In the equation (8), δ is a constant that is defined based on the configuration of the movable object. If the actual size s of the target is smaller than the expected size S, then V>0 and the movable object moves toward the target so as to increase the size of the target as captured in the images. On the other hand, if the actual size s of the target is larger than the expected size S, then V<0 and the movable object moves away from the target so as to reduce the size of the target as captured in the images.

For instance, assume that the images have a width of W=1024 pixels and a height of H=768 pixels. Thus, the size of the images is 1024*768 pixels. Assume that the range of the control lever amount for controlling the linear velocity is (−1000, 1000). In an exemplary embodiment, δ=−1000 when s/S=3 and δ=1000 when s/S=1/3.

In some embodiments, the values of the velocities as calculated above may be constrained or otherwise modified by various constraints of the system. Such constraints may include the maximum and/or minimum speed that may be achieved by the movable object and/or the imaging device, the maximum sensitivity of the control system for the movable object and/or the carrier, and the like. For example, the speed for the movable object may be the minimum of the calculated speed and the maximum speed allowed.
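Equation (8), the predefined threshold Δs, and the speed constraint above can be combined in a short sketch; the function name, the default gain, and the numeric limits are illustrative assumptions, not values from this disclosure:

    # Illustrative sketch: V = delta*(1 - s/S) along the Z (roll) axis, with
    # a dead band of ds pixels and a clamp to the maximum allowed speed.
    # All defaults here are assumptions, not disclosed values.
    def roll_axis_velocity(s_actual, s_expected, delta=500.0, v_max=10.0, ds=50.0):
        if abs(s_actual - s_expected) <= ds:           # within the threshold:
            return 0.0                                 # no adjustment required
        v = delta * (1.0 - s_actual / s_expected)      # V > 0: move toward target
        return max(-v_max, min(v_max, v))              # V < 0: move away; clamp

With delta=500, a ratio s/S=3 gives V=−1000 before the clamp (the target appears too large, so the movable object backs away), while s/S=1/3 gives V≈+333 (it closes in); the final line applies the minimum-of-calculated-and-allowed rule described above.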
Alternatively or additionally, the deviation between the actual target size and the expected target size can be used to derive an adjustment to the operational parameters of the imaging device, such as a zoom level or focal length, in order to correct the deviation. Such an adjustment to the imaging device may be necessary when adjustment to the movable object is infeasible or otherwise undesirable, for example, when the navigation path of the movable object is predetermined. An exemplary focal length adjustment F can be expressed as:

F = γ*(1−s/S), where γ∈ℝ   (9)

where γ is a constant that is defined based on the configuration of the imaging device. In some embodiments, γ is greater than zero (γ>0). In other embodiments, γ is no greater than zero (γ≤0). The value of γ may be defined based on the types of lenses and/or imaging devices.

If the actual size s of the target is smaller than the expected size S, then F>0 and the focal length increases by |F| so as to increase the size of the target as captured in the images. On the other hand, if the actual size s of the target is larger than the expected size S, then F<0 and the focal length decreases by |F| so as to reduce the size of the target as captured in the images. For example, in an embodiment, γ=10. This means that, for example, when the actual size of the target is double the expected size S (i.e., s/S=2), the focal length should be decreased by 10 mm accordingly (i.e., F=10*(1−2/1)=−10), and vice versa.

In some embodiments, the adjustment to the operational parameters of the imaging device, such as the focal length, may be constrained or otherwise modified by various constraints of the system. Such constraints may include, for example, the maximum and/or minimum focal lengths that may be achieved by the imaging device. As an example, assume that the focal length range is (20 mm, 58 mm) and that the initial focal length is 40 mm. Then, when s>S, the focal length should be decreased according to equation (9); and when s