Progress Report
Now that we have entered the advanced stages of our senior design
capstone project, we are in full swing on data collection and experiment
implementation. Over the previous seven weeks our focus was on determining
the direction we wanted to take this project and on designing experiments and
scenarios to test the Loc8 program. Our goal is to determine
and demonstrate that Loc8 is a faster method of locating a lost person or item
in a search and rescue scenario than traditional methods. During
our preliminary research and testing we had difficulty demonstrating
the usefulness and speed of the program, as the more
traditional "squinting" method of looking for an item in a photograph
appeared to be faster. However, as we worked with Loc8 more and received proper
training, we began to get a sense of how we needed to use the program
to get the greatest efficiency out of it.
The goal this week was to run the experiments we had designed to officially determine whether or not Loc8 is a faster method of locating an object in an image. To do this, we performed a series of flights in two different locations with two different platforms. The first flight was conducted on 10/13/19 at Purdue Wildlife Area. This flight was flown with a Mavic Pro at 120 m (400 ft) and covered a total area of 34 acres. The flight duration was 10:35, and 24 images were collected at a resolution of 1.54 in/px. The weather was 62 degrees with 12-18 kt winds out of the south and clear skies. For this flight the pilot in command was Lucas Wright and the visual observer was Connor Yoder.
The next two flights were performed at Martell Forest with the Yuneec H520 and covered an area of 47 acres. The first was flown at 120 m (400 ft) with a flight time of 9:29, 21 pictures taken, and a ground sampling distance of 3.5 cm/px. The second was flown at 70 m with a duration of 9:58 and 48 pictures taken at a ground sampling distance of 2.1 cm/px. The conditions for both flights were clear skies, 7 kt of wind from 300 degrees, and a temperature of 57 degrees.
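For reference, ground sampling distance scales roughly linearly with altitude, which is why the 70 m flight produced a finer GSD than the 120 m flight. The short sketch below shows the standard GSD calculation; the sensor and lens numbers in it are hypothetical placeholders rather than the exact specifications of our cameras, so its outputs only approximate the values reported above.

    # Rough ground sampling distance (GSD) check for the flights above.
    # The camera specs here are hypothetical placeholder values, not
    # measured parameters of our aircraft; the point is only to show
    # how flight altitude drives GSD.

    def gsd_cm_per_px(sensor_width_mm, focal_length_mm, image_width_px, altitude_m):
        # GSD (cm/px) = sensor width * altitude / (focal length * image width)
        return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

    # Example: a small 1-inch-class sensor flown at the two altitudes we used.
    for altitude in (120, 70):
        print(altitude, "m ->", round(gsd_cm_per_px(13.2, 8.8, 5472, altitude), 2), "cm/px")

With these placeholder specs the sketch gives roughly 3.3 cm/px at 120 m and 1.9 cm/px at 70 m, which is in line with the 3.5 cm/px and 2.1 cm/px reported for the two Martell Forest flights.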
10/14/19 Metadata
Our next step was to compare the traditional squinting method against
the Loc8 program. Squinting is the process of locating an object in an
image by looking for it directly. This involves zooming into the image
until about one quarter of the whole image fills the screen and scanning that
quarter for the object. The process is repeated for the other three quarters of
the image, and then on subsequent images, until the object is
located. The squinters for this experiment were Luke Hull and Ryan
Riley. This week we performed the squinting on the 10/13/19 flight,
and the results are as follows.
This flight included three sets of items that needed to be found. The first
was a grey shirt with black shorts, the second was a blue shirt with
blue jeans, and the third was a white and blue striped towel. Once the
squinters began sorting through the images, a timer was started and ran
until both squinters determined they had looked at the
images long enough. Ryan stopped at 10:30 and Luke stopped at 10:59.
Both Ryan and Luke found the blue shirt and jeans at 9:07. The images
below show the objects the squinters were looking for.
Image 1: SAR outfit
Image 2: SAR Shirt
Image 3: SAR towel
Image 4: SAR outfit
We noticed a glaring issue with Loc8 this week that made
actually using the software in a realistic SAR scenario nearly impossible.
Essentially, Loc8 seems to be treating the color range values as three
independent pairs of numbers, one pair per channel, rather than as two
complete triples. Image 5 below illustrates this issue.
Image 5: Loc8 color range error; the start and end values are flipped for any RGB channel where the start value is higher than the end value
When we choose the range of colors we want Loc8 to search
for, we specify a starting and an ending color; together they denote the range to search.
Shown in the red box labeled 1, the start color should be 0, 122, 133. Shown in
the box labeled 2, the end color should be 68, 0, 187. We were searching for a
blue/teal shirt, so we wanted to include a fairly encompassing range of colors
around the shirt's color. The box labeled 3 should list the colors as two triples:
the start color in RGB order, then the end color in RGB order.
However, Loc8 appears to list each of the red, green, and blue channels with
whichever value is lower as the start and whichever is higher as
the end. As a result, the box labeled 3 has inverted the green start and end
values. This changes the range being searched drastically: the color range
effectively wraps around the color wheel backwards and includes almost the entirety
of the green range.
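The sketch below illustrates the behavior we believe we are seeing. The helper names are ours, not Loc8's; they only contrast keeping the start and end colors as whole RGB triples against re-sorting each channel independently, which is what the box labeled 3 appears to show.

    # Intended behavior: treat the start and end colors as whole triples.
    def intended_range(start_rgb, end_rgb):
        return start_rgb, end_rgb

    # Observed behavior: per channel, the lower value becomes the "start"
    # and the higher value becomes the "end", regardless of which color
    # it originally belonged to.
    def observed_range(start_rgb, end_rgb):
        start = tuple(min(s, e) for s, e in zip(start_rgb, end_rgb))
        end = tuple(max(s, e) for s, e in zip(start_rgb, end_rgb))
        return start, end

    start, end = (0, 122, 133), (68, 0, 187)
    print(intended_range(start, end))   # ((0, 122, 133), (68, 0, 187))
    print(observed_range(start, end))   # ((0, 0, 133), (68, 122, 187)) -- green flipped

With our start and end colors, only the green channel has a start value higher than its end value, so only the green bounds get swapped, and the searched range balloons to cover nearly all greens.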
Obviously, this is a major bug in the program, and it has
hindered us fairly significantly for the time being. We have reported this
bug, with supporting evidence, to the creators of Loc8 and are waiting to hear back from
them. We had not noticed the bug before because the color ranges we had used in
the past happened to have all of the start values lower than the end
values, so the range was not altered.