Meetings

10/22/2020 - Thursday w/ Dr. Man


  • Get spectrum analyzer - ask Dr. Lu

  • Clean camera lens (smudge) - a bigger shutter size/aperture may help

  • All the fixed camera needs to do is detect changes

  • Should be able to detect various elements in the image

  • Figure out the range at which image detection works best

  • Simplest attack - jamming attack - put out white noise when you detect the drone so the controller cannot control it

  • Another option is to trick the drone into thinking it is in a no-fly zone - maybe have it land or fly home

  • Need to figure out how robust the connection between the controller and drone is

  • Start figuring out what for sure can be done, and then work from there to see what else is possible

  • May be able to sell this as a product if we can reach the point of being able to control the drone (either for protection or entertainment purposes)

10/27/2020 - Tuesday w/ Dr. Lu


  • Ask about expense sheet

  • Gateway North and Burchard 123 have spectrum analyzers

  • Dr. Lu is on campus Monday and Wednesday, other days we can be let in by others (Cecilia, Nicholas Dodd)

  • ABS (fish-tank area) can be a place to fly the drone

  • Outdoors at Stevens may be restricted

  • We already got 2 credits and a final grade for TG 403; we are now enrolled in IDE 401, but it is not on our transcript

  • We should not be doing a 10-page project proposal; we should be more focused on creating a minimum viable product

10/29/2020 - Thursday w/ Dr. Lu


11/3/2020 - Tuesday with Sam Salloum and Allan Wolke (of Tektronix)

  • Typically loans out devices for two weeks at a time

    • we may need to give a specific time frame that we are looking to borrow the device for

  • 4 main settings

    • the settings live in the ribbon at the bottom of the display

    • think of the instrument as having a 40 MHz bandwidth

  • Setting up the reference level

    • the reference level in an SA (spectrum analyzer) is effectively the value at the top of your display

  • Common error: changing the value in the top left changes the display offset, not the reference level (RefLev)

  • Italicized text means a value is being set by something else

  • You can have the spectrum analyzer work in real time (100,000s of spectra per second, so you can see the transient spectra)

  • Displays: General Signal Viewing > DPX

    • composite result of overlaying 10,000 - 20,000 acquisitions per second

    • color shows how frequently something happened

    • Gaussian shaped: 802.11b

    • Bart Simpson head: 802.11g/n

  • Gathering Data over time (Displays: General Signal Viewing > Time overview):

    • Shows amplitude over time

    • Colored bars on the top:

      • Red: Spectrum time

      • Blue: Analysis region

  • Recommend Chirp z-transform instead of FFT (rough sketch at the end of these notes)

    • this is a function of time (I believe something like the sampling time)

  • Small T on bottom: trigger point

    • Trigger Control panel within icon display on the top

    • Triggered instead of Free Run

  • Seems like most if not all of the displays that we want are going to be in General Signal Viewing

  • Spectrogram:

    • new data collected is on the bottom (with older data stacked on top of it)

    • Adding Marker: allows you to take a specific spectrum and look at it in the time overview

    • Time/Div is the time between each tick mark on the left-hand side of the spectrogram

      • The smaller you make the Time/Div, the more overlap you have

  • Every display has a settings panel associated with it (brought up by the settings control panel in the icon bar)

    • seems like if you click on a specific part of the display, the settings tab will be brought up with the corresponding settings

  • Displays: General Signal Viewing > RF I&Q vs time:

    • Pan, Zoom, reset scale

  • Saving Data:

    • Acq data with setup (TIQ): saves data so that it can be recalled later

    • PNG is best for saving an image of the display

    • CSV will save the data in the selected display

  • Preset (top right) brings things back to the default state

  • File > Recall > allows you to recall the data and setup of a saved TIQ file (it is TIQ by default; it can do other formats, but I am not sure which)


  • Drone to handheld video component

    • First thing: what is the drone's signal format?

      • DPX

      • move the antenna close to the drone

        • to increase magnitude

      • set up a trigger based on what we see

      • typical pulse lengths

      • OFDM (if it is not already handled by Tektronix, it will be nearly impossible to demod it)

      • center around 2.4 GHz, and try to narrow from there

  • To transmit our own data

    • youtube.com/w2aew

      • RSA 306
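
  • Rough sketch of the chirp z-transform "zoom" idea from above, assuming SciPy 1.8+ (scipy.signal.zoom_fft); the sample rate, IQ file, and band edges below are placeholders, not values from the meeting:

      import numpy as np
      from scipy.signal import zoom_fft

      fs = 56e6                                  # assumed IQ sample rate
      iq = np.fromfile("capture.iq", dtype=np.complex64)  # hypothetical raw IQ capture

      # Unlike a plain FFT, the chirp z-transform can spend all of its output
      # bins on a narrow band of interest instead of the full +/- fs/2 span.
      f1, f2 = 9e6, 11e6                         # 2 MHz slice to zoom into
      bins = zoom_fft(iq, [f1, f2], m=4096, fs=fs)
      power_db = 20 * np.log10(np.abs(bins) + 1e-12)
      freqs = np.linspace(f1, f2, 4096, endpoint=False)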


11/4/2020 - Lab Meeting in ABS

  • Team was COVID tested as part of an ongoing effort to get into the ABS lab whenever possible

  • Meeting took place in ABS Senior design space

    • RF stations were set up across two spaces

    • Rx and Tx stations were set up and the drone was deployed but NOT flown

    • Rx station utilized the Tektronix RSA 306 spectrum analyzer

    • Tx side utilized a directional antenna and a Pluto SDR, as well as the drone and drone controller

    • IQ samples, spectrum, and signal analysis saved

    • Took measurements for antenna mounting to be modeled later

  • Plans for next week

    • Attempt to get camera to detect drone on the pi

    • Model the Yagi antenna mounting bracket in SolidWorks

11/11/2020 - Lab Meeting in ABS

  • Team was COVID tested

  • Ben received ABS swipe access to enable weekend meetings

  • Meeting took place in ABS senior design space

    • Minicircuits power amplifier was used and was the only major change from the previous week's setup

    • Hardware was delivered and modifications to the base were decided on

  • Team was successful in forcing the drone to land due to interference

    • TX (transmit) antenna was aimed at the handset, not the drone

    • TX still requires more power

    • External power to PLUTO may help

    • Better noise generation in GNU-radio may help

  • Total power roll-up required for system

    • Slip ring no longer an option

    • Antenna, Pis, and stepper motor will have a large current draw at 5 V

    • Considering a dedicated power supply or battery bank at 5 V to help

  • Physical components delivered to machine shop for initial modifications

    • Mounting hole locations still required

    • Hope to have mounting done for Milestone 2

  • Stepper Motor requires a motor driver
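
  • Rough sketch of driving it from a Pi, assuming a STEP/DIR style driver board (e.g. A4988/DRV8825) and RPi.GPIO; pin numbers and timing below are placeholders:

      import time
      import RPi.GPIO as GPIO

      STEP_PIN, DIR_PIN = 20, 21                 # hypothetical BCM pin numbers
      GPIO.setmode(GPIO.BCM)
      GPIO.setup([STEP_PIN, DIR_PIN], GPIO.OUT)

      def rotate(steps, clockwise=True, delay=0.002):
          # Pulse the driver's STEP input; the driver board supplies the coil current
          GPIO.output(DIR_PIN, GPIO.HIGH if clockwise else GPIO.LOW)
          for _ in range(steps):
              GPIO.output(STEP_PIN, GPIO.HIGH)
              time.sleep(delay)
              GPIO.output(STEP_PIN, GPIO.LOW)
              time.sleep(delay)

      rotate(200)                                # one revolution for a 1.8 deg/step motor
      GPIO.cleanup()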

11/12/2020 - Dr. Man Meeting

  • Ben thinks we will fall under FCC Title 15 devices

    • Dr. Man thinks that we are not doing anything illegal as long as the range is not too far

  • We don't need to make this long range, just show that it is possible

  • Goals for milestone 2:

    • image recognition plan

    • meet with Tektronix (Allan and Sam)

  • Make the antenna a phased array

    • instead of moving physically, use electronic beam forming to steer the array

  • Show that the detection systems direct the jamming systems

  • Tell Lu: he should try to pay attention to and promote this idea (I think this is in reference to the entire project)

  • Sending an e-mail to Tektronix and Kevin Lu:

    • about milestone 2

    • learning about noise generation on an SDR (asking Allan and Sam)

    • how we should advertise Tektronix in our report

11/17/2020 - Tuesday with Sam Salloum and Allan Wolke (of Tektronix)

  • Showing progress with spectrum analyzer

  • Showing jamming graphs

  • Explaining goals with project

    • To stop video and/or controls

  • Discussing how we can stop video transmission

  • Rather than adjusting spectrum time in time overview, change to analysis and change analysis length to adjust capture time

  • 1 MHz wide and 2 MHz wide (blue) signals are likely Bluetooth (WiFi is generally 20 MHz wide) or possibly uplink command control signals from controller to drone

  • Collect TIQ files that we can go back and look at

  • If they had to pick a noise generation technique, they would maybe use amplitude modulation, or maybe an OFDM modulation w/ a random noise band

    • Depends on the modulation we are doing

    • Each has ramifications in terms of side-bands

    • Don't want to waste power by dumping signals around band of interest

    • We want a nice sharp rectangular spectrum

  • If we capture a burst and carefully look at it we can determine type of modulation (like if it is OFDM)

  • Sam can provide logos and contact info in a word document

  • Ben - 'We are aware we are making a possibly illegal jammer' 'Do you think we can get away with frequency hopping?'

  • Allan - 'First understand the signal's characteristics so we can specifically target that signal; therefore it isn't technically a broadband jammer and will hopefully not affect other services.' 'You want to make an undetectable signal that looks like the thing you are trying to defeat.'

  • We may need to do some more receiving

  • Consider ways this technology can be taken further down the line

  • Sam extended lease to December 1st

  • If we're nice to Sam we can possibly extend it a few days, not a super strict due date

  • Connor's notes - use analysis length, not spectrum time; waterfall spectrogram plot to find patterns; modulation for moving noise off baseband (amplitude modulation better, or OFDM); want to find a cognitive solution for disruption (frequency following/reactive jamming?)

    • https://wiki.gnuradio.org/index.php/Basic_OFDM_Tutorial

11/19/2020 - Thursday with Dr. Lu and Dr. Man

  • Professor Lu would like us to apply to the L3Harris opportunity - he is confident we will win it.

    • L3Harris can also provide us with a mentor

    • Dr. Lu can also tap into previously unspent account for us

  • Speaking about how long we need the spectrum analyzer from Tektronix

  • We need to send our expense report to get our money back

  • Dan purchased a compute stick (TPU) for the raspberry pi in the hopes of giving the pi more performance capabilities (more FPS)

  • Dr. Man wants to consider applying for a patent

    • He sees our project as very attractive, with potential to be commercialized

    • We will need a patent lawyer

    • We would own the intellectual property

    • BME students usually do this and have seen relative success getting patents

    • Dr. Lu can talk to a BME professor and David Zimmerman to help (Stevens and students can get a cut)

    • We can file a provisional patent first for $75, which gives us a year before publishing to think about it

  • Dr. Lu asks if we plan on sticking together after graduation

  • This is a good idea and we shouldn't let it go

  • Dr. Man suggests trying to train/test the image detection model on a desktop first and then move it to the Pi after

  • Our camera will need a good lens with good resolution, same for the camera used to take the pictures of the drone

  • Object detection with YOLO - https://pjreddie.com/darknet/yolo/

  • Motion detection (instead of deep network) may be better for this project in detecting the drone

    • Maybe first do motion detection and then try to classify the objects that are moving to find the drone (rough sketch below)
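
  • Rough sketch of that two-stage idea, assuming OpenCV's MOG2 background subtractor and a placeholder classify() step (the real classifier and thresholds are still to be decided):

      import cv2

      cap = cv2.VideoCapture(0)                  # webcam index is a placeholder
      subtractor = cv2.createBackgroundSubtractorMOG2(history=120, detectShadows=False)

      def classify(crop):
          # Placeholder: run the drone-vs-other classifier on the moving region
          return "unknown", 0.0

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          mask = subtractor.apply(frame)
          kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
          mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop speckle noise
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          for c in contours:
              if cv2.contourArea(c) < 50:        # ignore tiny motion (leaves, noise)
                  continue
              x, y, w, h = cv2.boundingRect(c)
              label, score = classify(frame[y:y + h, x:x + w])
              cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
          cv2.imshow("motion", frame)
          if cv2.waitKey(1) & 0xFF == ord("q"):
              break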

12/2/2020 - Wednesday data gathering and shop meeting

  • Team was tested for COVID at school

  • Went to the Stevens field to capture more images of the drone in flight

  • Used the RSA306B to capture more RF data of the drone

    • Different distances

    • Up/Down signals

    • Left/Right signals

  • Parts were picked up from Stevens Machine Shop

    • Stand with custom part was collected

    • Gear was attached to stand

    • Motor placement was roughed out

    • Design for mounting brackets decided on

12/3/2020 - Thursday L3 presentation meeting

  • Presentation to secure L3Harris funding was created

    • Additional equipment was listed

    • Presentation was looked at by Dr. Lu

    • Video recording meeting scheduled for this weekend

2/16/2021 - Tuesday with Dr. Man

  • Determine the size of the image to use with the convolutional neural network

  • Use OpenCV; try to find the drone and crop out the rest of the image

  • We can use motion detection to help find drone, things like trees can be an issue, though

  • Not sure yet if we want to classify different types of drones, or also other things like birds (vs. drones)

  • Tuesdays and Thursdays we can meet with Dr. Man

2/17/2021 - Wednesday in Burchard

  • After receiving all of the needed components to start assembling the actual system, the team met in Burchard to start working on the system

  • Tim, Ben, and Dan took pictures and videos of the two drones (Mavic Air 2 and Mavic Mini 2) with the webcam the system will be using (around 400 photos of each and 2 videos)

  • Connor and Dan started setting up the two raspberry pi's we bought (one for object detection and another for signal detection)

  • Tom and Ben started setting up the stepper motor with Tom's raspberry pi

  • Tim and Dan started messing around with OpenCV to get started with motion detection

    • Found a script that we can start off with that does a good job of detecting motion, although it is currently having trouble detecting the drone (it draws boxes around everything except the drone at the moment)

    • Also need to figure out how often to refresh the initial image, as this program works by comparing the initial image to all future images, so if the camera moves it needs to grab a new initial image
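
  • Rough sketch of the approach described above (compare each frame to a stored initial image and re-grab it periodically), assuming OpenCV; the refresh interval and thresholds are guesses:

      import cv2

      cap = cv2.VideoCapture(0)
      reference = None
      frames_since_reset = 0
      RESET_EVERY = 300                          # re-grab the initial image every N frames

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)

          if reference is None or frames_since_reset >= RESET_EVERY:
              reference = gray                   # new "initial image"
              frames_since_reset = 0
              continue
          frames_since_reset += 1

          diff = cv2.absdiff(reference, gray)
          _, thresh = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
          thresh = cv2.dilate(thresh, None, iterations=2)
          contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          for c in contours:
              if cv2.contourArea(c) < 30:        # small drones give small contours
                  continue
              x, y, w, h = cv2.boundingRect(c)
              cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
          cv2.imshow("diff", frame)
          if cv2.waitKey(1) & 0xFF == ord("q"):
              break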

2/23/2021 - Tuesday with Konstantinos and Hong

  • Konstantinos suggested if we have time towards the end to create a frontend UI to display the outputs and to help visualize the system and what it's doing

  • This could be used in the presentation

  • We should start small (using one drone, etc.) and get some simple goals done first, and then we can focus on stretch goals (disrupting video feed, following drone, drone classification)

  • We should include how our project is unique and better than the competition (things we mentioned in our project charter) in our final presentation

  • We should also include diagrams and make sure to document everything

3/9/2021 - Tuesday with Konstantinos

  • Showed Konstantinos our hardware mock-up with all of the components soft-mounted.

  • Also worked through all of the components and their purpose.

  • Konstantinos suggested documenting why we chose each component compared to alternatives (pros and cons of our vs others).

  • Discussed how fans (from the Raspberry Pis) may mess with RF signals, so we will try not to use them (or mount the Pis differently).

  • Ben mentioned we may want to cool the power amplifiers, maybe with a heatsink and fan.

  • Explained how the camera will interface with the system - it will use motion/object detection and machine learning.

  • Tomorrow we will send 2 people to Burchard and the rest to ABS.

    • In ABS, we will drill the holes into the cake plate so we can start mounting the components.

    • In Burchard, we will continue working on the software.

    • After the work in ABS is done we can try to combine the parts.

3/16/2021 - Tuesday with Konstantinos

  • Talked about what we did last week.

    • Some stuff on GitHub, some SDR research, some object detection testing on the Pis.

  • Shared our GitHub repo with Konstantinos to show him some of the code we have been working with.

  • Talked about our schedule and when our presentation will be (likely around May).

4/6/2021 - Tuesday with Hong Man

  • Hong says using SSD is a good idea even though other options may be better (such as using OpenCV - https://learnopencv.com/object-tracking-using-opencv-cpp-python/).

  • Hong also says that if we can at the least have a proof of concept that would be good since it is clear to everyone that whatever we end up making can be improved upon.

  • We made a successful object detection model using TensorFlow 2 and an ssd_mobilenet_v2_fpnlite pre-trained model and it worked pretty well, but now we need to convert it to TFLite to see if it will work well on the Pi.

    • This will involve quantizing the floating-point numbers to integers to optimize the model (rough conversion sketch at the end of these notes).

  • Hong suggests a way to optimize things.

    • We can use interpolation with linear prediction.

    • When the drone is far away the motion will be smaller than when the drone is closer, which we can use to our advantage.

    • We can see what the lowest frame rate is that still gives useful results (we do not need super high frame rates).

    • Since this motion is relatively smooth we do not really need high frame rates.
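
  • Rough sketch of the TFLite conversion mentioned above, assuming a TF2 Object Detection API SavedModel export; the paths are placeholders, and Optimize.DEFAULT here only quantizes the weights (full integer quantization would also need a representative dataset):

      import tensorflow as tf

      converter = tf.lite.TFLiteConverter.from_saved_model("exported_model/saved_model")
      converter.optimizations = [tf.lite.Optimize.DEFAULT]   # quantize float32 weights to int8
      converter.target_spec.supported_ops = [
          tf.lite.OpsSet.TFLITE_BUILTINS,
          tf.lite.OpsSet.SELECT_TF_OPS,          # keep any ops TFLite lacks built-in
      ]
      tflite_model = converter.convert()
      with open("drone_detector.tflite", "wb") as f:
          f.write(tflite_model)

  • Rough sketch of Hong's linear-prediction idea: run the detector only every few frames and fill the gaps with a constant-velocity estimate of the box center (the function and numbers below are made up for illustration):

      import numpy as np

      def predict_center(prev, curr, frames_ahead=1):
          # Constant-velocity (linear) prediction of the next box center.
          # prev, curr: (x, y) detector outputs from the last two processed frames.
          prev, curr = np.asarray(prev, float), np.asarray(curr, float)
          velocity = curr - prev                 # pixels per detection interval
          return curr + frames_ahead * velocity

      # e.g. detector ran on two frames; estimate where the drone is one interval later
      print(predict_center((310, 220), (325, 218)))   # -> [340. 216.]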

4/13/2021 - Tuesday with Konstantinos and Dr. Lu

  • Finalized Project Logo and Banner

  • Drafted project poster for Expo

  • Learned that there may be a 1-person limit to the in-person Expo

    • May switch off or have 2 people if we can get a 6ft table to ourselves

  • Finalized plans to meet in person in Burchard for tomorrow 4/14 after not meeting last week

  • Tried to meet with Konstantinos but the meeting started late and members had to leave

4/17/2021 - Saturday with Tim, Dan, Connor, and Milo

  • Met to test the beta prototype.

  • First tested the drone detection which worked surprisingly well (videos on the homepage).

  • Tested multiple drones at once, which confused the system, but this was fixed by having the system only follow the drone with the highest object detection percentage (one-line sketch at the end of these notes).

  • Took some videos from three different points of view:

    • The webcam

    • The drone

    • Dan with his iPhone

  • Noticed some low-voltage warnings on the Pis but otherwise they seemed to work alright.

  • Also tested signal detection - was able to detect the remote from 200+ feet.
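
  • The highest-percentage fix above is essentially a one-line selection over the detector output; rough sketch, assuming detections come back as (box, score) pairs:

      # detections: list of (bounding_box, score) pairs from the detector (assumed format)
      def pick_target(detections):
          return max(detections, key=lambda d: d[1]) if detections else None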

4/25/2021 - Sunday with Tim, Dan, and Connor

  • Created a Flask website so that we can have a live stream of the drone detection to present.

  • Set it up to open on startup of the Pis.

  • Initially it just had the live webcam feed, but afterwards we made it look prettier.

  • Added links to this site, additional pictures, the QR code to this site, the logo, and continuous reading of a text file to display the signal detection.
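
  • Rough sketch of how the Flask stream can be wired up, assuming an MJPEG route for the webcam feed and a route that re-reads the signal detection text file; route names, the camera index, and the filename are placeholders:

      import cv2
      from flask import Flask, Response

      app = Flask(__name__)
      camera = cv2.VideoCapture(0)               # webcam index is a placeholder

      def mjpeg_frames():
          # Yield JPEG-encoded frames in multipart format for a live <img> stream
          while True:
              ok, frame = camera.read()
              if not ok:
                  break
              ok, jpeg = cv2.imencode(".jpg", frame)
              if not ok:
                  continue
              yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
                     + jpeg.tobytes() + b"\r\n")

      @app.route("/video")
      def video():
          return Response(mjpeg_frames(),
                          mimetype="multipart/x-mixed-replace; boundary=frame")

      @app.route("/signal")
      def signal():
          # Continuously updated by the RF-detection Pi (hypothetical filename)
          with open("signal_status.txt") as f:
              return f.read()

      if __name__ == "__main__":
          app.run(host="0.0.0.0", port=5000)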

4/27/2021 - Tuesday with the Team

  • Finalized everything for the Innovation Expo.

  • Finished writing Milestone 3 and 4 on the site.

  • Planned on creating videos to present for the virtual expo and for L3Harris.

  • Made sure everything is working properly.