Storage
eMMC module (optional industrial-grade, high-performance eMMC module; 16 GB/32 GB/64 GB/128 GB available)
μSD card (μSD slot supports cards up to 128 GB)
Display
HDMI 2.0, up to 4K@30
HDMI EDID feature for proper display detection
Then, after you install Windows 10, go to Debloat Windows in 2022 and run his Windows debloating script. It takes out all the Microsoft spyware and keeps junk from running in the background eating up RAM.
It mounts on the back of a monitor, with all connections accessible there. I ordered both, and they should be here next week. I ordered an i5 M710q, but there are many options, from i3 up to beyond my budget. Peter
Instant success (well, a couple of hours). The camera worked immediately when plugged into the Pi (which was connected to a Win10 laptop running QGroundControl).
The Ping Sonar was a bit trickier and needed drivers installed to get it working when connected over USB to the Win10 laptop running Ping Viewer. I’m really not sure what I did, as so often happens with MS Windows.
It does not auto-detect; I have to connect manually each time. It may have to do with running the transducer in a small bucket of water, below its minimum range.
Next I connected the Ping to the Pi computer to get the signal up the Ethernet cable between the Pi and the laptop. Again I had to connect manually, but it does work, and I get both camera and sonar on my laptop. See photo.
Note that I have not yet connected the Fathom X at each end of the cable. I am just using a 6 ft Ethernet cable.
Question: Can Ping Viewer be a window within QGC the way the camera image is? What is the best way to view the camera and sonar together? Edit: re-reading the above (“Ping Viewer is an application, and whatever you choose to view your video in will be a separate application, so they’re in separate windows”), by arranging the windows I can keep an inch of Ping Viewer visible and click on it if I want to see the whole thing. It seems a touchscreen would be useful here instead of fumbling with a mouse.
Question: Any suggestions on getting the connection to be automatic?
I have some reading to do tomorrow.
I tried to get some idea of what the Companion Pi software looks like.
It is maintained on GitHub.
What is GitHub?
How do I use it?
Learning to use GitHub is a hill I’m not prepared to climb.
I downloaded and extracted the files, which make no sense to me.
I gather the program(s) are written in various languages:
Where are the software design plans and flowcharts? Surely there is a master program that calls functions even if written in a different language.
Is this the forum to ask these questions?
If I learn Python and Pymavlink, I still don’t see how I can use my PIC to talk to the Pi companion computer to simulate the Pixhawk and send pitch and compass data. Can I do the same thing in C by learning the MAVLink protocol?
I have already accomplished what I set out to do by improving my camera quality and adding a Sonar.
How much data is displayed from the Pixhawk in a normal system?
Perhaps it is not worth months of work improving features I already have, even if they are slightly inconvenient.
I compliment Blue Robotics on very nice products.
Peter
Darrell: Thanks for the RockPiX suggestion. As I said, I decided to stay away from any computer that makes more work for me. I am going with a Lenovo Tiny running Win10 Pro to be compatible with Blue Robotics products. The decision has already paid off in instant success, as I mentioned.
How much of QGroundControl and Ardusub software are you using? How did you integrate it into your system?
Your ROV sounds very interesting. Unfortunately mine is not compatible with ArduSub control. My joystick is software-coupled to my side-thruster rotate motors and to the rudder and elevator. When I push it forward, the elevator goes down and both side thrusters rotate up. Pushing to the left slows or reverses the left thruster and speeds up the right thruster. That configuration is not among the ArduSub configurations, although I was hoping to contact someone who has a similar system. Some of the autonomous subs are torpedo-shaped but don’t have the side thrusters, just a rudder and elevator behind a single propeller. They cannot hover.
I have a hover switch that places the side thrusters in reverse, balanced by the rear thruster in forward. The throttle then moves the ROV forward or backward, and the joystick controls pitch and turning. A rough sketch of this mixing follows below.
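To make the coupling easier to picture, here is a rough Python sketch of the mixing I described. It is only an illustration; the names, signs, and scaling are placeholders, and my actual implementation is in PIC firmware.

```python
def mix_cruise(stick_x, stick_y, throttle):
    """Rough sketch of my normal-mode mixing (all inputs in -1..+1).

    Push forward (stick_y > 0): elevator goes down, both side thrusters rotate up.
    Push left (stick_x < 0): left thruster slows/reverses, right thruster speeds up.
    Signs and scaling are illustrative only.
    """
    elevator = -stick_y              # forward stick -> elevator down
    side_thruster_rotate = stick_y   # forward stick -> both side thrusters rotate up
    rudder = stick_x
    left_thruster = throttle + stick_x   # left stick (negative x) slows/reverses the left
    right_thruster = throttle - stick_x  # ...and speeds up the right
    return elevator, side_thruster_rotate, rudder, left_thruster, right_thruster


def mix_hover(stick_x, stick_y, throttle):
    """Hover switch engaged (illustrative only): side thrusters held in reverse,
    balanced by the rear thruster in forward; throttle moves the ROV fore/aft,
    and the joystick controls pitch and turning."""
    hover_bias = 0.5                      # placeholder balance point
    left_thruster = -hover_bias + stick_x
    right_thruster = -hover_bias - stick_x
    rear_thruster = hover_bias + throttle
    elevator = -stick_y                   # joystick still controls pitch
    return elevator, left_thruster, right_thruster, rear_thruster


if __name__ == "__main__":
    print(mix_cruise(0.0, 1.0, 0.5))   # full forward stick, half throttle
    print(mix_hover(0.2, 0.0, 0.0))    # gentle right turn while hovering
```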
I am looking to do the fewest modifications for the maximum benefit. Improving my camera and video quality and adding a sonar for a couple of hours of development time on my part is a huge improvement. I still have to install it all, but that is minor machine-shop work.
Your input is greatly appreciated, Peter
Any signals from surface to ROV are on a twisted pair in the Cat5 tether. Any data returned is on the same twisted pair (RS485). NOT ON THE ETHERNET CONNECTION or using QGC.
Data sent includes three thruster speeds and directions, side thruster rotate positions, Rudder and Elevator positions, and various switch and pot positions (lights, throttle fwd/rev/speed, auto dive/climb angle, etc).
Data returned are Heading, Pitch, Roll, Depth (Bar30), Bat volts and current, and alarms/status including over-currents and leaks (main tube, port side pod, and stbd side pod).
I cannot easily do anything about the data sent from the surface to the ROV, but it would be nice to integrate some of the returned data so it can be displayed in QGroundControl.
The difficult way to return data is to use my PIC to communicate with the Companion RPi over USB using the MAVLink protocol. This would be a big learning curve for me.
The easy way to return data is to add a Pixhawk but only use what I need. I can replace the following items that are working on my own system with the Pixhawk equivalent.
Compass/pitch/roll, Bar30 depth, Power sense for Battery volts and current (I’m not sure if it calculates remaining amp/hours %). Any other suggestions?
I am not crazy about adding another load (500mA ?) from the Pixhawk. The RPi, Ping, Camera, and Fathom X current draw is starting to add up compared to my insignificant PICs and moderate thruster currents.
Questions:
Does this plan sound feasible?
Can I buy the Pixhawk loaded with Blue Robotics software?
Only the Pixhawk 1 is fully tested and supported
Pixhawk Mini has been reported to work with ArduSub
If I buy a Pixhawk 1 how do I program it? My research turned up this, but not from any instructions:
I also researched the autoconnect problem with the Ping Sonar and found this:
Verify QGC Autoconnect settings
Make sure that QGroundControl is configured to automatically connect to UDP and USB links. Click on the ‘Q’ icon in the upper left to view the Application Settings, then click on the ‘General Settings’ tab. In the options for ‘Autoconnect to:’, make sure the UDP option is checked. (To see this, drag the window up from where it is hidden.)
Mine is checked but, as I mentioned before, I still have to click the top line to connect.
I would like confirmation from someone that what I propose will work.
I will not be using the Pixhawk to control thrusters or anything else. I assume it has the gyro, accelerometer, and compass built in for plug-and-play? Also, does it have the Bar30 software to auto-connect and feed the data to QGC?
Edit: All I can typically find on eBay is “Pixhawk PX4 PIX 2.4.8 32 Bit Flight Controller Only Board Without TF Card RC”. Any idea if these would work? What is a TF card? Would they have the gyro/acc/compass?
Nice, looks reasonably capable, hope it works well for you
It should still auto-detect even in air, although it does sometimes take a little while to do so. Normally if the System page of the companion web-interface shows an active service of Ping360-... then PingViewer should also be able to detect it. If it doesn’t show that service then you’ve most likely got a connection problem, or need to power-cycle the system (ideally the companion computer and Ping360 should turn on at the same time, so that the Ping360 device is available when the RPi starts looking for it, and so that there’s no risk of a previous dropped connection being maintained by the Ping360 at the cost of refusing new connections).
When using it in the past I tended to have QGC (with controls and primary video stream) on half of the screen, and a vertical split between PingViewer and OBS (showing a secondary video stream) in the other half of the screen. Unfortunately QGC isn’t particularly modification friendly (the code base is quite interconnected, and there’s lamentably no support for external widgets that can just be displayed in the window). This topic might be of interest, but the software presented there isn’t free.
It depends how it’s connected. You’ve mentioned you’re connecting it through the companion computer, so for that the companion computer will automatically detect and connect to the Ping device, then provide an interface to it over UDP at port 9090 which PingViewer picks up on. I’m assuming the two Serial options are still there from you plugging the Ping360 directly into your computer, although I’m not sure why they’re still showing up as green.
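If you ever want to poke at that interface yourself, here’s a minimal sketch. It assumes the companion is at the usual 192.168.2.2 address and that you have the bluerobotics-ping Python package installed (pip install bluerobotics-ping):

```python
# Minimal sketch: read a distance from the Ping sonar through the companion's
# UDP bridge on port 9090 - the same interface PingViewer connects to.
from brping import Ping1D

ping = Ping1D()
ping.connect_udp("192.168.2.2", 9090)  # companion address and Ping bridge port

if not ping.initialize():
    raise RuntimeError("Could not initialize the Ping device over UDP")

data = ping.get_distance()  # dict including 'distance' (mm) and 'confidence' (%)
print(f"Distance: {data['distance']} mm ({data['confidence']}% confidence)")
```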
There isn’t such a diagram that’s readily available, but the code is effectively treated as a bunch of separate functionalities, which are run on startup through .companion.rc, which is a bash script. Each functionality is run as a Linux screen session, which is basically a persistent terminal that runs in its own thread and can be reconnected to later. In companion those sessions are given meaningful names (just after the -S flag), and the currently active sessions are what’s displayed in the “Active Services” section of the web interface’s System page. If you create your own named screen sessions, they will also show up there.
You mentioned previously that your PICs communicate via I2C. You can use I2C with the Raspberry Pi, so you can make a Python script that communicates over I2C with your PIC and then uses Pymavlink to send MAVLink messages to the topside.
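As a rough sketch of the I2C half (the smbus2 package, the 0x20 address, and the register layout here are just placeholders for whatever your PIC actually implements):

```python
# Minimal sketch: read a 16-bit heading value from a PIC over I2C on the Raspberry Pi.
# Assumes the smbus2 package (pip install smbus2) and I2C enabled via raspi-config.
from smbus2 import SMBus

PIC_ADDRESS = 0x20       # hypothetical 7-bit I2C address of the PIC
HEADING_REGISTER = 0x00  # hypothetical register holding the heading

with SMBus(1) as bus:    # I2C bus 1 on the Pi's GPIO header
    raw = bus.read_i2c_block_data(PIC_ADDRESS, HEADING_REGISTER, 2)
    heading_deg = ((raw[0] << 8) | raw[1]) / 10.0  # example: 0.1-degree units
    print(f"Heading: {heading_deg:.1f} deg")
```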
MAVLink uses generated APIs, of which there’s one for Python (pymavlink), one for C, and more for other languages. You shouldn’t need to learn the protocol itself; you will instead need to be able to make a connection and send the relevant MAVLink messages. I suggested Pymavlink because it’s already installed on the companion computer, and because Python tends to make things easy to develop quickly (which is helped in this case by the ArduSub pymavlink examples and the guide/post I linked you to earlier). C should still work; it’s just harder and likely comes with little gain.
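And a rough sketch of the MAVLink half. It assumes QGC is listening on the usual UDP port 14550 at 192.168.2.1 (the normal topside address); depending on how your links are routed you may instead want to send via the companion’s MAVProxy so the data arrives on the same stream as the vehicle telemetry:

```python
# Minimal pymavlink sketch: send a custom named value to the topside so it can
# be shown in QGC. 'HDG' is an arbitrary label (NAMED_VALUE_FLOAT names are
# limited to 10 characters).
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpout:192.168.2.1:14550')
boot_time = time.time()

while True:
    heading_deg = 123.4  # in practice, read this from the PIC over I2C as above
    master.mav.named_value_float_send(
        int((time.time() - boot_time) * 1000),  # time since boot, in ms
        b'HDG',                                 # value name shown in QGC
        heading_deg,
    )
    time.sleep(1)
```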
That depends on the system and on user preferences. QGroundControl allows specifying a desired data rate, and you can also turn different indicators on or off depending on what you want to see (and what you have access to) while operating.
As mentioned, the RPi can use I2C, so you don’t need to make a USB connection or send MAVLink messages to it from your PIC, but to see the values in QGC you’ll need the RPi to send MAVLink messages that include your sensor data.
This would likely simplify things, but as you’ve mentioned it uses extra power to replace functionality you already have, which seems a shame. The power sense module at least estimates battery percentage, and from memory it might provide a remaining-time estimate, but I can’t check right now unfortunately.
The Pixhawk 1 is actually past end of life, so it isn’t manufactured anymore and is generally only sold in existing kits that make use of it. Our engineers are hard at work making an alternative that’s future-proof and provides significant other benefits, but until that’s available we can only really recommend that you try a Pixhawk model that’s known to work - unfortunately we haven’t tested any extensively and can’t support them as a result. From what I understand the Pixhawk 4 is the closest match, but ArduSub might not support some of its extra features that were introduced since the Pixhawk 1.
Programming the Pixhawk can be done through either QGroundControl (as in your photo) or the companion web interface. If you’re not modifying ArduSub, it’s generally as simple as clicking the update button. If you are building a custom version, then you basically just upload your custom firmware once you’ve built it.
These are the instructions for enabling QGC to auto-connect to videos, and they are unrelated to the Ping devices (which QGC knows nothing about).
The Pixhawk still sends sensor readings without needing to be armed or to control anything, so that side of things shouldn’t be an issue. Yes, the gyro, accelerometer, and compass are built in. ArduSub is designed to communicate with the Bar30 over I2C (using the I2C port of the Pixhawk board), so yes, it auto-connects and sends its data.
If you did decide to get a Pixhawk, you may wish to get something like an I2C bus splitter and read in all your PIC data through I2C there instead of directly from the companion. To do that you’d need to create a modified build of ArduSub so that it can communicate with and understand your PICs, which I imagine wouldn’t be any easier than using the companion computer to do the same, but it might mean that you wouldn’t need to work with the MAVLink side of things (I’m not intimately familiar with how ArduSub is set up). That’s unfortunately not very helpful if the Pixhawk is there specifically to replace the PIC and corresponding sensors.
From a quick Google, a TF card is a microSD card, so you’d need to buy that separately. Those IMU sensors should still be there - I don’t believe they can be detached.
I have started a new thread in an attempt to find a suitable Pixhawk with which to experiment:
BlueROV2 and ArduSub “Pixhawk 1 replacement”
I made some measurements of current draw for my present setup to help in future decisions.
Bare Raspberry Pi 3B: 230 mA
Blue Robotics Camera: 350 mA
Ping Sonar: 90 mA
The Ethernet cable connection seems to add on the order of 30 mA to the Pi when connected.
The total with all of the above working is a steady 670 mA.
I have not connected or measured the Fathom X yet. Has anyone measured the current draw?
Has anyone measured the Pixhawk current draw? Which model? I guess it will depend on what is connected to it, so a breakdown would be nice.
I bought the Advanced ROV Electronics Package for ArduSub in 2021, which included a Pixhawk and RPi3B, and have everything installed.
Bad timing as I just missed the new Navigator Flight Controller.
As mentioned earlier I now have duplicate compass and Pitch/Roll data.
Is it possible to get at the data generated by the Pixhawk by asking the Raspberry Pi on I2C?
Will the RasPi recognize my connection and respond? What is the protocol?
Do I have to modify the firmware in the Pi?