So I tried working with HPTuners and getting my WBO2 gauge to log through the AC Pressure Sensor input.
I ran a quick log in the garage idling so I could see if I could capture WB data.
I was able to capture voltage through the sensor, but only for part of the log. In the attached log, the car starts at about 20 seconds; once the sensor warms up, the AC Pressure Sensor voltage varies until about a minute in, when it goes back to reading a constant 5v, just as it did before I started the car.
I didn't have the chart added during the capture and it doesn't look like it saved to the file, but I've since added 2 WBO2 charts - 1 using the Maths from the UEGO instructions and 1 using the Maths from a video demonstrating how to use the AC Pressure Sensor. Both showed Lambda values higher than my gauge (i.e. when the gauge read about .8, the formulas showed over 1.0). I switched my gauge from AFR to Lambda and tried the AEM Lambda scaling formula listed below first. Then I tried the formula from the video, which divides the voltage range by the span between the first and last values in the output table. It was similar to AEM's formula but produced slightly higher values. I'm not sure how to determine the adjustment, or is it just trial and error until the gauge and HPT match?
I've also attached the log file if anyone wants to play along at home.
The Maths I used were:
([7101.10]*0.1621)+0.4990 per the scaling formula from AEM shown above. This read 2-3 tenths higher than what my gauge was showing.
([7101.10]/6.154)+0.58 per the video, which said to divide the voltage range (0.5 - 4.5v) by the output range (0.58 - 1.23 Lambda) and add the first number from the output range. This read almost another tenth higher than the first formula.
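For what it's worth, here's a quick sketch comparing the two formulas (the 0.5-4.5v and 0.58-1.23 Lambda ranges are taken from the numbers above; verify them against your sensor's docs). One thing that jumps out: a proper linear map should subtract the 0.5v floor from the voltage before dividing by the slope. As written, the video's formula skips that step, which shifts every reading up by about 0.5/6.154 ≈ 0.08 Lambda, and that lines up with the "almost another tenth higher" you saw.

```python
def lambda_aem(v):
    """AEM-style scaling from the post: lambda = 0.1621 * V + 0.4990."""
    return 0.1621 * v + 0.4990

def lambda_video(v, v_lo=0.5, v_hi=4.5, lam_lo=0.58, lam_hi=1.23):
    """The video's math: slope = (4.5 - 0.5) / (1.23 - 0.58) = 6.154,
    then lambda = V / 6.154 + 0.58. Note it does NOT subtract v_lo first."""
    slope = (v_hi - v_lo) / (lam_hi - lam_lo)  # ~6.154 volts per Lambda
    return v / slope + lam_lo

def lambda_linear(v, v_lo=0.5, v_hi=4.5, lam_lo=0.58, lam_hi=1.23):
    """Corrected linear map: subtract the voltage floor before scaling,
    so 0.5v -> 0.58 Lambda and 4.5v -> 1.23 Lambda exactly."""
    return (v - v_lo) * (lam_hi - lam_lo) / (v_hi - v_lo) + lam_lo

for v in (0.5, 2.5, 4.5):
    print(f"{v:.1f}v -> AEM {lambda_aem(v):.3f}, "
          f"video {lambda_video(v):.3f}, corrected {lambda_linear(v):.3f}")
```

At the 0.5v end the AEM formula and the corrected map both give 0.580, while the video's version gives about 0.661, so the offset, not the slope, is where the two formulas diverge.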
I hope to get some logging done this weekend while actually driving, but I'm concerned that the voltage only reported for part of the short idle session.