chipKIT® Development Platform

Inspired by Arduino™

Serial communication

Created Thu, 21 Jul 2016 15:12:07 +0000 by eldoori


eldoori

Thu, 21 Jul 2016 15:12:07 +0000

Hi,

I am currently doing a student project with six ultrasonic sensors and the chipKIT MAX32 board. The ultrasonic sensors are connected to a "black box", and this "black box" is in turn connected to the chipKIT over serial. The "black box" sends out 6 data bytes; each byte corresponds to the distance measured by one sensor. Now I want to know the rate at which the "black box" is sending me data and which of those bytes is the first byte of the sequence, but I have a few problems just trying to figure this out.

First problem: The timer I used is millis() based, but it reports that 0 milliseconds elapse between 2 bytes. When I change it to micros() I get what look like random numbers, even if I comment out the Serial.read() and just put in a delay(1). For example, the first reading gives 843 microseconds and the second reading gives 43 microseconds, when both should be (close to) 1000 microseconds.

Second problem: When I measure the time between two data bytes I still get readings even if I disconnect the serial wire from the chipKIT. I thought this might be caused by noise, but I have searched online and haven't found anyone with the same problem. When I connect the serial wire to an oscilloscope I don't really see much noise, so I suspect it has to do with the serial port of the chipKIT. Has anyone had the same problem?

Third problem: Since I don't have a reliable reading of the time between 2 data bytes, I also can't determine the total cycle time, and without that I can't tell which data byte is the first byte in the cycle.

Here is my code (sorry if it is a bit messy):

unsigned long start, finished, elapsed;

void setup()
{
  Serial3.begin(9600);
  Serial.begin(115200);
}

void loop()
{
  if (Serial3.available() > 0) {
    start = micros();
    unsigned char data = Serial3.read();
    // delay(1);
    finished = micros();
    elapsed = finished - start;
    Serial.println(elapsed);
    Serial.println(data);
  }
}

Can someone help me?


majenko

Thu, 21 Jul 2016 16:17:56 +0000

Serial data is buffered. You are just measuring how long it takes to read a byte from memory. Instead you need to remember the last time that Serial3.available() was not 0 and subtract that from the current time. No guarantees that it will give accurate results though, since there could be multiple bytes in the buffer to read.
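
Roughly something along these lines (untested, and just reusing the Serial3/Serial setup from your sketch) - timestamp each byte as it becomes available and print the difference from the previous timestamp:

unsigned long lastByteTime = 0;

void setup()
{
  Serial3.begin(9600);
  Serial.begin(115200);
}

void loop()
{
  if (Serial3.available() > 0) {
    unsigned long now = micros();
    unsigned long gap = now - lastByteTime;  // time since the previous byte became available
    lastByteTime = now;

    unsigned char data = Serial3.read();
    Serial.print(gap);
    Serial.print(" us ");
    Serial.println(data);
  }
}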

You may have to hook into the serial reception interrupt to measure the timing more accurately. I added a function for doing that a while back, but I can't remember off the top of my head what it was called. I am out of the lab at the moment but can look it up when I get back later on.

Sent from my One Mini 2 using Tapatalk


majenko

Thu, 21 Jul 2016 21:58:59 +0000

Ok, back in the lab now. The function I added was Serial.attachInterrupt(...). It calls your routine whenever a character is received, so you can handle the character inside that routine.

volatile int thisChar = 0;
volatile bool hasChar = false;
volatile uint32_t charTime = 0;

// Called from the serial receive interrupt for every character that arrives.
void mySerialInt(int ch) {
    static uint32_t lastChar = 0;
    charTime = millis() - lastChar;  // time since the previous character
    lastChar = millis();
    thisChar = ch;
    hasChar = true;                  // tell loop() a new character is ready
}

void setup() {
    Serial.begin(115200);
    Serial3.begin(9600);
    Serial3.attachInterrupt(mySerialInt);
}

void loop() {
    if (hasChar) {
        // Critical section: copy the shared variables with interrupts
        // disabled so the ISR can't change them half way through.
        uint32_t s = disableInterrupts();
        int myChar = thisChar;
        int myCharTime = charTime;
        hasChar = false;
        restoreInterrupts(s);
        Serial.print(myChar, DEC);
        Serial.print(" ");
        Serial.print(myCharTime);
        Serial.println("ms");
    }
}

Note: I haven't tested this - I don't even know if it compiles ;) Basically the theory is:

  1. A character is received and triggers the interrupt.
  2. The ISR receives the character, stores it in thisChar, calculates the time since the last character, and stores that in charTime.
  3. The flag hasChar is set to true to indicate to the main loop that thisChar has a new value.
  4. The main loop sees hasChar set, then disables interrupts (entering a critical section), copies the character and its time into local variables for use, and then restores interrupts.
  5. The results of the reception are printed.

eldoori

Mon, 22 Aug 2016 15:32:35 +0000

Thank you majenko for your response.

I would like to apologize for my late response; I was out of the country for a couple of weeks.

About the code: after a few alterations to make it fit my application, it works! However, I want to know why it works. I understand the concept of the software interrupt that you made, but doesn't the chipKIT already handle all serial data automatically through its hardware interrupt? Why do you need an additional software interrupt?

Thanks in advance!


majenko

Tue, 23 Aug 2016 09:17:22 +0000

Yes, the core handles the hardware interrupt and receives the data. That is all fine. It then stores that data into a buffer for you to receive as and when you see fit. That completely disconnects the reception of the character from the reading in your sketch, so you cannot know when the character has arrived, only that it has arrived.

This is not so much a "software interrupt" as a "hook" into the existing interrupt routine. It intercepts the incoming character as it arrives and, instead of storing it in a buffer for you, runs your routine to handle the incoming character as you see fit - reconnecting the reception of the character with the reading in your sketch. As the character arrives in hardware, your routine is called instantly so you can deal with that character right away.
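
In rough sketch form (a simplified illustration only, not the actual core source - the names here are made up), the receive handling boils down to this:

#include <stdint.h>
#include <stddef.h>

#define RX_BUFFER_SIZE 128

static uint8_t rx_buffer[RX_BUFFER_SIZE];
static volatile int rx_head = 0;
static void (*userCallback)(int) = NULL;  // set by attachInterrupt()

// Called from the hardware UART receive interrupt for each incoming byte.
void handleReceivedByte(uint8_t ch) {
    if (userCallback != NULL) {
        // Hook attached: hand the byte straight to the sketch the moment
        // it arrives, so the sketch can react to (and timestamp) it immediately.
        userCallback(ch);
    } else {
        // Default: queue the byte in the ring buffer; the sketch only sees
        // it later, whenever it gets around to calling read().
        rx_buffer[rx_head] = ch;
        rx_head = (rx_head + 1) % RX_BUFFER_SIZE;
    }
}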


eldoori

Tue, 23 Aug 2016 10:59:01 +0000

Thank you once again majenko, you have been of great help and I find your way of explaining things very easy to understand. I'll let you know how it ends!