chipKIT® Development Platform

Inspired by Arduino™

new operator can only allocate 1000 bytes???

Created Fri, 06 Apr 2012 02:19:17 +0000 by Al Bethke


Al Bethke

Fri, 06 Apr 2012 02:19:17 +0000

When I use the new operator to create an integer array I find that I cannot make the array larger than about 200 integers. The code below works fine, but it fails when I set N to 300.

// Test new
// Al Bethke 2012/04/04

#define N 200

void setup() {
    Serial.begin(9600);
    Serial.println("Test new");

    int* a = new int[N];
    for (int i = 0; i < N; i++) {
        a[i] = i;
    }

    Serial.print("a[");
    Serial.print(N - 1);
    Serial.print("] = ");
    Serial.println(a[N - 1]);
    Serial.end();

    delete [] a;
}

void loop() { }

Does anyone know what I'm doing wrong or how to allocate a large array with new?

Thanks.


lloyddean

Fri, 06 Apr 2012 04:42:35 +0000

No help here, but it can't allocate more memory than is available.

remaining = RAM - stack - globals

so what board is this running on?


Al Bethke

Fri, 06 Apr 2012 16:48:29 +0000

I was using a Max32 board. I was able to statically allocate an array of 30000 ints, consistent with the 128K of data memory the Max32 is supposed to have, but when I tried to allocate a big array of ints with the new operator, I could not.

I tried this with the last two versions of the MPIDE with the same result. When I use N=300, the program dies after the initial println appears on the serial monitor. There is no error message; it just doesn't print anything else.

Can someone point me to the documentation for the version of GCC that is being used here? Maybe there is a compiler option to change the initial size of the heap.

Thank you.


lloyddean

Fri, 06 Apr 2012 20:31:39 +0000

With the MPIDE released Dec 21, 2011: gcc version 4.5.1, chipKIT Compiler for PIC32 MCUs v1.30.01-20110607.

Do you mind showing your code that does the allocation?


Al Bethke

Fri, 06 Apr 2012 21:15:30 +0000

This is the entire sketch:

// Test new
// Al Bethke 2012/04/04

#define N 200

void setup() {
    Serial.begin(9600);
    Serial.println("Test new");

    int* a = new int[N];
    for (int i = 0; i < N; i++) {
        a[i] = i;
    }

    Serial.print("a[");
    Serial.print(N - 1);
    Serial.print("] = ");
    Serial.println(a[N - 1]);
    Serial.end();

    delete [] a;
}

void loop() { }

It works with N=200, it fails with N=300. The call to new is in the setup function. There's just that one call to new.


lloyddean

Sat, 07 Apr 2012 18:21:00 +0000

I'm able to duplicate your issue down to allocating 254 integers.

EDIT 1: See post <http://www.chipkit.org/forum/viewtopic.php?p=4142#p4142>

EDIT 2: It seems that by default the linker doesn't reserve any heap space for dynamic allocations. This means the programmer must set a heap size if they wish to do ANY memory allocation at runtime.

The following example shows how and provides debugging output to see what's going on.

//#define NDEBUG
#define DEBUG_PRINT


#define CHANGE_HEAP_SIZE(size)  __asm__ volatile ("\t.globl _min_heap_size\n\t.equ _min_heap_size, " #size "\n")

CHANGE_HEAP_SIZE(65536);


#ifndef DEBUG_PRINT
    #define DebugPrint(X)
    #define DebugPrintln(X)
#else
    #define DebugPrint(X)     Serial.print(X)
    #define DebugPrintln(X)   Serial.println(X)
#endif


#if !defined(NDEBUG)

void* operator new(size_t const n)
{
    DebugPrint("operator new: ");
    DebugPrint(n);
    DebugPrintln(" Bytes");

    void*   m = malloc(n);
    if ( ! m ) { DebugPrintln("out of memory"); }
    return m;
}

void* operator new[] (size_t const n)
{
    DebugPrint("operator new[]: ");
    DebugPrint(n);
    DebugPrintln(" Bytes");

    void*   m = malloc(n);
    if ( ! m ) { DebugPrintln("out of memory"); }
    return m;
}

void operator delete(void* const m)
{
    DebugPrintln("operator delete");
    free(m);
}

void operator delete [](void* const m)
{
    DebugPrintln("operator delete[]");
    free(m);
}

#endif



void loop()
{}

void setup()
{
    Serial.begin(9600);


    // ---- POD allocations

    Serial.println("creating & destroying an int");
    int*    p = new int;
    delete p;


    const size_t    N = 2000;

    Serial.print("\n\ncreating & destroying an int[");
    Serial.print(N);
    Serial.println("]");
    int*    a = new int[N];
    delete [] a;


    // ---- object allocations

    class obj_t {
        uint32_t    _i[100];

    public:
        obj_t()     { Serial.println("obj_t::obj_t()");  }
        ~obj_t()    { Serial.println("obj_t::~obj_t()"); }
    };

    Serial.println("\n\ncreating & destroying an obj_t");
    obj_t*  s = new obj_t;
    delete s;
    

    Serial.println("\ncreating &amp; destroying obj_t[3]");
    obj_t*  array_objects = new obj_t[3];
    delete [] array_objects;
}
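As an aside, since the CHANGE_HEAP_SIZE macro above just defines the `_min_heap_size` symbol for the linker, it should also be possible to set it on the link command line instead of in the sketch. Untested, and the exact toolchain invocation here is hypothetical:

```shell
# Hypothetical link-time alternative to the in-sketch macro:
# define _min_heap_size (here 64 KB) when linking the sketch.
pic32-g++ sketch.o -Wl,--defsym,_min_heap_size=0x10000 -o sketch.elf
```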

lloyddean

Sun, 08 Apr 2012 04:42:46 +0000

Just a note that I modified the above post to anyone who may be interested.


Al Bethke

Sun, 08 Apr 2012 15:40:47 +0000

Thank you! This is exactly what I needed. However, I did notice one very strange thing.

I pasted your code for changing the heap size into my program and it worked, except that I still ran out of heap space much sooner than I expected. So I played around with the very nice little demo program you posted. I can set N to 16380 without running out of heap memory, consistent with the expected 64K heap. But if I comment out the three lines in your setup function where you use new to allocate a single integer, I can only make N about 8000 before I run out of memory!

If I allocate a single int first, then I can allocate (a few bytes less than) 64K for the array. But if I allocate the array without first allocating (and deleting) that single int, I can only allocate about 32K for the array.

I regard this as a bug, but it is easy to work around. I am a little surprised that the heap must be statically allocated at the beginning of the program. I thought the stack and heap grew towards each other in the space left over after the global objects are allocated.

Thanks again, you have saved me much time and effort.


lloyddean

Sun, 08 Apr 2012 16:08:31 +0000

Interesting!


KeithV

Wed, 11 Apr 2012 15:31:56 +0000

Be very careful about increasing the size of the heap to a large value: there are no checks for when the heap runs into the stack, and with large heaps it is very easy for the stack to grow into and overwrite data in the heap. Profile your maximum stack depth first (print the address of a stack variable to the serial monitor at the deepest point), then set your heap size.

Another concern is that heaps can easily fragment. Your example of allocating and freeing first to increase the usable size of the heap is very interesting, and is probably a consequence of how small and large data are allocated within the heap. It is common, and not considered a bug, for heaps to become fragmented. Good embedded programming will not use a heap, because fragmentation is a reliability problem.

Good luck!