Megalextoria
Retro computing and gaming, sci-fi books, tv and movies and other geeky stuff.

GNU/GCC optimizing [message #301487] Wed, 30 September 2015 17:45
Francois LE COAT
Hi,

Since 1987 I have been developing a shareware program called Eureka 2.12
on successive ATARI computers, and since GNU/GCC 2.8.1 was released I
have also used that C compiler under freeMiNT, the free ATARI OS.

My C sources are perfectly compatible with GNU/GCC up to version 3.3.6.
But since the GNU/GCC 4 optimizing compiler became available, I can't
build my sources successfully. Large parts of what is written in the C
sources are completely discarded from the binary, even if I use the
"-O0" option.

Is there a way to force GNU/GCC 4 (I'm experimenting with version 4.6.4)
to build strictly what is written in the C sources, preventing optimizations?

For the moment, the Eureka 2.12 software can be built, but the binary
does not correspond to the sources and totally misbehaves with
GNU/GCC 4, because my sources are misinterpreted. GNU/GCC 3 is correct.

I imagined that the C language offered source compatibility, but since
the fourth version of GNU/GCC, that apparently isn't the case with my
old sources. The problem seems to be caused by the compiler itself,
and not by the corresponding libraries, because I even tested GNU/GCC 4
with the exact same libraries as with GNU/GCC 3 ...

Thanks in advance for helping me.

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #301518 is a reply to message #301487] Thu, 01 October 2015 02:18
Miro Kropáček
> sources successfully. Large parts of what it's written in C language
> are completely discarded from the binary even if I use the "-O0" option.
>
> Is there a way to force GNU/GCC 4 (I'm experimenting 4.6.4 version) to
> build strictly what it is written in C, preventing from optimizations ?
>
What does that even mean? How "discarded"? Do you rely on dead code?

The main change in the 4.x series is the introduction of so-called strict aliasing; see for instance http://cellperformance.beyond3d.com/articles/2006/06/understanding-strict-aliasing.html. You can disable it via a command-line switch, but this is recommended only as a temporary workaround.

Also, a code example would clarify things.
Re: GNU/GCC optimizing [message #301558 is a reply to message #301518] Thu, 01 October 2015 15:41
Francois LE COAT
Hi MiKRO,

Miro Kropáček writes:
>> sources successfully. Large parts of what it's written in C language
>> are completely discarded from the binary even if I use the "-O0" option.
>>
>> Is there a way to force GNU/GCC 4 (I'm experimenting 4.6.4 version) to
>> build strictly what it is written in C, preventing from optimizations ?
>>
> What does that even mean? How discarded? You rely on dead code?
>
> The main change in the 4.x series is introduction of so called strict aliasing, see for instance here: http://cellperformance.beyond3d.com/articles/2006/06/understanding-strict-aliasing.html. You can disable it via a command line switch, however, this is recommended only as a temporary workaround.
>
> Also, some code example would clarify more.

Well, I took the GNU/GCC 4.6.4 available on your web page at
<http://mikro.naprvyraz.sk/files/gcc/gcc-4.6.4-m68020-60mint.tar.bz2>,
more precisely the `cc1` binary available in this archive, because I
have a Hades060 machine. I replaced the `cc1` binary in my GNU/GCC 3.3.6
configuration, available at <http://eureka.atari.org/gcc3.3.6SDK.zip>,
with your `cc1` binary. I then have a GNU/GCC 3.3.6 configuration
with the C compiler from version 4.6.4.

With this development configuration I successfully built Eureka 2.12,
with only a few warnings. The problem is that the resulting binary does
not conform to the sources. Please keep in mind that my Eureka 2.12
requires GNU/GCC's "-mshort" option for compatibility with PURE C 1.1.

This weird manipulation was intended to prove that the compatibility
problem between GNU/GCC 3 and GNU/GCC 4 is due only to the `cc1`
compiler. Also, the available 4.6.4 version has no 16-bit libraries.

I also put a "-O0" compile option in the `Makefile`. The problem is
that the resulting binary misbehaves compared to the 3.3.6 build.

I don't know what to do. I also tested the cross-compiler under OS X.
My sources seem to be totally misinterpreted by GNU/GCC 4, I'm afraid.

You may have an opinion ...

Thanks for your answer.

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #301594 is a reply to message #301558] Fri, 02 October 2015 02:12
Miro Kropáček
> Well, I took GNU/GCC 4.6.4 available on your WEB page at :
> <http://mikro.naprvyraz.sk/files/gcc/gcc-4.6.4-m68020-60mint.tar.bz2>
> precisely `cc1` binary available in this archive, because I have a
> Hades060 machine. I replaced the `cc1` binary from my GNU/GCC 3.3.6
> configuration available at <http://eureka.atari.org/gcc3.3.6SDK.zip>
> from your `cc1` binary. I have then a GNU/GCC 3.3.6 configuration,
> with the C compiler from 4.6.4 version.
What if you take the whole package? There may be internal dependencies; there's no guarantee that cc1 stays binary-compatible with previous versions.

Also, you still haven't explained what you mean by "the resulting binary does not conform to the sources".

But you're right, there's no -mshort libc & friends, only a basic libgcc (for building the FreeMiNT kernel).
Re: GNU/GCC optimizing [message #301666 is a reply to message #301594] Sat, 03 October 2015 10:07 Go to previous messageGo to next message
Francois LE COAT is currently offline  Francois LE COAT
Messages: 225
Registered: August 2012
Karma: 0
Senior Member
Hi MiKRO,

Miro Kropáček writes:
>> Well, I took GNU/GCC 4.6.4 available on your WEB page at :
>> <http://mikro.naprvyraz.sk/files/gcc/gcc-4.6.4-m68020-60mint.tar.bz2>
>> precisely `cc1` binary available in this archive, because I have a
>> Hades060 machine. I replaced the `cc1` binary from my GNU/GCC 3.3.6
>> configuration available at <http://eureka.atari.org/gcc3.3.6SDK.zip>
>> from your `cc1` binary. I have then a GNU/GCC 3.3.6 configuration,
>> with the C compiler from 4.6.4 version.
>
> What if take the whole package? There may be some internal dependencies, there's no guarantee that cc1 stays binary compatible with previous versions.

If I took the whole package, I would get the same result as with the
cross-compilers. That build configuration is not suited to compiling
my Eureka 2.12 software, because there are no 16-bit libraries. Also,
GNU/GCC 4 is an optimizing compiler that misinterprets my sources.

It had seemed to me that `cc1` stayed compatible from one version to the next.

The development configuration I'm telling you about successfully
builds my software, with the correct 16-bit libraries from version
3.3.6. That's why I can tell that the `cc1` compiler is not backward
compatible.

> Also, you still didn't explain what do you mean by that 'the resulting binary is not conform to sources'.

Well, the opening demo with the spinning hypercube does not play. The
GEM interface seems correct, but if I want to describe a curve, the
curve is not drawn. If I want to draw a surface, the surface is not
drawn. Nothing the binary does happens as it should. The binary simply
does not correspond to the sources. The program is broken.

> But you're right, there's no -mshort libc & friends, only basic libgcc (for building the freemint kernel).

The lack of 16-bit libraries is a major drawback when I build Eureka
2.12. It breaks compatibility with earlier ATARI development
configurations. I didn't know that the freeMiNT kernel still uses the
"-mshort" option. Many ATARI programs certainly use it, because it is
an ATARI convention.

Thanks for helping.

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #301695 is a reply to message #301666] Sat, 03 October 2015 18:20
Originally posted by: Michael Schwingen

On 2015-10-03, Francois LE COAT <lecoat@atari.org> wrote:
> Well, the starting demo with a spinning hypercube is not played. The
> GEM interface seems correct, but if I want to describe a curve, the
> curve is not drawn. If I want to draw a surface, the surface is
> not drawn. Nothing happens with the binary like it should. The
> binary is simply not corresponding to sources. The program is broken.

In my experience with optimizing compilers, in most such cases the fault
is not with the compiler; instead the source code is broken, doing things
that are not allowed by the C standard and relying on undefined behaviour.

If it worked with an older compiler (which had a weaker optimizer), that
says nothing about the correctness of the source code.

Did you compile with all warnings enabled, and look at the warnings? I can't
believe old, misbehaving code would compile after a gcc3 -> gcc4 switch
without producing at least some new warnings!

cu
Michael
Re: GNU/GCC optimizing [message #301711 is a reply to message #301695] Sun, 04 October 2015 09:15
Francois LE COAT
Hi,

Michael Schwingen writes:
> Francois LE COAT wrote:
>> Well, the starting demo with a spinning hypercube is not played. The
>> GEM interface seems correct, but if I want to describe a curve, the
>> curve is not drawn. If I want to draw a surface, the surface is
>> not drawn. Nothing happens with the binary like it should. The
>> binary is simply not corresponding to sources. The program is broken.
>
> Im my experience with optimizing compilers, in most such cases the fault is
> not with the compiler, but instead the source code is broken, doing things
> that are not allowed by the C standard and relying on undefined behaviour.
>
> If it worked with the older compiler (that had a weaker optimizer) that does
> not mean anything for the correctness of the source code.
>
> Did you compile with all warning enabled, and look at the warnings? I can't
> believe old, misbehaving code would compile after a gcc3 -> gcc4 switch
> without producing at least some new warnings!

You'll agree that it's very peculiar ... How bizarre that a warning
generates an error when building a C program's sources! It should then
be reported as an error, not as a warning, don't you think?
I have never seen in the C standard definition that a warning must
imperatively be taken into account, on pain of generating an error.
Warnings are most often used to help produce good code, not errors.
Do strong optimizations mean that warnings are now treated as errors?

Please take into account that I have practised the C language since
1986, first Kernighan and Ritchie, then the ANSI C standard. Note that
the C standard must have evolved, because my C sources are now obsolete.

Thanks for your answer.

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #301717 is a reply to message #301711] Sun, 04 October 2015 14:19
Originally posted by: Michael Schwingen

On 2015-10-04, Francois LE COAT <lecoat@atari.org> wrote:
>
> You'll agree that it's very peculiar ... How bizarre a warning will
> generate an error when building C program's sources ? This is not
> a warning, but should be alerted as an error, don't you think so ?

No. The C standard gives compiler writers considerable latitude in how
certain details are implemented. If your source code uses constructs with
undefined results, the results will be - well, undefined, and may change
between compiler versions.

> Please take into account that I practice C language since 1986,
> first Kernighan and Ritchie, then ANSI C standard. Notice the
> C standard must have evoluted because my C sources are now obsolete.

It has - K&R left many things undefined that were better specified in ANSI
C. However, even K&R C had language details that caused undefined behaviour
by definition, and these may now cause unexpected results as improving
compiler optimizations uncover faults that were always there.

As a start, have a look at
http://blog.regehr.org/archives/213
http://blog.llvm.org/2011/05/what-every-c-programmer-should-know.html

cu
Michael
Re: GNU/GCC optimizing [message #301727 is a reply to message #301711] Sun, 04 October 2015 16:20 Go to previous messageGo to next message
dave[1][2] is currently offline  dave[1][2]
Messages: 119
Registered: January 2012
Karma: 0
Senior Member
On 04/10/2015 14:15, Francois LE COAT wrote:
> Hi,
>
> Michael Schwingen writes:
>> Francois LE COAT wrote:
>>> Well, the starting demo with a spinning hypercube is not played. The
>>> GEM interface seems correct, but if I want to describe a curve, the
>>> curve is not drawn. If I want to draw a surface, the surface is
>>> not drawn. Nothing happens with the binary like it should. The
>>> binary is simply not corresponding to sources. The program is broken.
>>
>> Im my experience with optimizing compilers, in most such cases the
>> fault is not with the compiler, but instead the source code is broken,
>> doing things that are not allowed by the C standard and relying on
>> undefined behaviour.
>>
>> If it worked with the older compiler (that had a weaker optimizer)
>> that does not mean anything for the correctness of the source code.
>>
>> Did you compile with all warning enabled, and look at the warnings?
>> I can't believe old, misbehaving code would compile after a
>> gcc3 -> gcc4 switch without producing at least some new warnings!
>
> You'll agree that it's very peculiar ... How bizarre a warning will
> generate an error when building C program's sources ?

This is typical of "C" and is one of the problems with the language:
most compilers do not warn about undefined behaviour. Try posting this
on comp.lang.c and you will be told exactly the same.


> This is not
> a warning, but should be alerted as an error, don't you think so ?
> I never seen before in the C standard definition that a warning
> should imperatively be taken into account, otherwise generating an
> error. The warnings must often, used to generate good code, not errors.
> Strong optimizations mean that warnings are now considered as errors ?
>

The "C" standard has changed substantially with the introduction of C11.
I work on the Hercules project, an IBM mainframe emulator, and we
are having the same problems...


> Please take into account that I practice C language since 1986,
> first Kernighan and Ritchie, then ANSI C standard. Notice the
> C standard must have evoluted because my C sources are now obsolete.
>

Possibly need adjustments to cope with changes...

> Thanks for your answer.
>
> Best regards,
>

Dave Wade
G4UGM
Re: GNU/GCC optimizing [message #301728 is a reply to message #301717] Sun, 04 October 2015 16:30
Francois LE COAT
Hi,

Michael Schwingen writes:
> Francois LE COAT wrote:
>> You'll agree that it's very peculiar ... How bizarre a warning will
>> generate an error when building C program's sources ? This is not
>> a warning, but should be alerted as an error, don't you think so ?
>
> No. The C standard gives compiler writers considerable room in which way
> certain details may be implemented. If your source code uses constructs with
> undefined results, the results will be - well, undefined, and may change
> with compiler versions.
>
>> Please take into account that I practice C language since 1986,
>> first Kernighan and Ritchie, then ANSI C standard. Notice the
>> C standard must have evoluted because my C sources are now obsolete.
>
> It has - K&R left many things undefined which were better specified in ANSI
> C, however, even K&R had language details which caused undefined behaviour
> by definition, and which may now cause unexpected results due to improving
> compiler optimizations that uncover faults that were always there.
>
> As a start, have a look at
> http://blog.regehr.org/archives/213
> http://blog.llvm.org/2011/05/what-every-c-programmer-should- know.html

Well, the same recipe should at least give the same meal. If somebody
cooks a pizza, he is not supposed to end up with tomato ketchup, unless
he is an extremely bad cook. The recipes are the C language sources,
and the cook is the C language compiler.

What should I think of GNU/GCC 4, compared to GNU/GCC 2 and 3,
PURE C 1.1 and the other compilers that build my C program Eureka 2.12,
when the binary it produces is such a messy meal?

Thanks for your answer.

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #301729 is a reply to message #301727] Sun, 04 October 2015 17:35
Francois LE COAT
Hi,

David Wade writes:
> Francois LE COAT wrote:
>> Please take into account that I practice C language since 1986,
>> first Kernighan and Ritchie, then ANSI C standard. Notice the
>> C standard must have evoluted because my C sources are now obsolete.
>
> Possibly need adjustments to cope with changes...

I'm afraid that if I make those changes, my Eureka 2.12 sources will
only be compatible with GNU/GCC 4, and not with any other C compiler.
On ATARI computers I have had to cope with Lattice C, Turbo C, PURE C,
GNU/GCC 2.x then 3.x, etc. I've never seen a C compiler as rigorous as
GNU/GCC 4. It produces no errors from the sources, but the binary is a
complete mess ...

I prefer to think that my sources are obsolete, and my program runs on the ST :)

Thanks for your answer.

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #301749 is a reply to message #301729] Mon, 05 October 2015 05:36
Miro Kropáček
> I'm afraid if I make that changes, my Eureka 2.12 sources will only
> be compatible with GNU/GCC 4, and not any other C compiler.
This is not true. On the contrary, most likely you'll discover hidden bugs in your code. And of course the sources will stay compatible; as mentioned, the compiler is now more strict, but that doesn't mean it doesn't adhere to the C standard.

So it boils down to whether you're willing to change your code or be stuck with 3.x forever. Obsolete sources require obsolete compilers.
Re: GNU/GCC optimizing [message #301846 is a reply to message #301728] Tue, 06 October 2015 12:38
Originally posted by: Michael Schwingen

On 2015-10-04, Francois LE COAT <lecoat@atari.org> wrote:
>
> Well, the same recipe should at least give the same meal.

Not if the recipe is unclear, like "take a spoonful of whatever red
ingredient you find in the fridge".

> What should I think about GNU/GCC 4 compared to GNU/GCC 2 and 3,
> PURE C 1.1 and other compilers, building my C program Eureka 2.12,
> when the produced binary is such a messy meal ?

That your source is a messy meal, when judged by the current C standards
that the compiler implements - for good reason (it gives much better
performance when compiling standard-compliant sources). And no, the
compiler is working as expected (by the standard) - it is giving you one
kind of correct result, just not the one you want.

If you insist on keeping your source code as it is, you are limited to old,
badly-optimizing compilers that by chance give the results you want.

Otherwise, you need to look at *where* the problems are (as I said,
compiling with full warnings enabled, then understanding and removing
them, is a good start, but deeper debugging may be required). The links
I posted should provide a good start on which constructs are to be
avoided when using modern compilers.

cu
Michael
Re: GNU/GCC optimizing [message #301877 is a reply to message #301846] Tue, 06 October 2015 15:51
Francois LE COAT
Hi,

Michael Schwingen writes:
> Francois LE COAT wrote:
>> Well, the same recipe should at least give the same meal.
>
> Not if the recipy is unclear, like "take a spoonful of whatever red
> ingredient you find in the fridge".
>
>> What should I think about GNU/GCC 4 compared to GNU/GCC 2 and 3,
>> PURE C 1.1 and other compilers, building my C program Eureka 2.12,
>> when the produced binary is such a messy meal ?
>
> That your source is a messy meal, when judged by the current C standards
> that the compiler implements - for good reason (it gives much better
> performance when compiling standard-compliant sources). And no, the compiler
> is working as expected (by the standard) - it it giving you one kind of
> correct result, just not the one you want.
>
> If you insist on keeping your source code as it is, you are limited to old,
> badly-optimizing compilers that will by chance get the results you want.
>
> Otherwise, you need to take a look *where* the problems are (as I said,
> compiling with full warnings enabled, understanding and removing them is a
> good start, but deeper debugging may be required). The links I posted should
> provide a good start to understand what constructs are to be avoided when
> using modern compilers.

I think the GNU/GCC 4 compiler is too restrictive. There are a lot of
other C compilers on ATARI computers - I tested many of them - which
are not so rigorous. If you want to eliminate any chance of a
misunderstanding with the previous generation of compilers, you can
use the -pedantic option. I think GNU/GCC 4 is naturally "pedantic".

The problem is that this makes a large amount of old C code "obsolete".
Obsolescence is the worst of the catastrophes plaguing the computing
industry, because it hands better profits to the giants of this
lucrative business.

I'm surprised that the GNU foundation is encouraging that kind of business.

Thanks for your answer.

Regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #301945 is a reply to message #301877] Wed, 07 October 2015 02:55
Originally posted by: Michael Schwingen

On 2015-10-06, Francois LE COAT <lecoat@atari.org> wrote:
>
> I think the GNU/GCC 4 compiler is too restrictive. There's a lot of
> other C compilers on ATARI computers, I tested many of those, which are
> not so rigorous.

Then have a look at gcc 5, which has improved optimization again - gcc 4.0
is now 10 years old.

On small ARM systems, I have seen big improvements in code size by using
the link-time optimization feature of gcc 5.

cu
Michael
Re: GNU/GCC optimizing [message #302110 is a reply to message #301749] Thu, 08 October 2015 15:31
Francois LE COAT
Hi MiKRO,

Miro Kropáček writes:
>> I'm afraid if I make that changes, my Eureka 2.12 sources will only
>> be compatible with GNU/GCC 4, and not any other C compiler.
>
> This is not true. On the contrary, most likely you'll discover hidden bugs in your code. And of course the source will stay compatible, as mentioned, the compiler is now more strict but that doesn't mean it doesn't adhere to the C standard.
>
> So it boils down to the fact whether you're willing to change your code or you're stuck with 3.x forever. Obsolete sources require obsolete compilers.

Well, I remind you that I'm apparently not able to use GNU/GCC 4
to build my Eureka 2.12 program. The great advantage of GNU/GCC 3
is that it understands my sources, producing warnings that it can
tolerate. The other advantage is that it includes the 16-bit
libraries that are mandatory to build my software, for compatibility
with PURE C and, furthermore, binary compatibility with the ATARI ST.

Should I abandon the opportunity to build ATARI ST software as a
sacrifice to the modernity of GNU/GCC 4, just because it's new? What
improvements does it bring to the m68k-atari-mint target? I suspect
there are more drawbacks than improvements with GNU/GCC 4.

Why doesn't it include the 16-bit libs, breaking backward compatibility?

Thanks for your answer.

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #302169 is a reply to message #302110] Fri, 09 October 2015 02:55 Go to previous messageGo to next message
Miro Kropáček is currently offline  Miro Kropáček
Messages: 23
Registered: November 2012
Karma: 0
Junior Member
> The great advantage with GNU/GCC 3
> is that it understands my sources, producing warnings that it
> can tolerates.
You're missing the point completely but whatever, your choice.

> Why doesn't it include 16bits libs, breaking backward compatibility ?
This is not the gcc maintainers' decision. The decision was made by Vincent, who disabled it. Mintlib hasn't supported -mshort for decades, and the same goes for every lib in the Sparemint RPM packages. As I said, -mshort support exists only for the FreeMiNT kernel, which has to support -mshort by definition because the TOS API is 16-bit.

So again - and this is my last post on this topic - you have obsolete sources, and that's that.
Re: GNU/GCC optimizing [message #302190 is a reply to message #302169] Fri, 09 October 2015 11:25
Francois LE COAT
Hi MiKRO,

Miro Kropáček writes:
>> The great advantage with GNU/GCC 3
>> is that it understands my sources, producing warnings that it
>> can tolerates.
> You're missing the point completely but whatever, your choice.
>
>> Why doesn't it include 16bits libs, breaking backward compatibility ?
> This is not gcc maintainers' decision. The decision has been made by Vincent who has disabled it. Mintlib doesn't support -mshort for decades, the same goes for every lib in Sparemint RPM packages. As said, -mshort support is only for the FreeMiNT kernel who has to support -mshort by definition because the TOS API is 16-bit.
>
> So again, and this is my last post on this topic, you're having obsolete sources and that's that.

I'll add that GNU/GCC 3 is suitable for building ATARI ST software.
GNU/GCC 4 is not, because it is too restrictive with its syntax, and
furthermore it doesn't provide the 16-bit libraries required for
ATARI ST software.

I don't want to lose ATARI ST compatibility, so I don't use it.

Thanks for your answers. The OS maintainers are making very strange
decisions, breaking backward compatibility with the ATARI ST!
Most of them probably never developed on ATARI ST hardware :-(

Best regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Re: GNU/GCC optimizing [message #302276 is a reply to message #302190] Sat, 10 October 2015 05:40
Originally posted by: Michael Schwingen

On 2015-10-09, Francois LE COAT <lecoat@atari.org> wrote:
> I'll add that GNU/GCC 3 is suitable to build ATARI ST softwares.
> GNU/GCC 4 is not, because it is too restrictive with its syntax,
> but furthermore it doesn't implement 16bits libraries required
> for ATARI ST softwares.

Those are not *required*. Operating system calls do not need a special "int"
size from the compiler - they should use uint16_t/uint32_t in the C library
syscall implementation, which works just fine with any gcc version, and most
OS calls I remember actually use 32-bit ints, not 16 bit. After all, the 68k
CPU is (by architecture) a 32-bit machine.

Good, *portable* C code should work as well when compiled with 32-bit ints,
although it may run a bit slower and use more memory - how that is offset by
the better optimizations of the new compiler would need to be checked in
every case.

> Thanks for your answers. The OS maintainers are making very strange
> decisions breaking the backward compatibility with the ATARI ST !

Which OS maintainers? The last OS version that was maintained was TOS 4.08.

You are talking about one gcc package - you are free to take the source code
and compile your own version with 16-bit integer support. You are also free
to take the 16 bit libraries and adapt them to compile with gcc4 (or better
gcc5) - it can be done, I use gcc4 on 16-bit targets (avr) regularly.

If no one has stepped up and done just that until now, it does not mean
it is impossible, or that someone decreed that 16-bit support should be
dropped - just that no one wanted it enough to invest their own time.

cu
Michael
Re: GNU/GCC optimizing [message #302281 is a reply to message #302276] Sat, 10 October 2015 07:07
Francois LE COAT
Hi,

Michael Schwingen writes:
> Francois LE COAT wrote:
>> I'll add that GNU/GCC 3 is suitable to build ATARI ST softwares.
>> GNU/GCC 4 is not, because it is too restrictive with its syntax,
>> but furthermore it doesn't implement 16bits libraries required
>> for ATARI ST softwares.
>
> Those are not *required*. Operating system calls do not need a special "int"
> size from the compiler - they should use uint16_t/uint32_t in the C library
> syscall implementation, which works just fine with any gcc version, and most
> OS calls I remember actually use 32-bit ints, not 16 bit. After all, the 68k
> CPU is (by architecture) a 32-bit machine.
>
> Good, *portable* C code should work as well when compiled with 32-bit ints,
> although it may run a bit slower and use more memory - how that is offset by
> the better optimizations of the new compiler would need to be checked in
> every case.
>
>> Thanks for your answers. The OS maintainers are making very strange
>> decisions breaking the backward compatibility with the ATARI ST !
>
> Which OS maintainers? The last OS version that was maintained was TOS 4.08.
>
> You are talking about one gcc package - you are free to take the source code
> and compile your own version with 16-bit integer support. You are also free
> to take the 16 bit libraries and adapt them to compile with gcc4 (or better
> gcc5) - it can be done, I use gcc4 on 16-bit targets (avr) regularly.
>
> If noone stepped up and did just that until now, it does not mean it is
> impossible or someone decreed that 16-bit support should be dropped - just
> that noone wanted it enough to invest their own time.

Sorry, but we're not speaking of contemporary developments, but of
implementation choices made 30 years ago, when the GNU foundation had
just been created and there was no Internet. It's very easy for today's
developers to make good implementation choices (use of integer types,
etc.), taking into account the errors of past programmers. But if
today's programmers develop correctly in the C language, it's because
developers like me made errors in the past. The problem is that programs
like Eureka 2.12 are now obsolete. They represent a large part of the
software developed on ATARI ST hardware.

You would have been very clever if you had given me this advice
30 years ago. For the time being, there's nothing else I can do,
except be very sad that my C sources are not supported anymore, alas.

Thanks for your answer.

Regards,

--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/