
I know to check for and use asserts, and to carefully examine any assembly components, but I didn't know if anyone out there has a fairly comprehensive or industry-standard checklist of specific things to look at when porting to 64-bit. I'm mostly interested in C and C++.

Note: there are some really helpful answers here; I'm just leaving the question open for a couple of days in case some folks only check questions that don't have accepted answers.

RobotHumans
Microsoft has quite good documentation on this: [Programming Guide for 64-bit Windows](http://msdn.microsoft.com/en-us/library/bb427430%28VS.85%29.aspx) – Giorgi Jan 31 '11 at 19:15

3 Answers


Not a comprehensive list, but some of the things I ran into when converting a large C++ codebase:

Libraries, of course. Find all of them, and check what 64-bit support they offer. In my case, we decided to move to open-source libraries for JPEG handling and data compression, and so we compile our own. Commercial libraries may give you more problems.

Pointer conversions. On 32-bit systems, usually sizeof(int) == sizeof(void *), and lots of people work on that basis. On a 64-bit system, a round trip from pointer to int and back will typically lose the upper half of the pointer. Your compiler should be able to help you find these.
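As a minimal sketch of that failure mode (the variable names are illustrative), the cast through int truncates on a 64-bit target, while intptr_t is guaranteed wide enough for the round trip:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    int x = 42;
    int *p = &x;

    // The old 32-bit habit (int bad = (int)p; int *q = (int *)bad;)
    // throws away the upper 32 bits of the pointer on a 64-bit target.

    // Portable round trip: intptr_t is defined to be wide enough for a pointer.
    std::intptr_t i = reinterpret_cast<std::intptr_t>(p);
    int *q = reinterpret_cast<int *>(i);

    std::printf("%d\n", *q); // prints 42 on both 32- and 64-bit builds
}
```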

Data object size values. This would be a good time to go through and change all size values to size_t, if you can find them, and pointer differences to ptrdiff_t. This isn't necessarily going to be a problem, but since the size of our data was what pushed us to 64-bit in the first place, I wanted to make sure it could grow as much as possible.
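A short sketch of the two types in use (the functions here are illustrative, not from the answer):

```cpp
#include <cstddef>
#include <vector>

// size_t matches the platform's address width; an int counter would stop
// working once the container outgrows INT_MAX elements on a 64-bit build.
std::size_t count_zeros(const std::vector<char> &data) {
    std::size_t n = 0;
    for (std::size_t i = 0; i < data.size(); ++i)
        if (data[i] == 0) ++n;
    return n;
}

// Pointer subtraction yields ptrdiff_t, which is 64 bits wide on 64-bit
// targets; storing the result in an int silently truncates past ~2 GiB.
std::ptrdiff_t span(const char *first, const char *last) {
    return last - first;
}
```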

Object size. Any struct or class you've got with pointer members is going to grow, as will anything where you change an int to size_t or ptrdiff_t. Again, this is unlikely to be a problem in itself, but you do want to check your I/O. If you're doing any binary I/O, you really need to examine what's being read and written. If you're sending or receiving data through a fixed protocol, you need to look at it closely. Do a grep for union and check each one for size and layout changes.
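To make the growth concrete, here is a hypothetical sketch (the struct names and fields are mine, not from the answer); the sizes quoted assume common 32-bit and LP64 ABIs:

```cpp
#include <cstdint>

// Typically 12 bytes in a 32-bit build but 24 bytes under LP64: the
// pointer doubles to 8 bytes and the alignment padding grows with it.
// Any file written with a raw fwrite(&r, sizeof r, 1, f) is therefore
// no longer readable across the port.
struct Record {
    char tag;
    void *payload;
    int count;
};

// For a fixed protocol, pin the field widths explicitly and assert the
// layout, so a future port breaks the build instead of the data.
struct WireRecord {
    std::uint8_t tag;
    std::uint32_t payload_offset; // a table offset instead of a raw pointer
    std::int32_t count;
};
static_assert(sizeof(WireRecord) == 12, "wire layout changed");
```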

David Thornley

One thing to be careful with is third-party libraries you depend on. Some may not have been ported to x64, and others ship separate versions for x86 and x64 projects, so you will have to do some extra work to manage dependencies and make sure you're building against the correct library for your target platform.


Do you have any thoughts on moving depended-upon packages into the source tree to minimise this type of problem? For example, mplayer does this. I have noticed that doing it this way often mangles some things and leads to a complicated configure script, but it seems like a valid approach.

That sounds fine to me, especially if you can automate most if not all of your dependency management for builds.

Last time I ran into this, we did something like this: we kept third-party dependencies on a share (and/or in source control; we tried both ways), and developers maintained their local environment using provided batch scripts to set it up with the right versions. The code would be written with a reference to a particular DLL name in mind. We worked in .NET, so, for example, our project files would reference Whatever.dll, and when the build was done, we had a script that would sub the 64-bit version in its place. If I recall correctly, the build machine used the same scripts the devs used to grab the right library versions. Project files didn't need to be changed, since the DLL names didn't change.

Where I'm at now, we edit csproj files to include platform-specific references. Instead of the normal relative path to the dependency that Visual Studio inserts, we use the $(Platform) variable, controlled through project configurations: in an x64 project we'll have a "win64" configuration, so $(Platform) is replaced with "win64" and the DLLs are pulled from the "win64" subfolder of the folder that contains all our dependencies.
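A hypothetical csproj fragment showing the idea (the libs folder layout and assembly name are illustrative, not from the answer):

```xml
<ItemGroup>
  <Reference Include="ThirdParty.Whatever">
    <!-- $(Platform) expands to e.g. "win64", selecting the matching binaries -->
    <HintPath>..\libs\$(Platform)\ThirdParty.Whatever.dll</HintPath>
  </Reference>
</ItemGroup>
```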

Adam Lear

Testing is one of the most important things to consider. Most software testing is based on "grandfathering": functionality that used to work and has not changed is assumed to be OK for this release. What you will find when moving to 64-bit (unless your code is particularly good) is that issues which previously had no symptoms start to surface, sometimes in weird and unexplained ways: buffer overflows that were previously "contained", assumptions about data sizes causing new buffer overflows, dodgy pointer math, etc.
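A hypothetical sketch of one such latent bug (the struct and names are mine, not from the answer):

```cpp
#include <cstring>

struct Packet {
    char id[4];
    int checksum;
};

// On a 32-bit build (sizeof(long) == 4) this memcpy exactly fills 'id' and
// every existing test passes. On an LP64 target (64-bit Linux/macOS),
// sizeof(long) is 8, so the same line writes four bytes past 'id' and
// silently corrupts 'checksum': a brand-new failure with no code change.
void stamp(Packet &p, long id_value) {
    std::memcpy(p.id, &id_value, sizeof(id_value));
}
```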

The amount of work will largely depend on the quality of the code base; in one case it took a very long time to get from "finished" to "customer acceptance".

mattnz