On Beta Testing

Get help with compiling or installing the game, and discuss announcements of new official releases.

Moderator: Forum Moderators

User avatar
Pentarctagon
Project Manager
Posts: 5565
Joined: March 22nd, 2009, 10:50 pm
Location: Earth (occasionally)

Re: On Beta Testing

Post by Pentarctagon »

Ah, yeah, that would explain it. I used the .scon_option_cache in loonycyborg's dependencies, and that had:

Code:

default_targets = 'test,wesnoth,wesnothd'
Also, I think I found the source of the problem with the bzip2/zlib libraries not being named correctly. The patch posted there is more recent than the last release of Boost, though, so it wouldn't have been incorporated yet.

edit - Is there any reason that I would *need* to use boost 1.46.1? I've been using it since that was the version loonycyborg seemed to be using; however, if I use 1.54.0 I get no warnings during the build except the "no #fifodir" one, which I already know can be ignored.
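Incidentally, an SCons option cache is just a file of Python-style assignments, so its contents are easy to inspect. A minimal sketch (the key name is taken from the snippet above; the surrounding file layout is an assumption):

```python
# An SCons option cache is a plain file of Python assignments, e.g.:
#     default_targets = 'test,wesnoth,wesnothd'
# so its values can be recovered by exec()-ing the text into a dict.
cache_text = "default_targets = 'test,wesnoth,wesnothd'"

cache = {}
exec(cache_text, {}, cache)
targets = cache["default_targets"].split(",")
print(targets)  # → ['test', 'wesnoth', 'wesnothd']
```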
99 little bugs in the code, 99 little bugs
take one down, patch it around
-2,147,483,648 little bugs in the code
User avatar
iceiceice
Posts: 1056
Joined: August 23rd, 2013, 2:10 am

Re: On Beta Testing

Post by iceiceice »

Pentarctagon wrote:Is there any reason that I would *need* to use boost 1.46.1?
I'm pretty sure there is no such reason. We only officially require Boost 1.36, and that is only for compatibility with the Pandora, where recent versions of Boost are not readily available. I have always built with the most recent version of Boost available (the one with the little "sun" icon next to it in the package manager), which is currently 1.54.0.
sylph
Posts: 23
Joined: October 4th, 2013, 11:37 am

Re: On Beta Testing

Post by sylph »

It sounds like you are taking C++ code, using a compiler+assembler+linker to convert it into 32-bit machine language, and then interpreting that 32-bit machine language into 64-bit machine language at run time so it will run on a 64-bit x86 processor. If that is the case, why not take the x86 assembly output from a plain compiler and use a 64-bit x86 assembler+linker to assemble it into 64-bit x86 machine code?
If I spoke the truth, they would put me in a straitjacket. So, I left the society.
User avatar
Pentarctagon
Project Manager
Posts: 5565
Joined: March 22nd, 2009, 10:50 pm
Location: Earth (occasionally)

Re: On Beta Testing

Post by Pentarctagon »

What?
99 little bugs in the code, 99 little bugs
take one down, patch it around
-2,147,483,648 little bugs in the code
User avatar
iceiceice
Posts: 1056
Joined: August 23rd, 2013, 2:10 am

Re: On Beta Testing

Post by iceiceice »

sylph wrote:... then taking 32-bit machine language and interpreting it into 64-bit machine language at run time so it will run on a 64-bit x86 processor.


Hmm, I could be wrong, but I don't think this is exactly what happens... an x86-64 processor has a backwards-compatibility mode that executes 32-bit code natively; there is no "interpreting" going on, afaik. See also the diagram here: http://en.wikipedia.org/wiki/X86-64#Long_mode
sylph wrote:If that is the case, why not take the x86 assembly output from a plain compiler and use a 64-bit x86 assembler+linker to assemble it into 64-bit x86 machine code?
I *think* we don't currently build a special 64-bit executable for Windows releases... but you are welcome to do this for yourself. Idk what technical difficulties are involved, though; you might need to recompile all the libraries for 64-bit.

Edit: Also, I don't know what advantages, if any, there are to compiling Wesnoth in 64-bit mode. Afaik it won't run faster; it just helps with addressing more memory at once. But Wesnoth is not particularly memory-intensive, so I'm not sure that justifies the effort.
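For what it's worth, the practical difference comes down to pointer width; a quick standard-library check of the process you are currently running (nothing Wesnoth-specific):

```python
import struct
import sys

# "P" is the native pointer type: 8 bytes in a 64-bit process, 4 in 32-bit.
pointer_bits = struct.calcsize("P") * 8

# The main win of 64-bit is address space, not speed: a 32-bit process
# can address at most 2**32 bytes (4 GiB).
max_address_space_gib = 2 ** pointer_bits / 2 ** 30
print(pointer_bits, "bit process")
```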
User avatar
Pentarctagon
Project Manager
Posts: 5565
Joined: March 22nd, 2009, 10:50 pm
Location: Earth (occasionally)

Re: On Beta Testing

Post by Pentarctagon »

loonycyborg wrote:Your NLS failures are due to the absence of msgfmt.exe in %PATH% and won't affect compilation of .cpp files; it's needed only for generation of the message catalogs, the .mo files.
Forgot about this, but I suppose it's worth noting that the .mo files don't seem to come as part of a normal "git clone", so if someone did have msgfmt.exe in their PATH, it would cause the build to fail.
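The availability check involved can be mimicked with the standard library; a minimal sketch of detecting msgfmt before attempting catalog generation (the file names are illustrative, not Wesnoth's actual build logic):

```python
import shutil

# msgfmt compiles .po translation sources into binary .mo catalogs.
# If it is missing from PATH, catalog generation should be skipped
# rather than failing the whole build.
msgfmt = shutil.which("msgfmt")
if msgfmt is None:
    print("msgfmt not in PATH: skipping .mo generation (NLS disabled)")
else:
    print("would run:", [msgfmt, "-o", "wesnoth.mo", "wesnoth.po"])
```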
99 little bugs in the code, 99 little bugs
take one down, patch it around
-2,147,483,648 little bugs in the code
sylph
Posts: 23
Joined: October 4th, 2013, 11:37 am

Re: On Beta Testing

Post by sylph »

Pentarctagon wrote:What?
I am wondering about the process of compilation in general. My compilation experience consists of clicking the compile button in Eclipse for a homework assignment. I learned, in a classroom, that a compiler is something that takes an HLL and translates it into an assembly language, usually via p-code. The definition of the term 'compiler' that I learned is not the definition that is in the dictionary.

From what I learned, you should take the p-code and translate it into the different assembly languages. You are not manipulating assembly; you are manipulating C++. I am trying to figure out why in more detail.
iceiceice wrote:Hmm, I could be wrong, but I don't think this is exactly what happens... an x86-64 processor has a backwards-compatibility mode that executes 32-bit code natively; there is no "interpreting" going on, afaik. See also the diagram here: http://en.wikipedia.org/wiki/X86-64#Long_mode
I did not know about long mode. It seems long mode is more efficient than interpreting 32-bit machine code into 64-bit machine code. So, typically the hardware will execute the 32-bit machine code; the WOW64 environment can interpret the instructions.

Upon further googling:

Another technical difficulty is probably that if you took p-code and assembled it on 64-bit Windows, it would be trapped by the Common Language Runtime.

The definition of 'interpret' that I have used means monitoring the code as it is being executed and changing some of it prior to executing it. According to Wikipedia, this use of the term 'interpret' is wrong, and it should be considered a strict subset of 'just-in-time compilation'.

According to Wikipedia, 'p-code' is an alternate name for 'bytecode'. Still, I am pretty sure that the people who designed the HLLs and p-code to be platform-independent (portable) would use the term p-code for "portable code".
If I spoke the truth, they would put me in a straitjacket. So, I left the society.
User avatar
Pentarctagon
Project Manager
Posts: 5565
Joined: March 22nd, 2009, 10:50 pm
Location: Earth (occasionally)

Re: On Beta Testing

Post by Pentarctagon »

I'm pretty sure that Wesnoth isn't compiled into p-code. p-code/bytecode is more akin to the result of compiling Java than of compiling C++.
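Python itself is a handy illustration of the distinction: like Java, it compiles source to a portable bytecode that a virtual machine executes, whereas C++ is compiled straight to native machine code. For example:

```python
import dis

def add(a, b):
    return a + b

# CPython compiled `add` to bytecode the moment it was defined;
# dis shows the VM instructions rather than native machine code.
opnames = [ins.opname for ins in dis.Bytecode(add)]
print(opnames)
```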
99 little bugs in the code, 99 little bugs
take one down, patch it around
-2,147,483,648 little bugs in the code
User avatar
ancestral
Inactive Developer
Posts: 1108
Joined: August 1st, 2006, 5:29 am
Location: Motion City

Re: On Beta Testing

Post by ancestral »

Something we were discussing on IRC.

Wesnoth has a Jenkins CI server. Basically, software that automatically builds binary releases whenever there is a source code change. It sounds like there are a few VMs building for Linux and Windows right now — and theoretically one of them could build for OS X too.

I don't think it would be too hard to set up a deploy command to a GitHub repo after it's built. Beta testers would just need to clone that repo instead of the whole source codebase and git pull whenever a developer pushes up a change. Testers wouldn't need to worry about building binaries.
Wesnoth Bestiary ( PREVIEW IT HERE )
Unit tree and stat browser
Canvas ( PREVIEW IT HERE )
Exp. map viewer
User avatar
Iris
Site Administrator
Posts: 6798
Joined: November 14th, 2006, 5:54 pm
Location: Chile
Contact:

Re: On Beta Testing

Post by Iris »

ancestral wrote:Wesnoth has a Jenkins CI server. Basically, software that automatically builds binary releases whenever there is a source code change. It sounds like there are a few VMs building for Linux and Windows right now — and theoretically one of them could build for OS X too.
That would require us to run Apple OS X on a VM on non-Apple hardware, which IIRC is not allowed by its software license. Now, if you know of a suitable toolchain for cross-compiling, including all the Apple-specific libraries we need, without violating any license terms...

(In case somebody feels tempted to bring it up, “nobody will notice” isn’t a valid argument against being cautious in this case.)
Author of the unofficial UtBS sequels Invasion from the Unknown and After the Storm.
User avatar
Pentarctagon
Project Manager
Posts: 5565
Joined: March 22nd, 2009, 10:50 pm
Location: Earth (occasionally)

Re: On Beta Testing

Post by Pentarctagon »

Could that cause any issues with bandwidth if a lot of people start downloading the binary for every change?
99 little bugs in the code, 99 little bugs
take one down, patch it around
-2,147,483,648 little bugs in the code
User avatar
Iris
Site Administrator
Posts: 6798
Joined: November 14th, 2006, 5:54 pm
Location: Chile
Contact:

Re: On Beta Testing

Post by Iris »

Yes. That’s also why we keep our Git repository clone tarball hidden from plain view.
Author of the unofficial UtBS sequels Invasion from the Unknown and After the Storm.
User avatar
iceiceice
Posts: 1056
Joined: August 23rd, 2013, 2:10 am

Re: On Beta Testing

Post by iceiceice »

Warning: Wild speculation will now ensue.

What about setting up scons to work on Macs, the same way we got it to work on Windows? Don't Macs come with gcc anyway? I would think that would make it a pretty easy alternative to Xcode. (Of course, it all depends on the available libraries, about which I have no clue.)

I only bring it up because iirc some on the IRC channel were of the opinion that Xcode is not sufficiently user-friendly.
User avatar
ancestral
Inactive Developer
Posts: 1108
Joined: August 1st, 2006, 5:29 am
Location: Motion City

Re: On Beta Testing

Post by ancestral »

ancestral wrote:…and theoretically one of them could build for OS X too.
shadowm wrote:Now, if you know of a suitable toolchain for cross-compiling including all the Apple-specific libraries we need without violating any license terms…
iceiceice wrote:What about setting up scons to work on Macs, the same way we got it to work on Windows? Don't Macs come with gcc anyway? I would think that would make it a pretty easy alternative to Xcode. (Of course, it all depends on the available libraries, about which I have no clue.)
In case I wasn't clear: yes, cross-compiling should be possible. (Of course, it wouldn't be easy the first time.) There are other open-source projects which do this with a C++ code base; I'm fairly certain OpenTTD is one example. The outdated CompilingWesnothOnMacOSX wiki page also references building without Xcode.

Mac version aside, a Jenkins deployment to a GitHub repo for Linux or Windows builds would be very doable. If you didn't want every build deployed on every SCM change, you could set up a deploy script to run on a schedule instead (and then we'd have nightlies!), which would make a huge difference for testers. (Since it's version control, you should just be able to push up the changes each night instead of sending the whole project each time.)
Wesnoth Bestiary ( PREVIEW IT HERE )
Unit tree and stat browser
Canvas ( PREVIEW IT HERE )
Exp. map viewer
optimother
Posts: 76
Joined: July 12th, 2014, 4:09 am

Re: On Beta Testing

Post by optimother »

clang is the default compiler on Macs and is also built with cross-compilation in mind: http://clang.llvm.org/docs/CrossCompilation.html I haven't tried it for this specifically, though; I've only used LLVM for other things and heard a lot about clang.
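Per those LLVM docs, clang selects its output architecture with a target triple (arch-vendor-os) rather than needing a separate cross-compiler binary. A sketch of what such an invocation might look like; note that a real OS X cross-build would additionally need the Apple SDK/sysroot and libraries, which this omits:

```python
def clang_cross_command(source, triple="x86_64-apple-darwin"):
    """Assemble a hypothetical clang cross-compile invocation (not executed)."""
    obj = source.rsplit(".", 1)[0] + ".o"
    # -target <triple> is clang's cross-compilation switch.
    return ["clang++", "-target", triple, "-c", source, "-o", obj]

print(clang_cross_command("wesnoth.cpp"))
```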