
Programming |OT| C is better than C++! No, C++ is better than C

I told a joke in my sprint planning session.

The developers were having a discussion about Maven and I said:

"Maven? I use Maven."

They looked at me funny because I'm a PM, I don't know shit about coding. Then I said:

"Maven Beacus Teaches Typing, I use it all the time."

That's funny right? Right?

BRB posting this on project managers from hell dot com
 
Hey guys,

I'm looking for someone to help me out with a couple of things regarding a Web-based php / java racing game (like a manager type, not actual user-racing) I've been working on for months.

Heads up though, I'm a complete noob and self-taught guy regarding this so I probably do some things very wrong but I'm always open to suggestions.

Since I have a ton of different questions regarding several aspects of the game I'm stuck on, I'd rather take it to chat / PMs as I don't want to spam the thread about it but any help would be welcome.

Thanks!

Just post your questions here, you're much more likely to get a faster response, people might see specific things they can help out with even if they don't realise right now that they can, and other people reading can maybe learn something too. Don't worry about spamming the thread, that's sort of what it's for.
 

Staab

Member
Just post your questions here, you're much more likely to get a faster response, people might see specific things they can help out with even if they don't realize right now that they can and other people reading can maybe learn something too. Don't worry about spamming the thread, that's sort of what it's for.


Okay, I'll try but it will be confusing without a lot of context :p
Here goes nothing:

I'm using a PHP framework for a management game I found way back (DK Framework, an old thing from like 2007), which I heavily modified (savagely, at that, with almost no knowledge of PHP, just a lot of trial and error) to suit my needs.
Now my end goal is to have a Formula 1 simulation game, where you manage your team and hire staff, build a car, etc... to compete against other players.
The end result of that is a race that occurs twice a week where you plan your strategy ahead of time and get to see it unfold in real time (without being able to interact with it):
this is how it looks and works right now.

Now, this is based on something I found online, free to use, called "Path Animator" (see here for a demo page and here to see how it works).
The gist of it is that it uses a path drawn in vector format (SVG), which I then apply over a background image (a race track) and then send that path to the Javascript, with an indication of time (in seconds) and, somehow, I managed to turn that into several cars taking input from an array and it builds a race... (don't ask me how I made it with almost no knowledge of java either...).

My first (of many) questions is this:
could anyone take a look at the Path Animator (or preferably my modified version of it) and tell me if it can be adjusted to a) use images (like a little car) instead of a CSS-rendered circle and b) use more than one vector path (for the pit-lane, for example) or, more importantly, if I can split it up into 3 sectors per track, so as to make the racing more dynamic (variations in sector times will be much bigger than overall variations per lap).
That would mean sending 3 vector paths to the JS with 3 timings, through an array of data.

It must be hugely confusing, hence why I would have preferred to chat with someone knowledgeable but thanks for the read :p
 
For the formula car image, you could use the CSS background-image property to display the car instead of the circle. Something along the lines of:

Code:
.walker {
  position: absolute;   /* so the animation can move it along the path */
  z-index: 1;           /* keep it above the track background */
  width: 12px;
  height: 12px;
  background-image: url("relative/path/to/formulacar.png");
  background-repeat: no-repeat;
  display: block;
}

Oh yeah, and JavaScript !== Java :)
 

upandaway

Member

Granadier

Is currently on Stage 1: Denial regarding the service game future
Oh yeah, and JavaScript !== Java :)

What!?

 
But...it's in the name!

Has anybody had any experience using Prolog? We're using it for a class I'm in now. We probably won't get too in depth with it, but I'd like to know more about it from people with experience since it's so weird.

First order logic. It's interesting from a theoretical point of view and is still regarded as an important part of a computer science education, but has seen little practical use outside expert systems.

It strikes newcomers as weird if they first come to it from an imperative paradigm (most programming languages are imperative: sequences of instructions). Before Prolog, ideally a student is first exposed to a diverse range of paradigms, especially functional programming where a declarative style comes naturally.
 

Krejlooc

Banned
does anybody publish a printed version of the OpenGL 4.5 reference manual? I'm going to be going through an extended period without internet at home due to a house fire and I need to refer to the reference manual quite a bit. I have a super old version of the reference manual from 1995 for release 1, but it's changed considerably. Typically, I'll just pull up the man pages but that won't be an option short term.

I've been looking online - I can find plenty of books that walk you through OpenGL but nothing that is a straight-up reference manual. Any idea if such a book has been published?
 
First order logic. It's interesting from a theoretical point of view and is still regarded as an important part of a computer science education, but has seen little practical use outside expert systems.

It strikes newcomers as weird if they first come to it from an imperative paradigm (most programming languages are imperative: sequences of instructions). Before Prolog, ideally a student is first exposed to a diverse range of paradigms, especially functional programming where a declarative style comes naturally.
Functional programming is coming after Prolog lol. I guess what seems so weird about it is how useless it feels. AI is a good fit, but that's it?

does anybody publish a printed version of the OpenGL 4.5 reference manual? I'm going to be going through an extended period without internet at home due to a house fire and I need to refer to the reference manual quite a bit. I have a super old version of the reference manual from 1995 for release 1, but it's changed considerably. Typically, I'll just pull up the man pages but that won't be an option short term.

I've been looking online - I can find plenty of books that walk you through OpenGL but nothing that is a straight-up reference manual. Any idea if such a book has been published?

I don't have an answer, but my condolences to you after a house fire :(
 

Krejlooc

Banned
I don't have an answer, but my condolences to you after a house fire :(
Thanks, I lost absolutely nothing in the fire so I'm lucky, but it burned through a bunch of utility connections. I spoke with AT&T and they won't be able to get some people out to run new lines for like 2-3 weeks, which is ridiculous. In the meantime, I can't really just stop working, especially when I'm about to switch my project over from DX to OGL. I basically need the reference manual handy.

I can always pull it up on my phone, but that is really annoying. I've always wanted a printed reference manual anyways, this is a nice exigence for me to actually buy one... but it seems nobody prints one :(
 
Thanks, I lost absolutely nothing in the fire so I'm lucky, but it burned through a bunch of utility connections. I spoke with AT&T and they won't be able to get some people out to run new lines for like 2-3 weeks, which is ridiculous. In the meantime, I can't really just stop working, especially when I'm about to switch my project over from DX to OGL. I basically need the reference manual handy.

I can always pull it up on my phone, but that is really annoying. I've always wanted a printed reference manual anyways, this is a nice exigence for me to actually buy one... but it seems nobody prints one :(

Small program to pull text from webpages into a pdf then local university print shop/kinkos? Not really elegant, but probably cheaper and you still get a hard copy.
 

Ledbetter

Member
Any good language to start learning how to make user interfaces?

I already finished my first year at my college and I have only learned C and C++. I have a class where I have to make a project using File Structures (which I find slightly easier than Data Structures), and while my professor is ok with doing the project on the console, I want to see if I can make it with user interfaces.

I've heard about Java and C# and I think they're quite similar, but I don't know where to start.
 
Functional programming is coming after Prolog lol. I guess what seems so weird about it is how useless it feels. AI is a good fit, but that's it?

Prolog is extremely useful for learning logic programming, which is itself useful for several types of AI, yes, but also very useful for NLP systems / any rule-based systems, and very, very good for constraint satisfaction problems. There's something amazing about encoding a couple of simple rules and just having the result of the computation emerge from unification.

Most production systems that are/were built with Prolog use it as a logic core, and the rest of the more standard stuff tends to be implemented in something else. It was a fairly common pattern to run a Prolog instance as a service, write a UI or some other scaffolding around it in something else, and hit the Prolog service for solutions to the core problem you were solving. Sort of like a database, only using inference for computations rather than retrieving relational data.

Prolog might be largely outdated at this point but the core principles of logic programming with facts and rules are still very useful for many domains. There are many unification engines out there for most popular languages that are well worth checking out, like Pyke (Python), core.logic (Clojure) or Prolog for Java. There are also bindings for just about any other language you can think of for interacting with different Prolog distributions; it's still something people can use, it's just a tool most people don't really understand.
 

survivor

Banned
Functional programming is coming after Prolog lol. I guess what seems so weird about it is how useless it feels. AI is a good fit, but that's it?
That sounds like my school: we had a class that taught Prolog in the first half and then Scheme in the second half of the semester.

I liked Prolog for how different it was; the backtracking feature can get confusing, but it's sometimes awesome in how it works. Functional programming with Scheme, on the other hand, was more enjoyable. Sure, Scheme becomes bracket hell when the assignments get a little more complex, but I much prefer to think in terms of everything being a function, which is challenging but fun too.
 

Haly

One day I realized that sadness is just another word for not enough coffee.
I learned a little PROLOG for my AI class.

Made an expert system that chooses DOTA heroes for you depending on your preferences. It was kind of neat.
 

Chris R

Member
I learned a little PROLOG for my AI class.

Made an expert system that chooses DOTA heroes for you depending on your preferences. It was kind of neat.

༼ つ ◕_◕ ༽つ GIVE Program ༼ つ ◕_◕ ༽つ
 

Haly

One day I realized that sadness is just another word for not enough coffee.
that's more exciting than all the boring assignments we had to do for that class in prolog
My favorite assignment from that class was creating an AI for playing Othello. Mine scored like a 99% win rate versus a completely random strategy, but after going over it a few more times, I realized it probably broke some rules in the process.

teehee
༼ つ ◕_◕ ༽つ GIVE Program ༼ つ ◕_◕ ༽つ

I don't remember how to get this running but here's the zip file for the entire assignment:
http://www.mediafire.com/download/5m27j6bejfbzgpl/expert.zip
 

Saprol

Member
that's more exciting than all the boring assignments we had to do for that class in prolog
My functional programming class was split between Haskell and Prolog. For Prolog, we wrote a Sudoku solver and an AI assistant for the board game Clue. Sadly, I didn't iron out all the bugs so I couldn't track all the game data when we had to play each other on the last day of lectures.
 

Slavik81

Member
I told a joke in my sprint planning session.

The developers were having a discussion about Maven and I said:

"Maven? I use Maven."

They looked at me funny because I'm a PM, I don't know shit about coding. Then I said:

"Maven Beacus Teaches Typing, I use it all the time."

That's funny right? Right?
I lol'd.
 

Granadier

Is currently on Stage 1: Denial regarding the service game future
༼ つ ◕_◕ ༽つ GIVE Program ༼ つ ◕_◕ ༽つ

My favorite assignment from that class was creating an AI for playing Othello. Mine scored like a 99% win rate versus a completely random strategy, but after going over it a few more times, I realized it probably broke some rules in the process.

teehee


I don't remember how to get this running but here's the zip file for the entire assignment:
http://www.mediafire.com/download/5m27j6bejfbzgpl/expert.zip

99.9% chance of Techies.
 
does anybody publish a printed version of the OpenGL 4.5 reference manual? I'm going to be going through an extended period without internet at home due to a house fire and I need to refer to the reference manual quite a bit. I have a super old version of the reference manual from 1995 for release 1, but it's changed considerably. Typically, I'll just pull up the man pages but that won't be an option short term.

I looked for PDFs but as I don't know the topic at all this may not be much use:

OpenGL 4.5 Specification (Core Profile)

https://www.opengl.org/registry/doc/glspec45.core.pdf

If you want to print it, it may be costly because it runs to 825 pages.
 
Are there any online courses or tools for learning AI coding? My program didn't offer it so I want to learn it on my own time.

Sebastian Thrun and Peter Norvig are two giants of modern AI. In 2011 I did the original free Stanford online course on which this is based. The Udacity version is also free. Thrun founded Udacity, the company that hosts it. It's a good introduction to the field and will give you practical tools and techniques useful to a typical software developer.

https://www.udacity.com/course/cs271
 

Sharp

Member
IIRC it wasn't the lack of traditional inheritance which bugged me the most, but this. When you expect stuff to just work because of polymorphism, you need (or needed?) to jump through all kinds of hoops.
I have gotten quite proficient at figuring out how to produce memory errors from seemingly innocuous situations the borrow checker complains about :) Anyway, after a while you internalize the rules and it stops being an issue (that is, after a while you usually only see borrow check errors when they're saving you).
 
The way you frame it makes it seem a lot more useful than my professor did. Well, thanks for the info and I look forward to seeing what comes of it.

Worst case scenario you learn a new paradigm for approaching problems and you can learn to identify where and when a logic programming solution might be a superior approach in the real world.

Are there any online courses or tools for learning AI coding? My program didn't offer it so I want to learn it on my own time.

These aren't online resources really, but two books that are worth checking out, and that I recommend from personal experience, are Artificial Intelligence: A Modern Approach, which is mostly theory oriented but covers most topics in enough detail and clarity that you can deep dive more on your own, and Artificial Intelligence for Games, which is very practically oriented but focused on a smaller subset of the possible topics (mainly decision making/planning and path planning type stuff).
 

JesseZao

Member
Sebastian Thrun and Peter Norvig are two giants of modern AI. In 2011 I did the original free Stanford online course on which this is based. The Udacity version is also free. Thrun founded Udacity, the company that hosts it. It's a good introduction to the field and will give you practical tools and techniques useful to a typical software developer.

https://www.udacity.com/course/cs271

Worst case scenario you learn a new paradigm for approaching problems and you can learn to identify where and when a logic programming solution might be a superior approach in the real world.



These aren't online resources really, but two books that are worth checking out, and that I recommend from personal experience, are Artificial Intelligence: A Modern Approach, which is mostly theory oriented but covers most topics in enough detail and clarity that you can deep dive more on your own, and Artificial Intelligence for Games, which is very practically oriented but focused on a smaller subset of the possible topics (mainly decision making/planning and path planning type stuff).

Will check this stuff out, thanks!
 

Husker86

Member
Has anyone made an Android app using the Toolbar (in place of the recently deprecated ActionBar)?

I'm not able to get any of the shadow edge effects on the different "layers". If you select "Navigation Drawer Activity" when making a new project, the generated code uses the deprecated ActionBar, but it has the shadow effects. I made a Navigation Drawer Activity from scratch, following a tutorial, and it looks fine except for lacking the shadows.

I would say it has to do with the theme (can't use Material theme unless you're only building for API 21), but Google's generated activity that uses the ActionBar uses AppCompat Theme and it has the shadow.

Just a bit confused since I assume Google wants people to move over to Toolbar, but it seems like these effects are lost when doing so. The Play Store certainly looks like it uses Toolbar in place of ActionBar, but it has the shadow effects.

Hope that made sense.
 
The way you frame it makes it seem a lot more useful than my professor did. Well, thanks for the info and I look forward to seeing what comes of it.

Put me in the utterly useless category. Haskell has a lot of the same paradigms and is well supported and actively developed, with better library support, a rich community, more learning materials, etc.
 

Nesotenso

Member
In C, shouldn't I be able to get the max value for an unsigned long integer if I multiply 1024L cubed by 4L (4294967295)? But I seem to get an overflow warning when I print it out.
 
In C, shouldn't I be able to get the max value for an unsigned long integer if I multiply 1024L cubed by 4L (4294967295)? But I seem to get an overflow warning when I print it out.

Code:
#include <limits.h>   /* UINT_MAX */
#include <stdint.h>   /* uint32_t */

const uint32_t max_value = UINT_MAX;

4 * 1024^3 = 4294967296

UINT_MAX = 4294967295 = 4 * 1024^3 - 1
 
In C, shouldn't I be able to get the max value for an unsigned long integer if I multiply 1024L cubed by 4L (4294967295)? But I seem to get an overflow warning when I print it out.

You're forgetting 0. While 32 bits give you a sequence of 2^32 unsigned integers, that sequence starts at 0. So the max int is 1 less than you expected. It cannot be expressed as a multiple of 2 because it isn't an even number. You can't even get to it by subtracting 1 from your computation above (though that would be a valid mathematical identity) because of the same intervening overflow when using 32-bit unsigned arithmetic.
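
To make the off-by-one concrete, here's a tiny sketch (assuming a platform where unsigned int is 32 bits, and using a wider type to hold the true 2^32):

Code:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* 2^32 itself doesn't fit in 32 bits, so compute it in a wider type. */
    unsigned long long full = 4ULL * 1024 * 1024 * 1024;

    printf("%llu\n", full);   /* 4294967296 */
    printf("%u\n", UINT_MAX); /* 4294967295, i.e. 2^32 - 1 */
    return 0;
}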
 
Put me in the utterly useless category. Haskell has a lot of the same paradigms and is well supported and actively developed, with better library support, a rich community, more learning materials, etc.

Must learn Haskell. In principle it's a perfect fit for an old Schemer like me. I miss the brackets, though.
 
Put me in the utterly useless category. Haskell has a lot of the same paradigms and is well supported and actively developed, with better library support, a rich community, more learning materials, etc.

In terms of the language, sure, Prolog is effectively dead for practical purposes. There are some core concepts of LP, though, that do not map directly to FP, and those are still worth learning.

Must learn Haskell. In principle it's a perfect fit for an old Schemer like me. I miss the brackets, though.

Have you looked into Clojure? It's basically Scheme with a good library of immutable data structures and better support for things like destructuring and pattern matching.
 

injurai

Banned
Haskell is neat. Learning it in my free time right now. OCaml, though, seems to be even more powerful, albeit less strict. A lot of languages are implementing functional programming features, though.

OMeta is an interesting research language. It's sort of designed to allow programmers to write a new language to express each thing they want to implement. Need Perl? Write it in OMeta instead of assembly or C.
 
Haskell is neat. Learning it in my free time right now. OCaml, though, seems to be even more powerful, albeit less strict. A lot of languages are implementing functional programming features, though.

OMeta is an interesting research language. It's sort of designed to allow programmers to write a new language to express each thing they want to implement. Need Perl? Write it in OMeta instead of assembly or C.

Not sure I'd call OCaml more powerful than Haskell, but if you like OCaml, then definitely try F#. It's a much better version of OCaml.
 

Nesotenso

Member
You're forgetting 0. While 32 bits give you a sequence of 2^32 unsigned integers, that sequence starts at 0. So the max int is 1 less than you expected. It cannot be expressed as a multiple of 2 because it isn't an even number. You can't even get to it by subtracting 1 from your computation above (though that would be a valid mathematical identity) because of the same intervening overflow when using 32-bit unsigned arithmetic.

Yeah thanks for that. Forgot about zero.

So the leftmost bit is always the sign; is that why, even if you subtract one, you get an error, because of the carry into the last bit?
 
Did anyone here take an Operating Systems class? I'm supposed to implement malloc and free in C with sbrk, and mmap is not allowed. I have a linked list idea in mind (for the free list) but I'm not quite sure how to make malloc or free work. I've done the readings for the assignment but I'm still hesitant to code anything. I failed this class last semester so redoing it is a bit daunting to me. If anyone could give any pointers, that would be great.
 

injurai

Banned
Did anyone here take an Operating Systems class? I'm supposed to implement malloc and free in C with sbrk, and mmap is not allowed. I have a linked list idea in mind (for the free list) but I'm not quite sure how to make malloc or free work. I've done the readings for the assignment but I'm still hesitant to code anything. I failed this class last semester so redoing it is a bit daunting to me. If anyone could give any pointers, that would be great.

hehe

The way malloc and calloc should be viewed is essentially like arrays. You can make them arrays of any of the primitive types.

Code:
int arr[10] = {0};
This is essentially syntactic sugar. However, it is created on the stack, as opposed to memory from calloc and malloc. This memory goes out of scope automatically, so you don't have to manage memory on the stack.

Malloc and calloc are almost the same thing. The functional difference between them is that calloc initializes the memory to 0, while malloc leaves it uninitialized. You can zero malloc'd memory yourself with memset, but that can end up setting everything to 0 twice; calloc is the dedicated call to achieve that effect. Because malloc and calloc put memory on the heap, the memory will persist until the program crashes, exits, or you manually free it.

Code:
int* ptr = (int *) malloc (MAXELEMS * sizeof(int));

Here we get, functionally, an array of integers. Malloc returns type (void *) so we cast it to type (int *). The argument is the size of the memory we are allocating. We have to do a bit of math to get this number, but thankfully sizeof makes that easy: sizeof(type) returns the size of that type, and we multiply it by the number of elements we want. Casting to (int *) is what allows us to use pointer arithmetic to access each element.

Code:
int* ptr = (int *) malloc (MAXELEMS * sizeof(int));

int* pos = ptr;
int i;

for (i = 0; i < MAXELEMS; i++) {
    printf("Element %i is %i\n", i, *pos);  /* values are uninitialized garbage with malloc */
    pos++;
}

So every time we increment pos by 1, the compiler actually increments it by sizeof(int), which keeps our position in the array.

Calloc works exactly the same, but it makes the arguments less messy.

Code:
char** ptr = (char **) calloc (MAXELEMS, sizeof(char*));

Notice here I'm making an array of strings. So in either malloc or calloc you cast to a pointer of the type that the array is storing internally.

Once we know we no longer need the array, we should free the memory. It's good programming practice even if we know our machine has plenty of memory.

This call is merely

Code:
free(ptr);

That's really all there is to it. You can also use this with structs.

Code:
struct link {
    int x;
    char y;
    struct link* next;
};

struct link* start = (struct link *) malloc (1 * sizeof(struct link));

I tested none of this code, so feel free to ask questions, use your own judgement, and look up the concepts I covered.
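
Since the actual assignment is to implement malloc and free on top of sbrk, here's a very rough sketch of the usual header-plus-free-list shape. The names (my_malloc, my_free, block_header) are made up, it's untested like the rest, and it skips alignment, block splitting, and coalescing, so treat it as a study aid rather than a solution:

Code:
#include <stddef.h>   /* size_t */
#include <unistd.h>   /* sbrk */

/* Every allocation is preceded by a header that records its size
   and links blocks together for reuse. */
typedef struct block_header {
    size_t size;                /* payload size in bytes */
    struct block_header *next;  /* next block in the list */
    int free;                   /* 1 if available for reuse */
} block_header;

static block_header *head = NULL;

void *my_malloc(size_t size)
{
    /* First fit: reuse the first free block that's big enough. */
    for (block_header *b = head; b != NULL; b = b->next) {
        if (b->free && b->size >= size) {
            b->free = 0;
            return (void *)(b + 1);  /* payload starts right after the header */
        }
    }

    /* Nothing reusable: grow the heap. */
    block_header *b = sbrk(sizeof(block_header) + size);
    if (b == (void *)-1)
        return NULL;
    b->size = size;
    b->free = 0;
    b->next = head;
    head = b;
    return (void *)(b + 1);
}

void my_free(void *ptr)
{
    if (ptr == NULL)
        return;
    /* Step back from the payload to its header and mark it reusable. */
    block_header *b = (block_header *)ptr - 1;
    b->free = 1;
}

A real allocator would also split oversized blocks, merge adjacent free ones, and keep payloads aligned, but this should be enough to see how malloc and free find each other's bookkeeping.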
 
Yeah thanks for that. Forgot about zero.

So the leftmost bit is always the sign; is that why, even if you subtract one, you get an error, because of the carry into the last bit?

Not sure I agree with this. If you add 1 to UINT_MAX you get 0, then if you subtract 1 you get UINT_MAX again. What part doesn't work?
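
A minimal sketch of that wrap-around (again assuming a 32-bit unsigned int; unsigned overflow is well defined in C, the constant expression just draws a compiler warning):

Code:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    unsigned int x = 4u * 1024 * 1024 * 1024;  /* 2^32 wraps to 0, with a warning */

    printf("%u\n", x);              /* 0 */
    printf("%u\n", x - 1u);         /* wraps back to 4294967295 == UINT_MAX */
    printf("%u\n", UINT_MAX + 1u);  /* 0 again */
    return 0;
}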
 
Yeah thanks for that. Forgot about zero.

So the leftmost bit is always the sign; is that why, even if you subtract one, you get an error, because of the carry into the last bit?

No, the sign only plays a part in signed arithmetic.

This is about unsigned arithmetic. The same principles apply, though, because if you have a number system and define its limit with reference to out-of-bound values, you cannot use the number system to compute the bounds.
 