
Programming |OT| C is better than C++! No, C++ is better than C

What? Somebody really used operator. for that?

Not yet, because C++ doesn't allow overloading of the . operator yet. You can overload the arrow operator, but not the dot operator. What I'm saying is that proposals are under consideration to allow overloading the . operator as well.

The motive is not dot products, but I mean, come on. You know people are going to use it for that if it actually makes it into the language. ;-)
 

Koren

Member
Not yet, because C++ doesn't allow overloading of the . operator yet. You can overload the arrow operator, but not the dot operator. What I'm saying is that proposals are under consideration to allow overloading the . operator as well.

The motive is not dot products, but I mean, come on. You know people are going to use it for that if it actually makes it into the language. ;-)
I see...

I never ever thought I could use . overloading, so I didn't remember they haven't gone that far.

Actually, the more I think about it, the dirtier I imagine it could be...
 

Makai

Member
hugocésar said:
Hey everyone, not sure if this was posted already. But there's this bundle/sale going on. I'd like to give coding a shot, not sure in which format, but just in general to see if it's something for me.

https://deals.toucharcade.com/sales...e-bundle-1adc78b4-aae9-4bc9-bed4-f2f71bff233f

Is this a legit, good deal and place to start? Excuse my ignorance.
You can get everything you need for free. Just pick a language and start dicking around with it.

Websites -> HTML, CSS, and Javascript (in that order)
Games -> C# and Unity
Other -> Python
 
Cool, awesome! I'm not sure what I'd like to do yet, I very briefly tried CSS a couple weeks back and it seemed really cool. And I really like web design as well, so I'll look into it. Thanks folks.
 
It is definitely weird because nobody does it, but I like the idea of turning what is usually a special case of a function into an actual function. That's probably why so many languages have operator overloading: people want to think of operators the same way as functions, but because operators rarely change meaning, it's too easy to expect them to be static.
This is basically the motivation behind the Scheme language. Unify as many concepts as possible and build larger concepts over those smaller, orthogonal ones. This is how R6RS Scheme (the 2007 standard) has just a handful of keywords and builds the rest of the language over macros and functions. But as Makai was saying, anyone who has ever used either Lisp or Cobol knows how wonderful syntax really is.

I guess another idea would be the reverse: users defining their own operators. A good Unicode editor could probably make this work better, as you'd have actual symbols to work with.
ML family languages let you write your own operators, and this is used to great effect. As long as you tell somebody what a symbol means, there shouldn't be any confusion. After all, how many people here who don't do cryptography or embedded really remember that ^ is bitwise XOR?

It's really nice to just think of operators as "functions with names that have only special characters and go between their arguments, not in front of them". Then it makes sense to import operators, overload them for different types, make them abstract members of interfaces, pass them as arguments to other functions...
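To make that concrete, here's a rough sketch in Scala (the Vec type and all the names are made up for illustration, not from any library), where an operator is literally just a method you define, overload for your own type, and pass around like any other function:

Code:
// A tiny vector type with its own + operator and a named dot method.
case class Vec(x: Double, y: Double) {
  def +(other: Vec): Vec = Vec(x + other.x, y + other.y) // overload + for Vec
  def dot(other: Vec): Double = x * other.x + y * other.y
}

object OperatorDemo {
  def main(args: Array[String]): Unit = {
    // The operator is just a function value we can hand to reduce.
    val add: (Vec, Vec) => Vec = _ + _
    val total = List(Vec(1, 2), Vec(3, 4), Vec(5, 6)).reduce(add)
    println(total)                   // Vec(9.0,12.0)
    println(Vec(1, 0) dot Vec(0, 1)) // 0.0
  }
}

Same idea as the ML languages, just with Scala's "operators are methods" spin.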
 

Koren

Member
After all, how many people here who don't do cryptography or embedded really remember that ^ is bitwise XOR?
Curious about the answer... I'd say it's common knowledge, but at the same time, I may be a bad judge since I still sometimes use
Code:
a^=b^=a^=b;
just to spare a temp variable ^_^

I looooooove Scala.
Really liking it too; it should become my main scripting language. And that comes from someone who doesn't like the JVM at all...
 
I looooooove Scala.

However, I don't really see it getting that much attention. It's cursed by the JVM. :(


It's getting tons of attention in the enterprise world though; highly concurrent requirements are definitely best handled in a functional style, and the JVM offers amazing performance on the back end. Just look at Twitter and LinkedIn.
 

Koren

Member
That's actually slower than using a temp :)
Not in terms of typing... It's at least 7 chars shorter. ^_^

Of course it is now; temp variables should be heavily optimizable these days (I'm even wondering whether you could ask the processor to swap virtual registers?). Still, it used to be faster in assembly; that's actually the reason I know the trick.

Edit: Just in case you worry, I'm not using
Code:
a^=a;
though, even if it was faster than a=0; Or at least, not outside of joke code...



I'm mostly still doing it from time to time (only for code I don't share) because:
- I'm loving it
- It's slightly easier for me to recognize, and you can often accept a slower solution for higher readability... OK, it's a pitiful excuse... in proper code, I would write e.g. an inline void swap(...), and swap(a, b) is the best solution for readability, and also the most efficient one (and yes, I know inline is a wish, not a request). The only "true" reason is...
- I'm loving it
 

Kickz

Member
Just started my first soft dev job on Monday as a jr .Net developer.
Still in the early days of getting trained on their existing code stack. One thing that has been bugging me is that instead of having me shadow their senior .Net guy, they are making me work with their database guy and having me learn how to stage database migrations. Kinda seems like what a database dev would do or something. I'm afraid I might get stuck in a "database role" and not get to do much development using C# on their in-house webapp...

Any suggestions on this? Should I bring this up to the boss?
 

Massa

Member
Any suggestions on this? Should I bring this up to the boss?

Hard to say from the outside, but as a general rule programs exist to take data and transform it. So it could be normal that they want you to get a good grip of their data to start with.
 
Need some advice if anyone can help.

Using Java, and I have a list of numbers (doubles) I'm using in an array initializer. Is there a way to name each index instead of 0, 1, 2, 3...? I need to make them months of the year.
 

Kickz

Member
Hard to say from the outside, but as a general rule programs exist to take data and transform it. So it could be normal that they want you to get a good grip of their data to start with.
I guess I'll just hang with it then until they hopefully rotate me to something else. Nothing against database work, but I find it boring.
 
I looooooove Scala.

However, I don't really see it getting that much attention. It's cursed by the JVM. :(

What's wrong with the JVM? Lots of places use Java already and that makes going to Scala very easy, compared to switching to another runtime. Besides, the JVM has one of the best JIT compilers around, good Java code that inlines properly can compete with optimized C code. One thing that always makes me sad when using a JVM language is that you can't use primitive types for generic containers (List<int> instead of List<Integer>) but using raw arrays also sucks. I guess I've been spoiled by Rust and C++ in this regard. Having things laid out in a linear array in memory is great for performance. (if you access memory in a predictable pattern, the CPU's prefetcher can load lots of data into the cache at the same time, which makes things like array access about an order of magnitude faster than if you have to get it from RAM)
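A tiny sketch of the layout difference I mean, in Scala since it's the same JVM underneath (names made up):

Code:
object BoxingDemo {
  def main(args: Array[String]): Unit = {
    // Array[Int] compiles to a primitive int[]: one contiguous block of ints,
    // which is exactly what the prefetcher likes.
    val primitives: Array[Int] = Array.tabulate(1000)(i => i)

    // List[Int] boxes every element as a java.lang.Integer on the heap, so
    // summing it means chasing pointers instead of streaming linear memory.
    val boxed: List[Int] = List.tabulate(1000)(i => i)

    println(primitives.sum)
    println(boxed.sum)
  }
}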
 

Koren

Member
What's wrong with the JVM?
Mostly, for me, it's a mess to manage. You have to install a JVM alongside the program itself, and keep track of how many different JVMs are installed and which one is used by each application. And it's not "just install one and forget about it", since JVM bugs are a common target for malicious attackers.

I simply prefer working with native code if I can choose, even if that means several cross-compilations. That won't prevent me from using the JVM, but I don't enjoy the administration tasks that follow.

Besides, the JVM has one of the best JIT compilers around
"The" JVM? Which one, Hotspot?

About speed, a good JIT compiler will make Java compete with normally written C for many tasks (especially repetitive ones), but I'm reluctant to use comparisons with optimized C. That's usually another matter, whether it's optimized by the user or by the compiler.

C compilers can spend as much time as they want optimizing the code. I've seen compilers that, when given source code containing a bubble sort, compile it as a quicksort. It's a bit extreme, and it takes a lot of clever static analysis of the code, but it gives you an idea of what a normal compiler can do that a JIT one can't...
 

zeemumu

Member
I've spent the past few weeks working on this assignment. I had to make an array-based game in Java. There are a lot of booleans and whatnot but I think I finally got all the bugs out (Player randomly turning into an enemy, Enemies flipping tiles to their true value whenever they move, etc.).

By the way, does anyone know of a decent small laptop that's good for programming work? I do most of my programming on a gaming laptop which works fine but it's a little heavy and awkward to carry around. I don't know non-gaming laptop specs.
 
I've spent the past few weeks working on this assignment. I had to make an array-based game in Java. There are a lot of booleans and whatnot but I think I finally got all the bugs out (Player randomly turning into an enemy, Enemies flipping tiles to their true value whenever they move, etc.).

By the way, does anyone know of a decent small laptop that's good for programming work? I do most of my programming on a gaming laptop which works fine but it's a little heavy and awkward to carry around. I don't know non-gaming laptop specs.

Depends on how graphically intensive your work is. If it's just to write code, most modern laptops would do.
I am using a (bargain) mac book air, which is great for portability, writing stuff and surfing the web, but the screen size is a bit of a problem sometimes (it's 11").
As always, when choosing a PC, you need to balance funds with needs.
 

zeemumu

Member
Depends on how graphically intensive your work is. If it's just to write code, most modern laptops would do.
I am using a (bargain) mac book air, which is great for portability, writing stuff and surfing the web, but the screen size is a bit of a problem sometimes (it's 11").
As always, when choosing a PC, you need to balance funds with needs.

I'm thinking somewhere in the $800 range, although I wouldn't be TOO opposed to getting a mac book air. Almost got one a few months ago. All it'll be doing is coding, essays and powerpoints, and allowing me to goof off when I feel like it. And even though it's not a gaming laptop I'd like it to be able to at least run something like WoW or Minecraft with minimal issue. Not a requirement but it'd be nice.

Edit: Program's up and running. My only problem is that, while I can get the program to save the array board by serializing the custom object that it's filled with, for some reason it completely fucks up the controls because it only copies over the image of it and not the position of the objects.
 

NotBacon

Member
I've seen compilers that, when given source code containing a bubble sort, compile it as a quicksort.

Holy damn

By the way, does anyone know of a decent small laptop that's good for programming work? I do most of my programming on a gaming laptop which works fine but it's a little heavy and awkward to carry around. I don't know non-gaming laptop specs.

I'm happy with the Dell XPS 13
 

Koren

Member
Holy damn
Just to be clear: it was an experimental compiler, nowhere close to a public product. And in a specific case. It's just that "real life" optimization examples are less obvious to grasp, but the benefits of having time to do automatic optimization are there. Even if it's to rewrite the code to optimize code cache invalidations in a long pipeline because of a badly predicted jump.

Many researchers are trying to design compilers that offer an -O4 flag or better. If you can predict the result of a function for all inputs, for example, you can rewrite it completely differently...

Let's take an example, the McCarthy 91 function:
Code:
let rec f x = if x > 100 then (x-10) else f(f(x+11));;

If x>100, there's no problem, the answer is immediate.

For lower values, especially big negative ones, it can take a very long time.

But you can analyse the function, and see that for 90 <= x <= 100, f(x) = f(x+1). Thus, for those values of x, f(x) = 91.

Then, for values 79 <= x <= 89, f(x) = f(f(x+11)) = f(91) = 91.

Same for 68 <= x <= 78. Then, obviously, for any x below.

Thus, you can rewrite the function by
Code:
let rec f x = if x > 100 then (x-10) else 91;;

Which is insanely more efficient.

Even if this reasoning isn't that hard for a human, it's difficult to program, and optimization is an insanely hard task. But sometimes, it finds efficient solutions. It's just that it's something JIT compilers can't do, because they don't have time for it.
 

Two Words

Member
I'm thinking somewhere in the $800 range, although I wouldn't be TOO opposed to getting a mac book air. Almost got one a few months ago. All it'll be doing is coding, essays and powerpoints, and allowing me to goof off when I feel like it. And even though it's not a gaming laptop I'd like it to be able to at least run something like WoW or Minecraft with minimal issue. Not a requirement but it'd be nice.

Edit: Program's up and running. My only problem is that, while I can get the program to save the array board by serializing the custom object that it's filled with, for some reason it completely fucks up the controls because it only copies over the image of it and not the position of the objects.
Whatever you do, get a laptop with an SSD.
 

Somnid

Member
Even if this reasoning isn't that hard for a human, it's difficult to program, and optimization is an insanely hard task. But sometimes, it finds efficient solutions. It's just that it's something JIT compilers can't do, because they don't have time for it.

Not quite true. It won't on first run but good ones will go back over and start doing heavier optimization on functions prioritized by usage. Ideally there is no difference over a long running time.
 

Granadier

Is currently on Stage 1: Denial regarding the service game future
I have my first real technical interview in Boston on Tuesday. It's my third interview with the company, and they are flying me out for the day.

Do you all have any advice for preparing and the interview itself?
 
I have my first real technical interview in Boston on Tuesday. It's my third interview with the company, and they are flying me out for the day.

Do you all have any advice for preparing and the interview itself?

Most algorithmic / programming problems that you will encounter in an interview have solutions of different "qualities". There's the brute-force solution that will have terrible performance, there might be a slightly better one that they expect most qualified candidates to be able to arrive at, and then there's occasionally an amazing solution that many people never find (but that they don't expect you to).

Always explain the brute-force solution immediately, along with its complexity, then mention that you're going to set it aside since there's clearly a better algorithm. If, at any point, you are running out of time and not making substantial progress toward a better algorithm, ask the interviewer if they want you to just code up the brute-force solution on the whiteboard.

It's better to solve the problem with a subpar answer than to not solve it at all.
 

upandaway

Member
Looking for some general advice if anyone wants to help:

I want to start thinking about the sort of direction I'll be going in. I've heard from lots of people (mostly academic people, though) that being able to find work later in life requires going either into management or into something highly technical or academic. I don't want management at all, so the paths they told me about were algorithms, embedded, cyber security, robotics, and machine learning.

I like to think that my grades/maths are good enough to attempt going in that direction, but I don't really know anything about any of this. Does anyone have any advice/experience to offer?
 
Looking for some general advice if anyone wants to help:

I want to start thinking about the sort of direction I'll be going in. I've heard from lots of people (mostly academic people, though) that being able to find work later in life requires going either into management or into something highly technical or academic. I don't want management at all, so the paths they told me about were algorithms, embedded, cyber security, robotics, and machine learning.

I like to think that my grades/maths are good enough to attempt going in that direction, but I don't really know anything about any of this. Does anyone have any advice/experience to offer?

You'd hit the ceiling within a company pretty fast if you are just a programmer. The only way up after senior is usually a management position. But there are always higher tech companies who'll hire you.
 
You'd hit the ceiling within a company pretty fast if you are just a programmer. The only way up after senior is usually a management position. But there are always higher tech companies who'll hire you.

Definitely depends on the company. Most of the larger tech companies have very very high tech ladders that are entirely engineering
 

Ambitious

Member
Java Question: I have an abstract Robot class and several subclasses which implement different functionality. I need to be able to arbitrarily start any kind of robot, and I'm not quite sure about how to do this.

Currently, I just have a main method in each of the subclasses. They all look the same: Verify the arguments, create an instance, supply the arguments to the init() method and call start(). Surely, there has to be a better way?
I could put the main into the superclass, but how would I figure out which subclass to instantiate, then?

The project uses Maven, so if this can be done by using some kind of Maven goal or configuration, that's an option too.
 

Makai

Member
Java Question: I have an abstract Robot class and several subclasses which implement different functionality. I need to be able to arbitrarily start any kind of robot, and I'm not quite sure about how to do this.

Currently, I just have a main method in each of the subclasses. They all look the same: Verify the arguments, create an instance, supply the arguments to the init() method and call start(). Surely, there has to be a better way?
I could put the main into the superclass, but how would I figure out which subclass to instantiate, then?

The project uses Maven, so if this can be done by using some kind of Maven goal or configuration, that's an option too.
In the base class:
Make a generic method which takes the type as a generic parameter and does all of that stuff.
Make an abstract method

In the child class:
Override the abstract method and call the generic method with the child class' type

If init() or start() are defined in child classes, you should put those in an interface or make them abstract methods in the base class.
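Roughly this shape, sketched in Scala since that's what I have open (the Java version is analogous; the robot names are made up, and I've folded the per-subclass mains into one launcher that picks the subclass by name, which is a slight variation on the above):

Code:
abstract class Robot {
  def init(args: Array[String]): Unit
  def start(): Unit

  // The launch logic every subclass main used to duplicate: validate, init, start.
  final def launch(args: Array[String]): Unit = {
    // argument validation common to all robots would go here
    init(args)
    start()
  }
}

class VacuumRobot extends Robot {
  def init(args: Array[String]): Unit = println(s"vacuum init: ${args.mkString(" ")}")
  def start(): Unit = println("vacuuming...")
}

class WelderRobot extends Robot {
  def init(args: Array[String]): Unit = println(s"welder init: ${args.mkString(" ")}")
  def start(): Unit = println("welding...")
}

// One entry point instead of one main per subclass.
object RobotMain {
  def main(args: Array[String]): Unit = {
    val robot: Robot = args.headOption match {
      case Some("vacuum") => new VacuumRobot
      case Some("welder") => new WelderRobot
      case other          => sys.error(s"unknown robot: $other")
    }
    robot.launch(args.drop(1))
  }
}

If you don't want the launcher to know every subclass, a reflection-based lookup (Class.forName on the first argument) gets you the same effect, but the explicit match keeps it obvious.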
 

Slavik81

Member
Definitely depends on the company. Most of the larger tech companies have very very high tech ladders that are entirely engineering
They do, but my impression is that this basically means that Ken Thompson could take the technical track and be equivalent to a top-tier manager. Which is great, but you have a much better chance of becoming one of those nameless managers he's equivalent to than of somehow matching the technical achievements involved in designing significant portions of Unix.

2nd year undergrad, I know I'm probably a while off but there's choosing courses etc (plus curious to know more in general)
Unless you're aiming for something specific, I wouldn't worry about your later career until you at least have a job.

My career plan was:
1. Get degree
2. Get experience
3. Get MBA / Master's
4. Look for specialist / team lead positions

I can tell you that steps 1 and 2 worked out great. We'll see how it goes from there.
 
They do, but my impression is that this basically means that Ken Thompson could take the technical track and be equivalent to a top-tier manager. Which is great, but you have a much better chance of becoming one of those nameless managers he's equivalent to than of somehow matching the technical achievements involved in designing significant portions of Unix.

Nameless or famous shouldn't really be a part of it. Rather, what you should be concerned about is how many people you impact with your decisions. Obviously the higher you go up the ladder the number of people at that level begins dropping exponentially, but that doesn't mean you have to have name recognition or be famous to get there.

I know of plenty of colleagues who are nameless to the outside world but whose technical accomplishments are amazing, and whose work directly impacts an astronomical number of people. They started just like everyone else, but they found the right opportunities internally, got the right experience, and delivered. They do zero actual managing of anything, they are just very very senior engineers who the leaders go to with new product ideas.

I'm not disagreeing with you if we're talking about averages, for the simple fact that most companies don't have this kind of organizational structure or even products to make this kind of impact possible in the first place. But my point is just that it's possible at the right company. So if this kind of thing is important to you, just make it happen.
 
After messing around with Haskell for a few weeks, it's amazing how easy Python is to work with. :lol

And it's given me a newfound appreciation for the quality of the Python ecosystem. I love functional programming and F# is probably my favourite language, but Python has so many libraries for whatever problem you can think of that it's kinda hard not to use it. F# at least has the .Net world to tap into, but Haskell lacks too much stuff. Still a great language for helping you understand functional programming, though.

I do miss my type system when working with Python; it just feels right to have the compiler yell at me for doing stupid shit like trying to return a string when I should return an int.
 
How the heck would I figure out programmatically if there's an even number of bits "activated" (00100010) in C++? I'm racking my brain but it's coming up with some pretty inefficient solutions.
 
How the heck would I figure out programmatically if there's an even number of bits "activated" (00100010) in C++? I'm racking my brain but it's coming up with some pretty inefficient solutions.

First you need to count the number of 1 bits.

lolsledgehammer approach: Use the popcnt intrinsic. There is literally an assembly language instruction that will compute this for you.

MSVC: __popcnt

GCC, Clang: __builtin_popcount

naive bit hacking approach: Count the number of 1 bits manually. Make a loop that runs while the number is not equal to 0: add the bottom-most bit, then shift right.

Code:
int popcnt(unsigned n)
{
    int cnt = 0;
    while (n)
    {
        cnt += n & 1;
        n >>= 1;
    }
    return cnt;
}

fastest manual implementation: Use a lookup table.

Code:
int popcnt(unsigned n)
{
    int cnt = 0;
    cnt += popcnt_table[n & 0xFF];
    cnt += popcnt_table[(n >> 8) & 0xFF];
    cnt += popcnt_table[(n >> 16) & 0xFF];
    cnt += popcnt_table[(n >> 24) & 0xFF];
    return cnt;
}

popcnt_table here is a lookup table that just contains, for each value from 0 - 255, the number of 1 bits set. You have to define this yourself. For example:

Code:
int popcnt_table[256] = {0, 1, 1, 2, 1, 2, 2, 3, /* ...and so on for all 256 byte values */};

The compiler might even vectorize this so it uses SIMD instructions on the lookup.

Once you have this function:

Code:
bool is_even_number_of_1_bits(unsigned n)
{
    return popcnt(n) % 2 == 0;
}
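And since half this thread is JVM talk: if you ever need the same check in Java or Scala, the standard library already has a popcount, java.lang.Integer.bitCount. A quick Scala sketch (names made up):

Code:
object ParityDemo {
  // Integer.bitCount is the JVM's built-in popcount (HotSpot can compile it
  // down to a hardware popcount instruction).
  def hasEvenNumberOfOneBits(n: Int): Boolean =
    Integer.bitCount(n) % 2 == 0

  def main(args: Array[String]): Unit = {
    println(hasEvenNumberOfOneBits(0x22)) // 00100010 -> two 1 bits -> true
    println(hasEvenNumberOfOneBits(0x07)) // three 1 bits -> false
  }
}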
 

TheSeks

Blinded by the luminous glory that is David Bowie's physical manifestation.
Alright, this whole function/object stuff in this JavaScript course is doing my head in.

I understand that you can make "data" types that have information stored in them, and they're "objects".

EG:

Code:
var Object = {
  name: "blah",
  number: 500,
  bool: true
}

So, I'm starting Introduction to Objects II and am on 11 after struggling for the "review" bit earlier today.

Code:
function Person(name,age) {
  this.name = name;
  this.age = age;
}

//Let's make bob again, using our constructor
var bob = new Person("Bob Smith", 30);
var susan = new Person("Susan Jordan", 35);

//make your own class here

This is where I'm getting lost.

They want you to make a class "Circle" by building a constructor for it. The constructor for Circle should have one property "radius" and take one argument for the initial radius.

Code:
function circle(radius) {
   this.radius = radius;
}

var calculation = new circle(60);

"Oops, try again. It looks like you haven't properly defined the Circle constructor. Look at the Person constructor as a guide."

I'm not sure I'm understanding the constructor problem. "new" as a keyword is supposed to let you make a new constructor, yeah? That constructor is an "object" (hence "new Object" for filling data into)?

My function copies the same thing as the tutorial with taking the argument of radius to set the radius.

The variable calculation will make a new object named "circle" that is a function that takes the 60 argument as its radius, yeah? Unless I'm misunderstanding?
 

Saprol

Member
Maybe your homework software is being picky and wants it capitalized.

Oh, it's Codecademy; let me look. Yeah, it's a capital C.
 

TheSeks

Blinded by the luminous glory that is David Bowie's physical manifestation.
Is that the normal convention for Objects? Capitalizing them to differentiate them from variables?
 