
Programming |OT| C is better than C++! No, C++ is better than C

MNC

Member
Need to learn some C for work in just a few days. I had to follow a crash course in C this past weekend; anything in particular I should learn about C? Any required reading? I only followed a fundamentals course, which sadly spent 50% of its time on basic programming I don't need.
 

Slavik81

Member
I'm trying to use Python's difflib to get readable output, like GNU diff. It seems impossible.

Just trying to print unified_diff's output via sys.stdout.writelines gives me:
Code:
--- output.txt
+++ expected_1.txt
@@ -1,6 +1,7 @@
 a r c+  c a r@@ -12,6 +13,7 @@
 d o g+  g o d@@ -20,10 +22,12 @@
 o t s+  s t o p+  t o p

Which is unreadable as compared to:
Code:
--- expected_1.txt	2015-02-19 19:26:09.258190331 -0700
+++ expected_2.txt	2015-02-22 18:16:59.765112079 -0700
@@ -1,4 +1,4 @@
-arc car
-bed
-dog god
-pots stop tops
+ehsypv hpesvy ysepvh
+gzuonx uozgnx uzngxo xznoug
+wounxppzu zpwuonxpu
+xrqryptcb
(not the same diff, but you get the point)

I threw away my diff code and just called out to diff via subprocess, but I'd like to support Windows if it's easy.
 
I'm trying to use Python's difflib to get readable output, like GNU diff. It seems impossible.

Just trying to print unified_diff's output via sys.stdout.writelines gives me:
Code:
--- output.txt
+++ expected_1.txt
@@ -1,6 +1,7 @@
 a r c+  c a r@@ -12,6 +13,7 @@
 d o g+  g o d@@ -20,10 +22,12 @@
 o t s+  s t o p+  t o p

Which is unreadable as compared to:
Code:
--- expected_1.txt	2015-02-19 19:26:09.258190331 -0700
+++ expected_2.txt	2015-02-22 18:16:59.765112079 -0700
@@ -1,4 +1,4 @@
-arc car
-bed
-dog god
-pots stop tops
+ehsypv hpesvy ysepvh
+gzuonx uozgnx uzngxo xznoug
+wounxppzu zpwuonxpu
+xrqryptcb
(not the same diff, but you get the point)

I threw away my diff code and just called out to diff via subprocess, but I'd like to support Windows if it's easy.

Is it having a problem with newlines? Try writing to a file instead of to sys.stdout, then run unix2dos on it and see if it looks better.

OTOH, if this doesn't work on any platform, then I'm not sure. I'd be surprised if the library was just flat-out busted, though; seems like something for Stack Overflow.
 
Need to learn some C for work in just a few days. I had to follow a crash course in C this past weekend; anything in particular I should learn about C? Any required reading? I only followed a fundamentals course, which sadly spent 50% of its time on basic programming I don't need.
Recommending blindly, I think Zed Shaw's WIP Learn C the Hard Way covers many of the essentials well, including the usage of wonderful debugging tools like Valgrind. The one catch is that it has a UNIX slant, but if you're planning on writing native code for Windows, you're generally better off working with C++ and Microsoft's development tools than dealing with Microsoft's dated C implementation and libraries, or with a UNIX-like compatibility layer like Cygwin or MinGW.

If you're at liberty to share, it would be better to have a clear idea of what domain you're trying to approach with C, because C's the best choice for a very specific, high-performance subdomain of application programming, and for OS kernel programming. If you're interested in going "slightly faster than scripted languages", there are other alternatives out there that might be less thorny in the long run.
 

Slavik81

Member
Is it having a problem with newlines? Try writing to a file instead of to sys.stdout, then run unix2dos on it and see if it looks better.

OTOH, if this doesn't work on any platform, then I'm not sure. I'd be surprised if the library was just flat-out busted, though; seems like something for Stack Overflow.
It doesn't work on Linux, either, unfortunately. Here's the code to reproduce it:
Code:
#!/usr/bin/env python
import sys
import difflib

output='''
arccar
bed
doggod
potsstoptops
'''

expected='''
arc car
bed
dog god
pots stop tops
'''
  
diff = difflib.unified_diff(output, expected, 'output.txt', 'expected_1.txt')
sys.stdout.writelines(diff)

EDIT: fixed it. difflib will accept a plain string as input, but then it diffs it character by character rather than line by line. To get reasonable output, you actually need to give it a list of strings, each ending with a newline (str.splitlines(True) keeps the line endings).
Code:
#!/usr/bin/env python
import sys
import difflib

output='''
arccar
bed
doggod
potsstoptops
'''

expected='''
arc car
bed
dog god
pots stop tops
'''
  
diff = difflib.unified_diff(
  output.splitlines(True), expected.splitlines(True),
  'output.txt', 'expected_1.txt')
sys.stdout.writelines(diff)
 
I don't think it's necessary to keep notes of this sort. Learn to use man pages or -- if you're on Windows -- MSDN. Usually you can find what you're looking for if you remember even a single related function. For example, suppose you're trying to remember which function searches a string for a substring, but you do remember that strlen() computes the length. Search for that, pick your favorite site from the results (good ones are MSDN, cppreference.com, and cplusplus.com), and you will find a list of related functions, one of which is strstr(), the function you're looking for.

So you really just have to remember the basics, and learn how to find the rest.
I completely agree with this, and I should also add one more thing.

man pages, in particular, are really wonderful if you're trying to code against a specific UNIX-like OS. Mac OS X, Linux, and FreeBSD all have their own documentation written against their own implementations of the C standard library, plus several kernel services like kqueue (OS X/FreeBSD) and epoll (Linux).

I've already referenced Zed Shaw's Learn C the Hard Way in the previous reply, and he does teach how to use man pages in a command line crash course that's listed as a prerequisite for the book. They can be some of the best documentation out there.
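To make that concrete, here's a minimal, hedged C++ sketch of the strstr() lookup described above. The strings are made up for illustration; the function itself behaves as documented in man 3 strstr:
Code:
#include <cstring>
#include <iostream>

int main() {
    const char *haystack = "find the needle in the haystack";

    // std::strstr returns a pointer to the first occurrence of the
    // substring, or a null pointer if there is none.
    const char *hit = std::strstr(haystack, "needle");
    if (hit)
        std::cout << "found at offset " << (hit - haystack) << '\n';
    return 0;
}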
 

Fou-Lu

Member
I am currently taking a first-year programming course, as it is required for me to graduate, which I realized almost too late.

The course is for Java and I am finding myself really enjoying it and following along very well, but I have heard that Java is not a very good language. Why is that? I think I'd like to continue with programming after this course, so where should I go from here? (I'd like to be able to program some simple game ideas eventually as well as to be able to use programming for my work as a physicist).
 
The course is for Java and I am finding myself really enjoying it and following along very well, but I have heard that Java is not a very good language. Why is that? I think I'd like to continue with programming after this course, so where should I go from here? (I'd like to be able to program some simple game ideas eventually as well as to be able to use programming for my work as a physicist).
It's not so much whether Java is good as what Java is good for. This is my understanding of how Java got to where it is today.


Let me set aside the general tenor of startup-focused internet programming communities and how they don't like Java because it's perceived to be old and slow, and Oracle owns the IP and controls development behind it. It's better to explain how it got here.


It was adopted by IBM at a very critical juncture in the mid-90s, when they were deciding what programming language and environment to push for less experienced programmers in IT environments, something that wasn't as hard to use as C, C++, and friends. For that reason, Java's been a pretty critical mainstay of Fortune 500 IT departments and other "enterprise" domains, government software, that kind of thing. And most development around the language has focused on appealing to those customers.

Sun and IBM were major developers of software and mainframe solutions, and Java was widely adopted by their customers, because at the time they were also pushing to provide a very high level of support for their Fortune 500 customers. Oracle does this as well. And when considering the future direction of the Java language, they've listened to those customers first and foremost.


So Java is very good about garbage collection, preventing the long collection pauses that can show up in JavaScript and other languages with less sophisticated garbage collectors. But it's also not fine-tuned out of the box for real-time performance, and its legacy as a "better C++" with a focus on Simula-like objects in terms of language design makes it a bit inflexible in some ways compared to Python and other scripting languages, and even compared to C's function abstraction to some degree.


I don't consider "knowing Java" to be a handicap, but I do think at some point it would be a good idea to learn other programming languages. Scripting languages like Python and Ruby and JavaScript (via Node.js) are popular in startups for server-side applications. C++ is popular for certain kinds of games programming, but garbage-collected, "managed" languages like C# (via Unity) and even Java can work in a pinch if your needs for real-time graphics aren't so resource-intensive. Vanilla C is kind of trendy in some settings, but few people use it for applications programming; it tends to be more popular when you're dealing with kernel extensions and more to-the-metal forms of programming. And if you plan on doing any kind of front-end web programming, JavaScript with the same markup languages everybody else has to put up with is an absolute must.

And before I forget! Java can be used in mobile through native apps for Android, though you'll have to get used to the elaborate Android GUI frameworks and learn how to keep your app fast on what amount to less powerful computers by keeping garbage collection cycles short. There are new things to learn for every platform, even if you use the same language. iOS has a similar dependency on its Objective-C frameworks, and much of the time spent in learning the platform amounts to learning how to leverage that toolkit, not really the language.

Think of each language in terms of the domain that it's best suited for. If you'd like, pick up something suited for the kind of job you plan on getting someday, or for what interests you. If you'd like to stick with Fortune 500 and enterprise, Java right out of college is fine, and even for indie games Java's perfectly fine, though it lacks a game dev environment as rich as Unity to really propel it forward. Android pushed Java a little bit out of the enterprise, but I consider mobile development to be different enough in terms of what you have to look for that it's not something you'll jump into immediately.


There are challenges everywhere to be overcome. Pick and choose what you want to solve, see how it goes, and move on from there. You won't stop learning as a software developer, trust me.
 
ProgrammerGAF, quick question. As I'm not proficient or very programmer-minded, I'm having this odd problem in C# that I can't quite solve.
I have this code to get bytes from specific positions of a byte array to store as an int variable. However, if the index is 0, I get some random values, whereas if it's 1, I get the correct bytes starting from 1. And then of course position 4 works as intended, so I'm not sure what I did wrong between the two.
Code:
int var0x00 = BitConverter.ToInt32(MyByteArray, 0x00);
var0x00 = ReverseBytes(var0x00);

int var0x04 = BitConverter.ToInt32(MyByteArray, 0x04);
var0x04 = ReverseBytes(var0x04);
 
ProgrammerGAF, quick question. As I'm not proficient or very programmer-minded, I'm having this odd problem in C# that I can't quite solve.
I have this code to get bytes from specific positions of a byte array to store as an int variable. However, if the index is 0, I get some random values, whereas if it's 1, I get the correct bytes starting from 1. And then of course position 4 works as intended, so I'm not sure what I did wrong between the two.
Code:
int var0x00 = BitConverter.ToInt32(MyByteArray, 0x00);
var0x00 = ReverseBytes(var0x00);

int var0x04 = BitConverter.ToInt32(MyByteArray, 0x04);
var0x04 = ReverseBytes(var0x04);

There really isn't enough information in your post. For starters, missing here is the code that puts the data into the byte array. Almost certainly important. Secondly, it's a little strange that you're reversing the bytes. The whole point of BitConverter is so that you don't have to worry about endianness. If you didn't reverse the bytes putting it into the byte array, you shouldn't have to reverse them getting it out. And if you did reverse them putting it in, then why?
 

Chris R

Member
Use
Code:
BitConverter.IsLittleEndian
to check the endianness of the system your code is running on.

Also, I've tried the function on my machine and it worked just fine.

Code:
byte[] bArray = new byte[]{0x44,0x33,0x22,0x11, 0x00, 0x00};

int a = BitConverter.ToInt32(bArray, 0);
// a = 0x11223344 or 287454020

int b = BitConverter.ToInt32(bArray, 2);
// b = 0x00001122 or 4386
 
There really isn't enough information in your post. For starters, missing here is the code that puts the data into the byte array. Almost certainly important. Secondly, it's a little strange that you're reversing the bytes. The whole point of BitConverter is so that you don't have to worry about endianness. If you didn't reverse the bytes putting it into the byte array, you shouldn't have to reverse them getting it out. And if you did reverse them putting it in, then why?
I'm working with a byte array I copied from an emulator memory dump, so the endianness won't be the same as the system's. But I do expect the bytes to be in the order they were copied in.
Use
Code:
BitConverter.IsLittleEndian
to check the endianness of the system your code is running on.

Also, I've tried the function on my machine and it worked just fine.

Code:
byte[] bArray = new byte[]{0x44,0x33,0x22,0x11, 0x00, 0x00};

int a = BitConverter.ToInt32(bArray, 0);
// a = 0x11223344 or 287454020

int b = BitConverter.ToInt32(bArray, 2);
// b = 0x00001122 or 4386

It works with your array, even with the ReverseBytes thing I have. But when it's my array (0x34, 0x95, 0x91, 0x8A), the resulting value of int a wouldn't even be close after the flip. I'm starting to suspect: could it be because of signed and/or unsigned integers?
 

Chris R

Member
It works with your array, even with the ReverseBytes thing I have. But when it's my array (0x34, 0x95, 0x91, 0x8A), the resulting value of int a wouldn't even be close after the flip. I'm starting to suspect: could it be because of signed and/or unsigned integers?

0xA8195934 is larger than 0x7FFFFFFF (the largest value an Int32 can hold), so it does look like you are dealing with overflow.
 
I am currently taking a first-year programming course, as it is required for me to graduate, which I realized almost too late.

The course is for Java and I am finding myself really enjoying it and following along very well, but I have heard that Java is not a very good language. Why is that? I think I'd like to continue with programming after this course, so where should I go from here? (I'd like to be able to program some simple game ideas eventually as well as to be able to use programming for my work as a physicist).

Java has a bad reputation from a decade ago when it was awful and slow. My understanding is that it is a lot better these days (but I haven't been in a situation where I needed to use it). A friend of mine from college does a lot of embedded device programming in Java and he actually enjoys it quite a bit.
 

Nesotenso

Member
I have a question about pointers and multidimensional arrays.

Code:
int C[3][2][2];
int (*p)[2][2] = C;

If I print *C, it gives me the base address. But if I have a one-dimensional array like this

Code:
int a[] = {2,3,4,5};
int *q = a;

Printing *a gives me 2. Why are *C and *a valid for arrays and not for other data types? And why do I get the address of the first element in the multidimensional case?
 
Java has a bad reputation from a decade ago when it was awful and slow. My understanding is that it is a lot better these days (but I haven't been in a situation where I needed to use it). A friend of mine from college does a lot of embedded device programming in Java and he actually enjoys it quite a bit.
Computers and embedded devices have also gotten quite a bit faster since 1995. Managed languages like C#, Java, and Smalltalk didn't really become viable platforms for software until 33 MHz processors were a thing of the past. And by that time, Smalltalk didn't have any big corporate backers left.

Java was used in cell phone programming prior to Android, too... a decade ago, as you say. It wasn't fast and fluid on the low spec hardware it was running on, but it was good enough to make calls and play simple games on Motorola Razrs and the like.
 

Haly

One day I realized that sadness is just another word for not enough coffee.
I have a question about pointers and multidimensional arrays.

Printing *a gives me 2. Why are *C and *a valid for arrays and not for other data types? And why do I get the address of the first element in the multidimensional case?
Arrays aren't literally pointers, but in most expressions they decay to a pointer to their first element, which is why you can dereference them using *. When you initialize an array:
Code:
int a[] = {2,3,4,5};
a decays to a pointer to a block of memory whose first element is 2, so *a gives you 2.

Multidimensional arrays are arrays of arrays. When you dereference C, you get the first of the arrays that make up the second dimension, which, in your 3-dimensional case, decays in turn to a pointer to the first array comprising the third dimension. That's why printing *C shows an address.

If you want to see what's stored in C[0][0][0], you would need to triple dereference:
Code:
cout << ***C;
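To make the decay visible, here's a small, hedged sketch reusing the names from the question (the printed address will vary from run to run):
Code:
#include <iostream>

int main() {
    int C[3][2][2] = {};     // zero-initialized
    int a[] = {2, 3, 4, 5};

    std::cout << *a << '\n';   // 2: a decays to int*, so *a is the first int

    // *C is an int[2][2]; when printed, it decays again to a pointer,
    // so you see a base address rather than a value.
    std::cout << *C << '\n';

    std::cout << ***C << '\n'; // 0: dereferenced through every dimension
    return 0;
}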
 

Nesotenso

Member
Arrays aren't literally pointers, but in most expressions they decay to a pointer to their first element, which is why you can dereference them using *. When you initialize an array:
Code:
int a[] = {2,3,4,5};
a decays to a pointer to a block of memory whose first element is 2, so *a gives you 2.

Multidimensional arrays are arrays of arrays. When you dereference C, you get the first of the arrays that make up the second dimension, which, in your 3-dimensional case, decays in turn to a pointer to the first array comprising the third dimension. That's why printing *C shows an address.

If you want to see what's stored in C[0][0][0], you would need to triple dereference:
Code:
cout << ***C;

ok thanks that makes sense.
 
0xA8195934 is larger than 0x7FFFFFFF (the largest value an Int32 can hold), so it does look like you are dealing with overflow.

Endianness is only byte-order, not nibble order. {0x34, 0x95, 0x91, 0x8A} is 0x8A919534. Which is still larger than 0x7FFFFFFF, so ultimately you're still correct. Just be careful :D

Luckily this problem is easy to solve. Use BitConverter.ToUInt32.
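Since BitConverter.ToInt32 reads the bytes exactly like a raw memory load, the byte-order arithmetic is easy to check in any language. Here's a hedged C++ sketch using the bytes from the post (it assumes a little-endian machine, which is what BitConverter.IsLittleEndian reports on x86):
Code:
#include <cstdint>
#include <cstring>
#include <iostream>

int main() {
    // The bytes from the memory dump, in memory order.
    const std::uint8_t bytes[4] = {0x34, 0x95, 0x91, 0x8A};

    std::uint32_t value;
    std::memcpy(&value, bytes, sizeof value);  // one 32-bit load

    // Little-endian: the lowest-addressed byte is least significant,
    // so value == 0x8A919534. That's above 0x7FFFFFFF, so the same
    // bit pattern read as a signed 32-bit int comes out negative.
    std::cout << std::hex << value << '\n';
    std::cout << std::dec << static_cast<std::int32_t>(value) << '\n';
    return 0;
}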
 

Chris R

Member
Endianness is only byte-order, not nibble order. {0x34, 0x95, 0x91, 0x8A} is 0x8A919534. Which is still larger than 0x7FFFFFFF, so ultimately you're still correct. Just be careful :D

Luckily this problem is easy to solve. Use BitConverter.ToUInt32.

Doh! Missed that when I was doing my 30 seconds of testing :D
 

Two Words

Member
I need help copying a string into a C-string. For some reason, I'm getting a compile-time error that strcpy() isn't defined in the scope of main. Here is my code. I don't understand what I am missing here.

Code:
#include <iostream>
#include <string>
#include <stdio.h>
#include <cctype>

using namespace std;

void bubbleSort(char *ptr);
int main()
{
    int size = 0;
    string words;
    char *ptr;

    cout << "Enter a string of characters that terminates with the enter key." << endl;
    getline(cin, words);
    size = words.length();
    ptr = new char [size + 1];
    strcpy(ptr,words.c_str());
    bubbleSort(ptr);
    cout << *ptr;
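For the record, strcpy() is declared in <cstring> (or <string.h>), which the snippet above never includes. Here's a hedged, compilable sketch of the same idea; the bubbleSort body below is only a plausible stand-in, since the original definition isn't shown:
Code:
#include <iostream>
#include <string>
#include <cstring>   // declares strcpy, strlen; missing from the snippet above
#include <utility>   // std::swap

using namespace std;

// Plausible stand-in: bubble sort the characters of a NUL-terminated string.
void bubbleSort(char *s)
{
    size_t n = strlen(s);
    for (size_t i = 0; i + 1 < n; ++i)
        for (size_t j = 0; j + 1 < n - i; ++j)
            if (s[j] > s[j + 1])
                swap(s[j], s[j + 1]);
}

int main()
{
    string words;
    cout << "Enter a string of characters that terminates with the enter key." << endl;
    getline(cin, words);

    char *ptr = new char[words.length() + 1];
    strcpy(ptr, words.c_str());
    bubbleSort(ptr);
    cout << ptr << endl;   // note: *ptr would print only the first character

    delete[] ptr;          // pair every new[] with delete[]
    return 0;
}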
 

endre

Member
You're rebuilding after your changes, right?

Thanks. That was the problem. But I still don't understand: why doesn't it work when I rebuild the solution? I had to rebuild the project. I thought the second option in the build menu rebuilds everything.

 

Nesotenso

Member
Is gdb the recommended debugger when working with C programs? I was wondering what debuggers posters like CPP use.

Also can anyone direct me to tutorials which will teach me how to work with git using the command line interface?

also something which breaks down the basics of Github and how to use the site effectively?
are there other repositories besides github which make use of git?
 
Is gdb the recommended debugger when working with C programs? I was wondering what debuggers posters like CPP use.
Depends on what you're working on.

GDB's pretty solid and boy howdy does it have many features barely anybody knows about, but Visual Studio has its own debugger, and Intel have their own debugger extensions that work with Visual Studio. The LLVM project also has LLDB, a GDB alternative with a Python based scripting system for plugins.

They're all solid choices, but I think some are better suited to what compiler you happen to be using. GDB for GCC, Visual Studio Debugger for VS, Intel's debugging extensions for Intel's compiler, and so on.

GDB's documentation is pretty dense, and you only really need to dig into it if you're serious about using GDB from the command line. In all likelihood, you're going to use the debugger that came with your IDE (Visual Studio, Xcode, Eclipse, and so on), and you won't worry about command-line wizardry and special tricks until you're really comfortable with programming.
Also can anyone direct me to tutorials which will teach me how to work with git using the command line interface?

also something which breaks down the basics of Github and how to use the site effectively?
The official Git website has all of Scott Chacon's Pro Git available for free under their documentation section. It covers Github too.
are there other repositories besides github which make use of git?
LOTS! :) Bitbucket, Google Code, Sourceforge, Gitorious, GNU Savannah, CodePlex and countless others.

I think Github's the nicest to use. It's the most popular by far.
 
Try Git is also a nice "learn Git basics in 15 minutes" tutorial:

https://try.github.io/levels/1/challenges/1


I myself love git. A few years ago, when I was still in school, I really couldn't figure out how to properly use version control. There (well, at a subsidiary of the school I was working at), we sort of used Subversion with code projects (as pretty much just a backup protocol), and then I battled hours and hours with Perforce and Unreal Engine projects. When I tried git for the first time, I was blown away. It was just so simple yet so robust. I am constantly learning new things about it, but at the same time, at no point have I been unable to get something done just because I didn't know how.
 
I referred to this during my first forays into git: http://rogerdudler.github.io/git-guide/

Of course later on I installed TortoiseGit and just forgot about all that command line stuff.

Once you know the command line well, the GUI-based Git systems just get kind of annoying, at least for me. I use SourceTree on Windows, but I tend to use the built-in terminal quite often. The SourceTree UI is great for quickly looking through diffs, but if I want to do anything non-trivial it's just easier/faster to command-line it. I've tried other GUIs as well, but ultimately I end up using the command line more than anything.
 

NotBacon

Member
I've tried Tortoise, the GitHub GUI, git gui, and none of them compare to the power and flexibility of command-line git.
Plus, when the GUI fails my coworkers, I get to laugh at their struggle with the CLI until they ask for my help.
 
I almost feel guilty for loving Haskell as much as I do. It's been a joy to just continue learning more about it and watch ideas naturally fall into a program file (especially compared to Prolog. Ugh...).

Seriously, somebody burst my bubble and tell me why it sucks.
 
Seriously, somebody burst my bubble and tell me why it sucks.
I'd rather not.

I mean, there are slightly esoteric languages out there like Erlang, Eiffel, Forth and such that are just kind of fun to learn, and really make you un-learn some bad habits while you use them. And I could pull up some BS statistics from the Computer Language Benchmarks Game to say why language X is better than Y, or mention some ethereal, poorly defined programming trend as a feature to say what's good and bad.

If you can pull something together that's useful, cool!

Just don't fall in love with a programming language. In the long run, it'll never return the favor.
 
I almost feel guilty for loving Haskell as much as I do. It's been a joy to just continue learning more about it and watch ideas naturally fall into a program file (especially compared to Prolog. Ugh...).

Seriously, somebody burst my bubble and tell me why it sucks.

What tutorials or books did you use? I've been meaning to pick up Haskell for a long time.
 
What tutorials or books did you use? I've been meaning to pick up Haskell for a long time.

Just go here. I've been using "The Haskell Road to Logic, Maths, and Programming" under 2.1 and "Learn You a Haskell for Great Good!" under 2.2 concurrently. The latter does a better job of introducing concepts while the former does a better job of explaining things mathematically.
 
Is gdb the recommended debugger when working with C programs? I was wondering what debuggers posters like CPP use.

Also can anyone direct me to tutorials which will teach me how to work with git using the command line interface?

also something which breaks down the basics of Github and how to use the site effectively?
are there other repositories besides github which make use of git?

I use the Visual Studio debugger for most tasks. When I really want to get nasty, or do postmortem debugging or crash dump analysis, I use WinDbg.

One day maybe I'll use LLDB.

GDB is a total nonstarter for me, because 99% of my work is on Windows, and GDB cannot debug native Windows programs.

Command line debuggers in general really piss me off, because I like the "asynchronicity" of being able to view my source independently of running commands. In a command line debugger, you print your source code, then you run a few commands and your source code is off the screen. Really ruins the experience for me. That's one reason I like WinDbg, because it's a command line debugger, but it's also a graphical debugger. So it displays your source separately.

There's front ends to command line debuggers like DDD, or Eclipse CDT, but they always do a poor job in my opinion, because you can tell that the interfaces to the debugger are really just wrappers around the command line, and so the authors of the debugger lose a lot of the control needed to really create a responsive graphical debugger.
 
Command line debuggers in general really piss me off, because I like the "asynchronicity" of being able to view my source independently of running commands. In a command line debugger, you print your source code, then you run a few commands and your source code is off the screen. Really ruins the experience for me. That's one reason I like WinDbg, because it's a command line debugger, but it's also a graphical debugger. So it displays your source separately.
There is a split-screen mode in GDB (the TUI, started with gdb -tui) that shows commands on the bottom half of the screen, with your source, at the very line you're paused at, on top.

Learned about this when porting games for a certain bundle. It was a revelation at the time.


EDIT: More directions on how to use it, thanks to very extensive official GDB documentation.
 
There is a split-screen mode in GDB (the TUI, started with gdb -tui) that shows commands on the bottom half of the screen, with your source, at the very line you're paused at, on top.

Learned about this when porting games for a certain bundle. It was a revelation at the time.


EDIT: More directions on how to use it, thanks to very extensive official GDB documentation.

Yea, not a fan of TUIs either. The idea of using a console when i have a nice colorful window with docking, multi monitor support, syntax highlighting, configurable window locations, different types of windows (stack frame, watch, memory, source, breakpoint list, etc) is kind of foreign to me.
 
Yea, not a fan of TUIs either. The idea of using a console when i have a nice colorful window with docking, multi monitor support, syntax highlighting, configurable window locations, different types of windows (stack frame, watch, memory, source, breakpoint list, etc) is kind of foreign to me.
You already have most of the windowing conveniences when running GDB in a good terminal emulator in X11, Wayland or OS X (Terminator, iTerm 2). There's a Vimmish/Emacsish learning curve for the various modes and commands in GDB, but they're there.

But then we're talking Windows versus the POSIXes. :) And as wonderful as GDB is for shockingly good reverse engineering with stripped binaries and general-purpose debugging on not-Windows, it's still not a good fit for Windows.

It was manna from heaven for a certain kind of Android debugging and on the RasPi.


Now, if we were talking about Android GPU debugging, holy moses is that a certain kind of hell. Custom kernels, service contracts to get specific kernel extensions if you didn't have the right tablet/phone from a very specific vendor, debuggers written by hardware guys, everything closed source. Mmmmm!
 
On the subject of debug tools: if you're looking for performance optimization on Linux, Brendan Gregg (formerly of Sun and Joyent, and the primary author of the DTrace book) has a rather good blog showing the state of performance tracing tools on the platform.

Linux Performance Tools 2014 gives a good state of affairs, and he has a page devoted to Linux Performance showing off all of his work explaining the various tools and how they work. And of course he wrote the best-selling book on the subject.

This is more sysops level stuff, but performance profiling to see where your app is spending the most time can really come in handy.

OS X has Instruments and a port of DTrace that hasn't been updated in some time, but which does form the basis for custom instruments on OS X. There are a few excellent tutorials out there for OS X. FreeBSD has its own DTrace with some perfectly good documentation of its own.
 

Bleepey

Member
I was impressed with these three:

R For Dummies

- Generally Wiley's Dummies books are AWFUL for teaching programming, but this one's actually very approachable and gives the best treatment of the cross-platform R IDE, RStudio, that I've seen. Very nicely put together. Also handles some seemingly obvious questions that'll come up in the workplace like "how do I use R with an Excel spreadsheet."

The Art of R Programming

- More advanced than the previous book, and it assumes some programming knowledge, but it covers more interesting material like what you can do with certain R packages and gives a thorough treatment of what the language can do. Seems like a good one to help build your knowledge on.

The R Cookbook

- O'Reilly has a series of "cookbooks" that are basically simple solutions in code for very specific problems. You could go through these books back to front and learn a fair amount, or keep them on your shelf for when the time comes that you need something non-trivial. Recently, they also put out an R Graphics Cookbook which you might want to take a look at later if this one suits your fancy.


The official R documentation has struck me as extremely hard for beginners to hit the ground running with, and before O'Reilly, No Starch, and even Wiley threw their hats into the ring, the best resources for learning R were expensive books from Springer that weren't much good if you wanted to use a not-Windows operating system. Hopefully these help you out.

To add to this:

cran.r-project.org/doc/contrib/Epicalc_Book.pdf

I think this is fantastic; it's a great idiot's guide to R with practical examples, explanations, and answers. It only focuses on epidemiology, which is fine for me, but you may wanna use it for something else. I am gonna make a thread about R when I finish going through this book.
 
Decided to try out Atom in place of Sublime. I want to like Atom, but there is one thing it does that irks me. Going through a project folder, any file you click automatically opens a new tab. They should change it so that a new tab with that file isn't created until you double-click the file; single-clicking the file would just open it without assigning it to a tab.

It's nice though, not sure if I'd replace Sublime with it...but it's nice.

Agreed.

I have some skepticism about developers who use Windows and GUIs. One of those is bad enough; two of them is a red flag for me. They could change my perception of them eventually, but my first reaction is yuck.

Every Git app I've seen always seems overly complicated. That, and I always believe the CLI helps instill good practices.
 
Agreed.

I have some skepticism about developers who use Windows and GUIs. One of those is bad enough; two of them is a red flag for me. They could change my perception of them eventually, but my first reaction is yuck.

Some of the best engineers in the world are Windows-only. Of course some of the best engineers in the world are Unix-only too. If Windows and GUIs are a red flag for you, it sounds like you have a bias which is not substantiated with actual facts.
 
Some of the best engineers in the world are Windows-only. Of course some of the best engineers in the world are Unix-only too. If Windows and GUIs are a red flag for you, it sounds like you have a bias which is not substantiated with actual facts.
Yeah, I wouldn't group MinGW/Cygwin programmers with native Windows devs.

Windows is still unquestionably the most pleasant platform for doing GPU work, better drivers and support all around. And Microsoft has great tools for native devs, more stable and robust than Apple's GUI-based dev tools.

It's just that when you blend tools written for POSIX (other than Git) with the Windows kernel and its own native dev tools, you generally get a mess. Google's efforts to port LLVM infrastructure to Windows notwithstanding.
 
And since that post struck a nerve, here are a couple of blogs from some world-class Windows engineers:

https://randomascii.wordpress.com/
http://blogs.msdn.com/b/oldnewthing/
http://www.virtualdub.org/

Add these to your RSS feed and tell me that your first reaction is yuck.

And for every one that has a blog, there are tens of thousands that don't. I work with some of them.

There are plenty of shit Unix programmers too. The two best indicators of how good an engineer someone is are a) the number of people impacted by the product they work on (i.e. if your software is used by a billion people, I *absolutely guarantee* that you are a better engineer than someone whose software is used by a million people), and b) the length of time they've been in the industry. *Not* what language they work in or what platform they work on. Windows drives a majority of the PCs in the world. It didn't get there by having shitty programmers.

I would actually say there are more shitty Unix programmers than shitty Windows programmers (if we're limiting ourselves to native code). The reason is that it's hard to just accidentally become a Windows programmer. You have to be dedicated. It's not the thing you learn in university, for starters. It's not open source, and there's less free material and fewer introductions available. Despite the fancy GUIs and the tools, it's *less* accessible. The fact that we prefer GUIs to the command line just means we like to focus on getting shit done instead of remembering what permutation of command-line options to mash together to get the output we need in the format we want.
 