Data science has been a hot term in the past few years. Despite this fact (or perhaps because of it), there still doesn’t seem to be a single unifying definition of data science. This post…

Data Scientist (n.): Person who is better at statistics than any software engineer and better at software engineering than any statistician.

— Josh Wills (@josh_wills) May 3, 2012

One of my reasons for doing a PhD was wanting to do something more interesting than “vanilla” software engineering. When I was in the final stages of my PhD, I started going to meetups to see what’s changed in the world outside academia. Back then, I defined myself as a “software engineer with a research background”, which didn’t mean much to most people. My first post-PhD job ended up being a data scientist at a small startup. As soon as I changed my…

View original post 632 more words


This is the first post that does justice to the blog’s motto: **show me the code motherfucker**. In this and the next *n* posts titled “Neural networks in a nutshell – *k*”, I will talk about artificial neural networks, showing concepts (theory) and code (practice). The code will be written in Python without any fancy libraries such as NumPy, SciPy, or PyBrain, simply because:

- I don’t know how to use any of these.
- I don’t have time to learn them now.
- The focus is on the concepts, not on performance.

View original post 775 more words


Kronecker famously wrote, “God created the natural numbers; all else is the work of man”. The truth of this statement (literal or otherwise) is debatable; but one can certainly view the other standard number systems $\mathbf{Z}, \mathbf{Q}, \mathbf{R}, \mathbf{C}$ as (iterated) completions of the natural numbers $\mathbf{N}$ in various senses. For instance:

- The integers $\mathbf{Z}$ are the additive completion of the natural numbers $\mathbf{N}$ (the minimal additive group that contains a copy of $\mathbf{N}$).
- The rationals $\mathbf{Q}$ are the multiplicative completion of the integers $\mathbf{Z}$ (the minimal field that contains a copy of $\mathbf{Z}$).
- The reals $\mathbf{R}$ are the metric completion of the rationals $\mathbf{Q}$ (the minimal complete metric space that contains a copy of $\mathbf{Q}$).
- The complex numbers $\mathbf{C}$ are the algebraic…

View original post 9,620 more words


It is widely forecast that a shortage of skills in data science and analytics will mean a great deal of money is wasted through missed opportunities in the coming years. Traditional academic establishments have begun to move to fill the gap. However, most courses teaching the hot-topic skill sets such as […]

Sourced through Scoop.it from: www.forbes.com

Great free content here:


*For those who aren’t regular readers: as a followup to this post, there are four posts detailing the basic four methods of proof, with intentions to detail some more advanced proof techniques in the future. You can find them on this blog’s primers page.*

Remember when you first learned how to program? I do. I spent two years experimenting with Java programs on my own in high school. Those two years collectively contain the worst and most embarrassing code I have ever written. My programs absolutely reeked of programming no-nos. Hundred-line functions and even thousand-line classes, magic numbers, unreachable blocks of code, ridiculous code comments, a complete disregard for sensible object orientation, negligence of nearly all logic, and type-coercion that would make your skin crawl. I committed every naive mistake in the book, and for all my obvious…

View original post 4,381 more words


Zmob, my first (and only) original game.

*By the end, the breadth and depth of our collective knowledge was far beyond what anyone could expect from any high school course in any subject.*

I’m a lab TA for an introductory Python programming course this semester, and it’s been…depressing. I remember my early days of programming, when the possibilities seemed endless and adding new features to my programs was exciting and gratifying, and I brimmed with pride at every detail, and I boasted to my friends of the amazing things I did, and I felt powerful. The world was literally at my fingertips. I could give substance to any idea I cared to entertain and any facet of life I wanted to explore. I had developed an insatiable thirst for programming that has lasted to this very day.

My younger self, if programming were more noodley.

The ironic thing is…

View original post 3,876 more words


Speed, Simplicity and Self-Confidence

In 1989–90, under the direction of Jack Welch, GE launched “Work-Out” – a team-based problem-solving and employee empowerment program modeled after the Japanese quality circles model that was in vogue at the time. Work-Out was a huge success, but Welch was frustrated by the rate of adoption through the business. Welch, the visionary, realized that GE (and everyone else!) was entering an era of constant change, and that those who adapted to change the fastest would be the survivors. He commissioned a team of consultants (including Steve Kerr, who was to become GE’s first Chief Learning Officer) to scour industry and academia for the best practices in change management and come back to GE with a tool kit that Welch’s managers could easily implement. The result was the Change Acceleration Process, commonly referred to within GE simply as “CAP.”[1]

The team studied hundreds…

View original post 783 more words


So much advanced math relies on a firm grasp of basic Algebra and Algebra II.

Today, let’s take a look at logarithms!

So what are logarithms? Well, first let’s look at exponential equations, such as $2^x = y$, where 2 is the base. We all know, for example, that $2^3 = 8$. The general form is $b^x = y$, where $b$ is the base. With logarithms, the format is $\log_b y = x$. So for $2^3 = 8$, we would express that with logarithms as $\log_2 8 = 3$. Fun, isn’t it! The logarithm is the power to which the base must be raised to equal a given number; in the example above, the base 2 is raised to the power 3 to equal the number 8.
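The definition above is easy to check in code. A quick sketch using only Python’s standard `math` module (the values here are just the worked example from the paragraph):

```python
import math

# log_b(y) = x  means exactly  b**x = y
assert math.log2(8) == 3.0        # log_2 8 = 3, since 2**3 = 8
assert 2 ** 3 == 8

# the general form with an arbitrary base b: log_5 125 = 3
b, y = 5, 125
x = math.log(y, b)                # two-argument log(y, b) computes log base b
assert abs(x - 3) < 1e-12         # floating point, so compare with a tolerance
```

Note that `math.log(y, b)` is computed as a ratio of natural logs, so the result can be off by a rounding error; that is why the general-base check uses a tolerance rather than exact equality.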

So the tricky part is that you get rules like $\log_b y + \log_b…

View original post 850 more words


John Tukey, one of the developers of the Cooley-Tukey FFT algorithm.

It’s often said that the Age of Information began on August 17, 1964 with the publication of Cooley and Tukey’s paper, “An Algorithm for the Machine Calculation of Complex Fourier Series.” They published a landmark algorithm which has since been called the Fast Fourier Transform algorithm, and has spawned countless variations. Specifically, it improved the best known computational bound on the discrete Fourier transform from $O(n^2)$ to $O(n \log n)$, which is the difference between uselessness and panacea.
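To make the gap concrete, here is a rough sketch (not Cooley and Tukey’s original formulation) of a naive $O(n^2)$ DFT next to a minimal radix-2 Cooley–Tukey FFT, using only Python’s standard library:

```python
import cmath

def dft_naive(x):
    """Direct O(n^2) evaluation of the discrete Fourier transform."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x):
    """Radix-2 Cooley-Tukey FFT, O(n log n); len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])           # recurse on even-indexed samples
    odd = fft(x[1::2])            # recurse on odd-indexed samples
    result = [0j] * n
    for k in range(n // 2):
        # combine the half-size transforms with a "twiddle factor"
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        result[k] = even[k] + t
        result[k + n // 2] = even[k] - t
    return result

signal = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
naive = dft_naive(signal)
fast = fft(signal)
# both agree up to floating-point error; the DC bin is the plain sum, 28
assert all(abs(a - b) < 1e-9 for a, b in zip(naive, fast))
```

The recursion splits an $n$-point transform into two $n/2$-point transforms plus $O(n)$ combining work, which is where the $O(n \log n)$ bound comes from.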

Indeed, their work was revolutionary because so much of our current daily lives depends on efficient signal processing. Digital audio and video, graphics, mobile phones, radar and sonar, satellite transmissions, weather forecasting, economics and medicine all use the Fast Fourier Transform algorithm in a crucial way. (Not to mention that electronic circuits wouldn’t exist without Fourier analysis in general.)…

View original post 2,500 more words
