Jeffrey van

The Big Security Checklist

Is your website or application secure, or will your company be the next one leaking all its customer details? The internet is filled with people looking for flaws on every website they can find, and some of them don't have good intentions. In the past I was one of those people crawling the internet for anything interesting. During my research I discovered that there were security flaws everywhere. Now it is time for me to help you and your company prevent most of those flaws.

Never trust external data

Any data coming into your (web) application can contain something you didn't expect. The values can be changed or the format can be different from what you expected. For websites that means everything in a web request (URL, GET/POST parameters, other posted content, cookies, Referer, User-Agent, etc.). For other applications this can be everything read from a file, URL, socket or even UI input.

A few examples of what can happen when you don't check received data: SQL injections (you get a query instead of just a number), buffer overflows (usually allowing an attacker to run their own code), errors which leak information about your environment, uploaded scripts instead of uploaded images, access to admin features (missing authentication checks) and XSS attacks (user-submitted HTML/JavaScript on your site). It can also be something as silly as ordering non-existent product plans: when your website lets the user choose between 1 and 2, and the attacker changes the form so it submits 3, -1 or the famous JGDGDSKJGDSJSDG.
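To make the SQL injection and product-plan examples concrete, here is a minimal C# sketch. It assumes an open SqlConnection and a hypothetical Plans table; the GetPlanPrice helper is made up for illustration. The point is to validate the posted value and to pass it as a bound parameter instead of concatenating it into the SQL text.

using System;
using System.Data.SqlClient;

public static class OrderHandler
{
    // Hypothetical handler: "3", "-1" or "JGDGDSKJGDSJSDG" never reaches the database
    // as part of the SQL text, and invalid plans are rejected before any query runs.
    public static decimal GetPlanPrice(string postedPlan, SqlConnection connection)
    {
        int plan;
        if (!int.TryParse(postedPlan, out plan) || plan < 1 || plan > 2)
            throw new ArgumentException("Invalid plan");

        using (var command = new SqlCommand("SELECT Price FROM Plans WHERE Id = @plan", connection))
        {
            command.Parameters.AddWithValue("@plan", plan); // Bound parameter, not string concatenation
            return (decimal)command.ExecuteScalar();
        }
    }
}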

The large number of glitches/hacks in games is possible purely because the client is allowed to send its own (modified) location and inventory. Note that for games this is usually done for performance and latency reasons (you don't really want your server to check every collision of every player movement). Games can counter this by doing simple checks (is the player moving faster than allowed?) and by having report systems.
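Such a server-side sanity check can be as small as the sketch below. The method name, coordinate parameters and the 10% tolerance are made up for illustration; the idea is simply that the server verifies the client's reported movement instead of trusting it blindly.

using System;

public static class MovementChecks
{
    // Reject a client-reported position that implies a speed higher than the game allows.
    public static bool IsMovementPlausible(double oldX, double oldY, double newX, double newY,
                                           double secondsElapsed, double maxSpeed)
    {
        double distance = Math.Sqrt(Math.Pow(newX - oldX, 2) + Math.Pow(newY - oldY, 2));
        return distance <= maxSpeed * secondsElapsed * 1.1; // small tolerance for latency jitter
    }
}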

This also applies to anything encrypted! Encryption only prevents people in between from seeing the data (and usually from tampering with it). The data still ends up unencrypted on the user's machine. A user can simply attach a debugger to the process and see/change the real data. SSL connections are usually even easier: you can create a self-signed certificate, trust that certificate, and redirect the connection to your own tunnel software which forwards it to the real server. This allows you to see and change any data being sent to the server and back.

Don't just trust data coming from another server in your network either as someone can gain access to that machine and start doing the same tricks.

Give your application only access to the resources it needs

This is the point I've seen most companies fail at. Usually an attacker will come in through a leak in your web application. Web applications can be quite big and they face the internet. It just takes one small bug or undocumented feature in your (or your framework's/server's) code and Mr. Anderson will be able to do more than he is supposed to.

First, your database connection. Is this done under a user which can only access the website's database and nothing else? Good! SQL injections can still happen, and database servers allow some nasty queries, like executing commands or reading files. Sometimes I found that the old 'mysql.user' table was accessible, with the brute-forceable password hashes of all users. Most of the time there was even a /phpmyadmin ready to use with the just-gained credentials. Easy mode! Even with just read access to the right table you can find the admin users' password hashes and brute-force them, giving your thankful hacker a new area to find even more and bigger security holes.

File access is also something you should pay attention to. Often a website is hosted on a web server with multiple other websites. Think about what a security researcher can see when looking in your inetpub/wwwroot folder: (decompilable) source code, config files, log files. Always run a website under a limited user account with only (read) access to the folders it needs. You wouldn't be the first to get broken into because of an old, forgotten promotion site. This is also why you should be wary of hosting your website on a shared web host: one flaw in their configuration and all websites on it are compromised. You would be surprised how often those big web hosts make mistakes in this area.

Network access to other resources: what servers are accessible from your web or application server? Are there any file shares or internal web services which don't require any authentication? Is the data between servers encrypted?

Remember: gaining administrator access to everything is not the goal of a security specialist. The goal is to gain access to interesting data, or to gain advantages that are normally not possible (for example, with a web shop: marking an order as paid or changing prices).

Stay up-to-date

Software bugs happen a lot. Even a small piece of software written by the best group of programmers is bound to have a few bugs. Luckily, new versions with bug fixes are released to counter those bugs (often even before they are known to the public). A downside of releasing those new versions: people can compare the versions and find out what bugs were fixed.

Always make sure applications in your environment are on the latest stable version, including your Operating System. Any flaw in any application on a server can be helpful when trying to access even more resources. A flaw in your server application may only allow a user to run code under a very limited user account, but by also using a flaw in an unpatched Windows installation the code can suddenly run elevated and do everything.

Limit access of people

You may have the world's most secure system, but you still have people in your company who need access to resources to be able to do their jobs. Those people are a common attack point for security specialists, and probably the most difficult one to deal with when protecting yourself from nasty security incidents.

Educate people to recognize phishing attempts and test them often. Teach people that an attack may come from anywhere (mail, phone, in real life or even in their personal life) and from anyone (customers, friends, co-workers with less access). Never leave your computer unlocked and don't just run anything. Maybe that program a friend downloaded for you is infected with a trojan because he used an untrustworthy source. That program can log your keystrokes and screen when you log in to work from home a month later.

Also update the software on all workstations. Keep track of what people are running and which version it is on. Disable applications and plugins an employee doesn't need (like the Flash or Java browser plugin). A common way of attacking nowadays is by putting a malicious advertisement on big, popular websites which are probably visited by half of your company's employees. Those advertisements can use exploits (browser, Java, Flash, etc.) to run code on the machine without the user even noticing. Of course run a decent virus scanner, but be aware that it won't stop everything.

An attack can even come from the person themselves. Audit and monitor everything and don't give a person access to more resources than he/she needs. Do remember that you're dealing with people, so be thoughtful about what limitations you apply and how users will experience them. When people need access to something for an assigned task, don't make them fill in 3 forms and wait 8 weeks.

Make logging in a bit more secure by adding two-factor authentication whenever possible (not just for your employees, but for your website users as well). Give instructions on how to choose a safe password and enforce it by applying password restrictions. Do not use easily guessable default passwords for new employees, and do force them to change it the first time they log in.
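What "applying password restrictions" can look like in code, as a rough sketch; the exact rules here are just an example, and real policies often also check against lists of known leaked or common passwords.

using System.Linq;

public static class PasswordPolicy
{
    // Minimal example policy: a reasonable length plus some character variety.
    public static bool IsAcceptable(string password)
    {
        return password != null
            && password.Length >= 12
            && password.Any(char.IsUpper)
            && password.Any(char.IsLower)
            && password.Any(char.IsDigit);
    }
}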

Monitor any suspicious behavior

Watch any errors happening on your websites. When poking around, your overly interested visitor will usually trigger some errors before finding anything useful. Pay attention to the requests after that: if the errors stop, he may have given up, or he may have found something. For a security researcher, errors are usually a sign that input is badly checked or not checked at all. Just hiding the errors is not enough; there are more ways to notice that you are able to do more than you're supposed to.

Suspicious behavior can also happen from within a company. Is a person trying to access a lot of resources? Pay attention to it and find out if it was for a valid reason.

Be thankful to those reporting a flaw

If you're very lucky, someone found a flaw on your website and actually reported it to you instead of abusing it. It goes without saying that dealing with this flaw must be your number one priority. But also reward the person for reporting it. For you it is just a small gift; for them it is a reason to stay nice and keep reporting bugs to companies. Most places I've reported serious security flaws to either didn't respond or started threatening me directly. As you can probably understand, the choice of "reporting the bug" becomes less obvious after that.


Breaking into a system is like solving a puzzle, and as with most puzzles: people who do it often get quite good at it. It is not uncommon for a security researcher to use multiple holes in different systems to get to the interesting data. Always expect to get weird data in your applications and add checks for it. Make sure your software is on the latest stable version and limit what is possible when someone has access to a part of the environment.

Review your security often for your current environment but also review your new software designs for security flaws. When in doubt: Let a professional security specialist check your security.

Awesome JavaScript Tricks

Just like C-Sharp, JavaScript has some unique tricks not known to everyone. In this article I'll share some of the tricks I know about JavaScript. As with everything in JavaScript, some tricks require a recent browser version, but all of them should work in the latest versions of all major browsers.


Object.defineProperty

This is a method available in all recent browsers (Firefox 4+, Chrome 5+, Safari 5.1+, Opera 11.6+, IE 9+) which allows you to do unique and awesome stuff with properties. For example, you can use getters and setters:

var obj = {};
var myprop; // Backing variable used by the getter and setter below

Object.defineProperty(obj, 'myprop', {
    get: function() {
        return myprop * 2;
    },
    set: function(value) {
        alert('Setting value to ' + value);
        myprop = value;
    }
});

obj.myprop = 5;    // Shows 'Setting value to 5'
alert(obj.myprop); // 10

Another use is making a property read only and unchangeable (not even by another defineProperty call).

var obj = {};
obj.theAnswer = 42;

Object.defineProperty(obj, 'theAnswer', {
    writable: false,    // This makes the value read only
    configurable: false // This makes the property undeletable and unchangeable by another defineProperty call
});

obj.theAnswer = 5;    // Silently ignored (throws a TypeError in strict mode)
alert(obj.theAnswer); // 42

The defineProperty method can be useful when making custom elements, which can then act just like a native element does (for example, directly updating the UI after you do: customElement.value = "123";).

Arguments as array

You can use the arguments keyword to get all parameters. Use this to make functions which accept an unspecified number of arguments.

function printf(text) {
    var i = 0, args = Array.prototype.slice.call(arguments); // Turn the arguments object into a normal array

    text = text.replace(/%s/g, function() {
        return args[++i]; // args[0] is the format string, so the first %s gets args[1]
    });

    return text;
}

alert(printf("Hello %s! Your score is %s", "Mr Anderson", 1337));
// Hello Mr Anderson! Your score is 1337

edit: Changed [].slice to Array.prototype.slice after someone pointed out it had some issues with Chrome

Override default methods

Like in every programming language, Objects in JavaScript have a few default methods you can override.

var TestObject = function() {
    this.valueOf = function() {
        return 42;
    };
    this.toString = function() {
        return "I am a test object";
    };
};

var obj = new TestObject();
alert(obj == 42); // true
alert(obj);       // I am a test object

This can be useful when making objects comparable to each other, or for making your code easier to debug (you get a nice textual representation of your object instead of just [object Object]).

Function methods

In JavaScript, there are quite a few ways of working with functions. A lesser-known feature is the set of methods available on functions themselves. For example, by using the apply method you can call a function and pass the parameters as an array; you can also change the 'this' scope of that function.

alert.apply(this, ["I am a message"]); // Shows 'I am a message'

The bind method can also be quite useful, especially when working with timeouts and intervals.

var TestObject = function() {
    this.someValue = 100;
    this.doSomething = function() {
        alert(this.someValue);
    };

    setTimeout(this.doSomething, 1000);            // undefined ('this' is no longer our object)
    setTimeout(this.doSomething.bind(this), 2000); // 100
};

new TestObject();

In Firefox, you can also retrieve the source code of a function:

function myFunction() {
    return 1337;
}

alert(myFunction.toSource()); // Shows the function defined above, including its source

With keyword

Supported by pretty much every browser, but not recommended for use as it can cause a lot of confusion. Still, for the sake of completeness I'll include it in this overview. The with keyword allows you to use properties and functions from an object without putting the object name in front (as if all its properties and functions were global).

with (document) {
    with (body) { // document.body
        var textNode = createTextNode("Bla"); // document.createTextNode
        appendChild(textNode);                // document.body.appendChild
    }
}
It also works when eval-ing code, which can be useful if you want to support scripts with short function names. Note that it does not limit the script! Every other function will still be available. It can also be used to temporarily override global functions and properties:

var obj = {
    alert: function() {}
};

with (obj) {
    alert('This alert is not visible');
}
Comma operator

Next to all the more common operators, JavaScript has another one: the comma operator. The comma operator allows you to perform some actions before returning a value. This can be useful in places where JavaScript only expects a single expression. It can be compared to the && and || operators, but with the comma operator all expressions are always evaluated. Example:

function IsAlwaysFalse() {
    return false;
}

function IsAlwaysTrue() {
    return true;
}

// && continues when the value is not false/null/0/undefined
alert(IsAlwaysFalse() && 42); // false
alert(IsAlwaysTrue() && 42);  // 42

// || continues when the value is false/null/0/undefined
alert(IsAlwaysFalse() || 42); // 42
alert(IsAlwaysTrue() || 42);  // true

// The comma operator always continues
alert((IsAlwaysFalse(), 42)); // 42
alert((IsAlwaysTrue(), 42));  // 42

Scope labels

It is possible to put code in a block and give that block a label. Doing so allows breaking out of that specific block using the 'break' keyword.

myLabel: {
    alert('I am shown :)');
    break myLabel;
    alert("I'm not :("); // Never reached
}

Multiple lines

And a small trick to end this article with: use the \ character to continue a string definition over multiple lines. Just put it at the end of the line, like this:

var x = "abc\

Note that it doesn't actually add a linebreak in the string. You still have to use \r\n for that to happen.


Like C-Sharp, JavaScript has a lot of hidden gems. This is just a small selection of the lesser known things I know about. Do you have some awesome tricks in JavaScript worth a mention? Let me know!

The art of redesigning a website

Most long-running sites get to a stage where they start to feel old and really need a facelift to keep up with competitors. This is usually seen as fun but difficult, and often unrewarding, work. Is it the change itself that frustrates people, or something else? In this article I'll give a few guidelines to follow when redesigning a website while keeping your visitors happy.

Make learning the new design as easy as possible

A phrase often heard when releasing a new site is "The people who are complaining don't like change, just wait till they get used to it". This is somewhat true: regular visitors have learned how to use your site, like the location of the buttons they often use. They also probably liked the site the way it was (or else they wouldn't be regular visitors). Suddenly their knowledge is useless and they have to start learning the site all over again. People don't feel happy about that.

You can counter this by keeping the same menu structure and putting the most-used buttons/links in the same location as before. Basically: look at how your visitors browsed before the redesign, and see if that same way of browsing can be applied to your new design as well.

Wait with new features

A new design usually comes with new features. But introducing new features at the same time as showing a new design will make visitors feel overwhelmed. As with the previous point: you don't want visitors to spend too much time learning your site again. Your new features will also not get the full attention of your visitors at that point.

Slowly introduce new features a while after you've released the redesign. Preferably talk about the features in news or blog articles (before and after the release) and provide enough explanation. Another good thing about this method: your site looks more active and maintained that way.

Don't surprise your visitors

Often a new version is just released without most of the users even knowing it was coming. A big-bang release like that sounds fun, but your visitors will just hate it. A regular visitor gets attached to a site, and especially with community-driven sites they feel like they're part of it (and they are!). Deciding on a whole new design without telling your users or asking for their feedback is just kind of selfish.

Let your visitors know that you are working on a redesign. Let people test it and find out what they dislike before making the old layout unavailable. Make them feel involved with the new layout. People use your site in many ways and they will always surprise you.

Keep control of your designer urges

As a designer you get inspired by great layouts and features appearing around you. New ways of scrolling and retrieving data, big screaming titles and fancy new data visualizations. Redesigning a site and adding all those fun things will surely make your site fit in with all the other sites. But those gimmicks can be seen as annoying by your regular visitors.

Only add gimmicks to your design if they actually are useful for your site. Also make them subtle. People don't care about a big fancy animating tag cloud. They just want to focus on the content.


A good redesign is more than just putting together some nice looking pages. Think about your visitors and find out how they use your site. Keep them involved, notice problems and fix them. Make it as easy as possible for them to learn the new site. Losing visitors because of a redesign is a real issue: People will search for alternatives and some even build their own.

Awesome little C# tricks

C# is a very powerful, flexible programming language, and the .NET standard library offers tons of great functionality ready to use in your code. As a developer, I always look for new ways to write code quicker, make it more readable and faster, or just more fun. Here are some lesser-known tricks I've found and/or used myself.

Null checking

string result = variable1 ?? variable2 ?? variable3 ?? String.Empty;

You can use the ?? operator to check if a value is null and fall back to a default value, which is neat on its own. But you can also chain it to check several variables before resorting to a default value. In addition, C# 6 has a new ?. operator you can use to quickly check for child properties:

string result = variable1?.childproperty?.childproperty ?? String.Empty;

Parsing a formula

var result = Convert.ToDouble(new System.Data.DataTable().Compute("(3*3)+1", null)); // 10

By creatively using a method of the DataTable class, you can evaluate simple formulas in your application. This can be useful when you want to support editable formulas, for example to allow a reseller to customize their price based on an original price: originalPrice * 1.2 + 1.

Making aliases for type names

using MyDictionary = Dictionary<string, List<int>>;

Put the above code outside your class declaration, and you'll be able to use "MyDictionary" as an alias. This can be useful when working with generics. They tend to make code unreadable quickly.


Caching with WeakReference

Sometimes when retrieving specific information from the database, you want to cache it in case it is requested again, but you don't want to keep it in memory forever. For example: information about the current user doing a website request. By using the WeakReference class you can initialize your class with user information and allow the .NET garbage collector to unload it whenever it wishes (for example, when it needs memory). Note that the example below requires .NET 4.5.

using System;
using System.Collections.Generic;

public class UserInfo
{
    public string RealName;

    private static Dictionary<string, WeakReference<UserInfo>> Cached =
        new Dictionary<string, WeakReference<UserInfo>>();

    public static UserInfo FromCache(string identity)
    {
        UserInfo userInfo;

        // Return the cached instance if the garbage collector hasn't reclaimed it yet
        if (Cached.ContainsKey(identity) && Cached[identity].TryGetTarget(out userInfo))
            return userInfo;

        userInfo = new UserInfo()
        {
            RealName = "" // Retrieve from database
        };
        Cached[identity] = new WeakReference<UserInfo>(userInfo);

        return userInfo;
    }
}

Example usage
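Something along these lines; the "jeffrey" key is just an illustrative identity:

// Repeated calls reuse the cached object as long as the garbage collector
// has not reclaimed it; otherwise it is transparently reloaded.
var user = UserInfo.FromCache("jeffrey");
Console.WriteLine(user.RealName);

var again = UserInfo.FromCache("jeffrey"); // Most likely the same instance, served from the cache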


Implementing operators in your class

You can implement the standard math operators in your C# class. It may look like an odd thing to do, but it can actually turn out to be really useful. For example, this is done with the built-in DateTime struct:

TimeSpan difference = new DateTime(2015,01,01) - new DateTime(2014,01,01);
// difference.TotalDays is 365

Another use case would be to quickly throw a few groups of people together, like I did in this example:

using System.Linq;

class Program
{
    static void Main(string[] args)
    {
        var town1 = new GroupOfPeople("A");
        var town2 = new GroupOfPeople("B", "C");
        var bothTowns = town1 + town2;
        // bothTowns.Names contains "A", "B", "C"
    }
}

public class GroupOfPeople
{
    public string[] Names;

    public GroupOfPeople(params string[] names)
    {
        Names = names;
    }

    public static GroupOfPeople operator +(GroupOfPeople a, GroupOfPeople b)
    {
        return new GroupOfPeople()
        {
            Names = a.Names.Concat(b.Names).ToArray()
        };
    }
}


MethodImplAttribute

The MethodImplAttribute provides some useful options. You can put it above methods and property accessors.

By using [MethodImpl(MethodImplOptions.Synchronized)] you can make methods and property accessors thread-safe. For instance methods this effectively wraps the body in a lock(this) { }, and for static methods in a lock on the type. The use of this attribute can be compared with using the synchronized keyword in Java.

using System;
using System.Runtime.CompilerServices;

public class Bank
{
    public double Balance { get; private set; }

    [MethodImpl(MethodImplOptions.Synchronized)]
    public void AddTransaction(double amount)
    {
        if (amount + Balance < 0)
            throw new Exception("Not enough balance");
        Balance += amount;
    }
}

To really micro-optimize your method calls, you can use [MethodImpl(MethodImplOptions.AggressiveInlining)] to hint the compiler to move a method's implementation into the calling code (removing the overhead of the method call). This can be useful for small methods which need to be called a lot of times, for example when drawing frames in a game. Note that this won't save you that much time, but it is a fun trick nonetheless.

[MethodImpl(MethodImplOptions.AggressiveInlining)]
static double StartPosWhenCentering(double containerSize, double childSize)
{
    return (containerSize / 2) - (childSize / 2);
}


There are a lot of nice gems hidden in C#. Knowing a few of them will definitely make your days as a C# developer happier. Let me know if you also know a few tricks.

Speed tricks for web development

Speed is important on the web. It is one of the most important factors for a new website to be accepted and remembered by visitors. Especially when coming from a search engine, a site that takes a few seconds to load can result in the visitor leaving and checking out another result. Existing visitors get frustrated by slow sites too. This list will hopefully give you some ideas for speed improvements you can make on your (new) website.

Take a better look at your requests

A common practice for speed optimization is reducing the number of requests. But take a closer look and see if you can optimize the remaining requests as well.

Are those requests going to different domains? Your browser has to build up a connection with each of them. This means resolving the host name at a DNS server and doing a TCP handshake (and optionally an SSL handshake), followed by the actual request. Note that in old browsers there was a low connection limit per domain, so splitting content out over multiple hosts was actually beneficial because the browser would use parallel connections to retrieve it. Nowadays the limit has been increased. Always see if content is worth hosting yourself; it will also give you more control over the caching headers and other optimizations. Using a CDN can be useful, as they are usually located nearer to the visitor and are optimized for serving content quickly.

How is the static content served? Does it have the proper HTTP caching headers? The last thing you want is a browser requesting the same logo, CSS and JavaScript files for every request a visitor does. You can even think about allowing some pages to be cached at the HTTP level if they never change. For Miageru I use this trick for most pages, including the main page (with the big shiny search box), resulting in a very fast user experience.
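As an illustration, in classic ASP.NET marking a response as cacheable could look like the sketch below; the one-hour lifetime is just an example value and should match how often the content actually changes.

using System;
using System.Web;

public static class CachingHeaders
{
    // Mark a response as cacheable by browsers and proxies for an hour,
    // so repeat visitors don't re-request the same unchanged page or asset.
    public static void AllowPublicCaching(HttpResponse response)
    {
        response.Cache.SetCacheability(HttpCacheability.Public);
        response.Cache.SetMaxAge(TimeSpan.FromHours(1));
        response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
    }
}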

Serving static html pages is no shame

Some websites which don't have a lot of frequently updated dynamic content (like a blog) can easily be saved as simple HTML pages. You can see this as a form of caching: every time a new post is added, the pages are generated once and saved as raw .html files, instead of generating the same page every time a request is made. Web servers are already built and optimized for serving simple static files. It takes almost no resources at all to serve them, meaning you can suddenly host your site on cheaper or more power-efficient hardware.

Optimize this trick by throwing in some extra HTTP caching headers when you know your site (or a part of it, like an old article) rarely gets updated. Now your website can be cached by proxy servers and visitors' browsers as well.
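The "generate once, serve as files" idea boils down to something like this sketch; the PublishPost name and the renderedHtml parameter are placeholders for whatever template engine you use.

using System.IO;

public static class StaticSiteGenerator
{
    // Regenerate the static .html file once, whenever a post is added or edited,
    // instead of rendering the same page for every incoming request.
    public static void PublishPost(string slug, string renderedHtml, string wwwRoot)
    {
        File.WriteAllText(Path.Combine(wwwRoot, slug + ".html"), renderedHtml);
    }
}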

Make your site useful even while loading

Make your site as useful as possible, even when it is only partially loaded. Don't insert a bar at the top of the page (pushing everything down) once loading is finished. Do you need that bar? Overlay it or reserve the space in advance. Images can be given the right size using width/height attributes or inline CSS before they are loaded. Reducing the amount of 'jumping' will make any visitor a lot happier, especially the ones on a mobile device.

Don't retrieve your main content using Ajax. A browser will have to load the page and all resources, run the scripts, and then realize it has to do another request. Of course, loading content when interacting (when scrolling for example) can be beneficial: The browser only has to do a single request and does not have to run all scripts again. A small note when making navigation possible using Ajax: Use the window.location.hash property to allow back/forward navigation in your site.

Reduce the amount of resources a browser needs

Resources are all loaded into memory by the browser, including all images. Those have to be accessed when rendering (scrolling, switching between tabs) and unloaded when navigating away. A lot of resources usually means your site will feel sluggish on a slow device. Don't use very large photos if you're only going to display them small, and don't load too many photos at once.

Do you really need that jQuery script just to show or hide some text on the page? I wrote about it before: libraries aren't always useful. See if you can reduce them.

Server side page generation

Some pages can't be cached and always need to be generated on the fly. Still, those pages can be optimized by looking at the code. Reduce the amount of work and the number of round trips the code has to do. Any time your code wants to retrieve something from a database, it has to create (or re-use) a connection and send the query; the database server has to parse the query, generate an execution plan, execute it (including retrieving data from usually multiple places on a hard drive) and return the results before your code can continue. Of course there is some caching in the database engine and storage mechanism, but with multiple queries it still adds up quickly. Combine multiple queries into one, or even better: keep the results you need in memory (for example with Memcached). It can also be worthwhile to see if another type of database, like NoSQL, fits your data better.
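A minimal in-process variant of that idea, as a sketch using System.Runtime.Caching.MemoryCache; the key naming, the five-minute lifetime and the load delegate are all up to you.

using System;
using System.Runtime.Caching;

public static class QueryCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Answer repeated requests from memory instead of hitting the database every time.
    // 'load' is whatever performs the real (ideally combined) database query.
    public static T GetOrLoad<T>(string key, Func<T> load) where T : class
    {
        var cached = Cache.Get(key) as T;
        if (cached != null)
            return cached;

        var value = load();
        Cache.Set(key, value, DateTimeOffset.UtcNow.AddMinutes(5)); // example lifetime
        return value;
    }
}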

Optimize the amount of work your code has to do by moving work to an earlier stage, for example during the initialization of your website or in a cron job. For the search feature on Miageru, the site creates a key->page dictionary in memory during initialization and refreshes it when needed. That way it can quickly look up results (including the autocomplete ones) without having to ask the database server. The data in the charts in the Learn Center is generated once a day by a cron job.


Think about what a browser has to do step by step and reduce those steps. This will make your site feel quicker than those of your competitors, making it more likely that visitors choose your site over theirs (if your site has the right content, of course; speed is not everything).

Overview of Japanese learn sites

While learning Japanese, I've encountered a few different sites offering information and exercises for people studying the language. In this article I'll describe some of them, explain what they offer and give my opinion about them.


TextFugu

An information site made by the people behind the popular blog Tofugu. This site is mostly aimed at people starting off with Japanese. It gives some general advice for studying a language and slowly introduces new components. The site is easy to follow and has a lot of explanations and examples. However, it only covers a small part of the grammar.

The site relies on Anki flashcards for practice and doesn't offer any other way of practicing, other than using their other site: WaniKani. TextFugu is only free for the first chapter.


WaniKani

This site (which is surprisingly still in closed beta) is also created by Tofugu. It is aimed at learning particles, Kanji and vocab, and offers an interactive way to learn and review them (SRS method). The site uses funny/weird written mnemonics to quickly learn new items. It does feel like a one-trick pony: you will get good at recognizing Kanji, but you won't learn anything about grammar. Also, some explanations given are vague ("Usually this means.."). It does offer a good community and the amount of content is big.

The site itself is free for just the first levels, which is enough to give you a decent feel for the website.


Known for their audio podcasts. It has a lot of good content hidden in a somewhat messy site. The audio files are of good quality, but they are uploaded in collections which sometimes overlap. Each audio lesson also comes with a transcript and a small review exercise. Also included is a library with grammar explanations (in advanced English and with limited examples), and there are lists with introductory-level vocab. The site has multiple membership levels, including a free level with just a few things available. A fun little feature is the Word of the Day, which can send you an email with a vocab word every day.

A big warning for this site: it really likes sending mail. As soon as you subscribe, you will get spammed with discounts and offers for their non-free membership programs (even if you are already a member). Also: never buy a membership at full price, there is always a big discount.


Miageru

My own site, but also a site I personally use for my own study, so I have to include it in this list as well. The main site offers information about a wide selection of common Japanese grammar, vocab and Kanji. It also has a list of example sentences.

The Learn Center (free for 2 weeks) offers interactive exercises to learn and remember grammar, vocab, Kanji and sentences. After the introduction lessons and exercises, you get the freedom to select sentences you want to learn, and the site will make sure you learn all the grammar and vocab in those sentences. It also features regular reviews (SRS method). The site is able to run offline as well, including on your mobile.

Tae Kim's Guide to Learning Japanese

Probably the most complete site you will find online about Japanese grammar. It also has a few small exercises and it is free. The content itself is fast-paced, but it does include examples. There is also an app available.

Tanos (JLPT Resources)

A good free overview of the Kanji, vocab and grammar required for each JLPT level, including audio files. When studying for the Japanese-Language Proficiency Test, this site can be used to verify that you actually learned everything you need to know before taking the test.


This is a site that was recommended to me, and after trying it I had to add it to this list. On this free site you can quiz yourself on Kanji and vocab knowledge. You score points by answering questions correctly. Great for people who need some extra motivation. There is also an iOS app available.

And many more

This was just a small list of resources I'm familiar with, but there are of course many more sites aimed towards people learning Japanese. Just poke around on Google, look up videos on YouTube and avoid sites which claim you'll learn Japanese with no effort at all. If you also have an interesting site you want me to take a look at, feel free to contact me.