Is XML in decline?

I happen to be one of those older software developers who saw the rise of XML. I even remember the older SGML standard, although I never used SGML. Version 1.0 of XML became an official standard in 1998. Once it became a standard, many companies started working to create the Killer App that would work with XML without much hassle. And although at first many companies created their own XML parsers, not all of them fully conformed to the standard. Those parsers disappeared fast enough too.

Right now, version 1.1 of XML is the latest standard. Yes, in 16 years not much has happened to this standard. And the changes that have been applied are more about supporting EBCDIC platforms and the newer Unicode definitions. There are discussions about a version 2.0 but it’s not likely to become a standard soon. Strange as it might sound, XML seems to be in decline if you look at how it’s used.

The power of XML was, of course, in the way you defined these files and in the transformations you could perform on them. While we used DTD definition files at first to define the structure of an XML file, some smart people came up with the XSD schema format, which allowed more flexibility and is itself an XML file. Combined with some nice graphical tools, the XSD made it easier to define an XML file and to validate whether an XML file conforms to the proper structure. And I’ve made plenty of XSD files between 2000 and 2010, since my work required a lot of XML data exchanges.
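As a rough sketch of the kind of structural rules an XSD captures, here's a toy check in Python. The element names (`order`, `customer`, `quantity`) are invented for illustration, and the standard library can't perform real XSD validation, so this only mimics the idea of validating a document against a known structure:

```python
import xml.etree.ElementTree as ET

# A small XML document of the kind an XSD would describe.
doc = """
<order>
  <customer>Acme</customer>
  <quantity>10</quantity>
</order>
"""

def check_structure(xml_text):
    """Toy stand-in for schema validation: verify the required child
    elements exist and that <quantity> holds an integer."""
    root = ET.fromstring(xml_text)
    if root.tag != "order":
        return False
    if root.find("customer") is None:
        return False
    qty = root.find("quantity")
    return qty is not None and qty.text.strip().isdigit()
```

A real schema validator does far more (types, cardinality, ordering), but the principle is the same: the document either conforms to the agreed structure or it is rejected.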

Of course, transformations are also important, and here we use stylesheets. An XSLT file would be written in XML itself and define how to convert an XML file to some other output format. In general, this output would be another XML file, an HTML document to display in a web browser, a simple text file or even a comma-separated file. And in some special cases it could even create a complete rich-text document that you could open in Word. This meant that you could, for example, send an XML file to a server and the server would then process it. It would validate the file against a schema and could do additional validation by using a stylesheet. If the file passed these validation stylesheets, other stylesheets could then be used to extract data from the XML and send it to other servers for further processing, while it could also generate a document to return to the user. You could do a lot of processing with just XML files.
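The kind of work such a transformation stylesheet does can be sketched in Python. The sample data and the `to_csv` helper below are invented for illustration, and the standard library is used in place of a real XSLT processor, but the shape of the job (XML in, comma-separated text out) is the same:

```python
import csv
import io
import xml.etree.ElementTree as ET

xml_data = """
<people>
  <person><name>Alice</name><city>Utrecht</city></person>
  <person><name>Bob</name><city>Delft</city></person>
</people>
"""

def to_csv(xml_text):
    """Transform an XML document into CSV, the way an XSLT
    stylesheet with text output would."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["name", "city"])
    for person in root.findall("person"):
        writer.writerow([person.findtext("name"), person.findtext("city")])
    return out.getvalue()
```

The XSLT version of this would itself be an XML file full of templates; the point of the pipeline described above is that every step, validation included, could be expressed in XML.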

Of course, XML also became popular because more developers started to create web services. And they used the SOAP protocol for this, which is a slightly complex protocol that’s heavily dependent on XML standards. Since SOAP also had a built-in versioning mechanism, you could always check whether the client was still using the right SOAP definitions. You could even use several SOAP message formats on the same system, with only the version number as the difference. It wasn’t easy to set up, but it worked extremely well.
And more has been developed to support XML even further. XPath expressions allow you to point to specific elements within an XML document. With XQuery, you could execute queries on XML files and process the result. With namespaces you could even combine multiple XML definitions that use similar entities. And then we have things like XLink, XPointer and XForms, which have never been very popular.
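XPath itself is easy to try out with Python's built-in ElementTree, which supports a limited subset of the expression syntax. The library document below is made up for the example:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<library>
  <book year="1998"><title>XML 1.0</title></book>
  <book year="2004"><title>XML 1.1</title></book>
</library>
""")

# ".//book/title" points at every <title> under a <book>, anywhere in the tree.
titles = [t.text for t in doc.findall(".//book/title")]

# Predicates on attributes work too: select only books from 2004.
recent = doc.findall(".//book[@year='2004']")
```

A full XPath engine adds axes, functions and numeric predicates on top of this, but even the subset shows the appeal: one short expression instead of hand-written tree walking.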

Between 2000 and 2010, it seemed that XML would become a dominant development technique. No more writing code in other programming languages that needed to be compiled, simply because XML happened to offer a fast scripting environment. Many platforms started to include standard objects for processing XML files, and knowledge of XML became a much-needed skill for developers. So, what changed?

Well, many developers consider the XML format a bit bulky, especially because tags are written twice: once to open the element and once to close it. Thus, if an element is called ‘NumberOfElements’ then you have to write <NumberOfElements>10</NumberOfElements>, and that’s a lot of text to store the number 10. As a result, some developers would shorten those tag names so the resulting XML would be smaller. If you have 10,000 of these elements in your XML file, shortening the name to TOE would save 26 characters per element, thus 260,000 characters in total. This doesn’t seem much, but developers feel they gain a lot by these kinds of optimizations. With modern multi-core processors and systems with 8 or more GB of RAM, such optimizations might make the code half a second faster, which you barely notice with web services, but still… developers think it saves a lot. And yes, when resources are truly limited it makes a lot of sense, but the modern mentality is that companies will just add a second server if one is too slow. Or more, if need be. This is because additional hardware is less expensive than having developers optimize the code even further.
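The arithmetic behind that claim is easy to verify. A quick sketch (the `element` helper is just for illustration):

```python
def element(tag, value):
    """Render a value as an XML element with opening and closing tags."""
    return f"<{tag}>{value}</{tag}>"

long_form = element("NumberOfElements", 10)   # 39 characters to store "10"
short_form = element("TOE", 10)               # 13 characters

# The tag name appears twice, so each character shaved off the name
# saves two characters per element.
saved_per_element = len(long_form) - len(short_form)
```

With 10,000 such elements, `saved_per_element * 10_000` gives the 260,000 characters mentioned above: real savings on the wire, but rarely worth the lost readability.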

These kinds of optimizations make XML files less human-readable while the purpose was to make this kind of data more readable. It becomes slightly worse when the XML file uses namespaces, since those namespaces are also shortened to just a few letters.

Another problem is the need to parse XML to extract the data. More and more companies are creating web applications that run within web browsers and rely heavily on JavaScript. These apps need to be able to run on multiple devices too. Unfortunately, not all browsers support parsing XML files, and even those that do offer APIs that are a bit complex to use. With regular expressions it’s still possible to extract some data from the XML, but if you need to fill a grid with 50 rows and 20 columns, things become really complex. To solve this, developers started to send data to web applications as JavaScript instead of XML. This could then be executed, and thus the data would load itself into memory. Since JavaScript object literals are less bulky than the begin/end tags of XML elements, this new format proved very practical, and thus JSON was born.
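The difference in effort shows even in a toy example. This sketch compares parsing one record from both formats with Python's standard library; the record itself is invented:

```python
import json
import xml.etree.ElementTree as ET

xml_payload = "<row><name>Alice</name><age>30</age></row>"
json_payload = '{"name": "Alice", "age": 30}'

# XML: parse the tree, then walk it and convert types by hand,
# because every XML text node is just a string.
row = ET.fromstring(xml_payload)
from_xml = {"name": row.findtext("name"), "age": int(row.findtext("age"))}

# JSON: one call, and numbers come back as numbers.
from_json = json.loads(json_payload)
```

Both routes end up with the same dictionary, but the JSON one needed no tree walking and no manual type conversion, which is exactly why browser-side code preferred it.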

The birth of JSON also demanded a change in web services. Since web applications would call these services directly, it would be very clumsy if they had to set up SOAP messages and then parse the SOAP results. A newer, simpler style of web services arose, which uses the REST architectural style. Of course, there are many other web service protocols, but REST seems to have become the new standard, especially because it’s simpler and builds directly on the HTTP(S) protocol.

Of course, web applications have become more important these days because we’re getting more and more devices with all kinds of different operating systems, which all have web browsers. And, as I said, not all of those devices have a native XML parser built-in. They do support JavaScript though, and as a result it becomes quite easy to develop web applications for all devices which use data in JSON formats.

Of course, many devices also allow special platform-dependent apps that can be created with development tools for their specific platforms. For OS X and iOS-based devices you would use Objective-C, while you would use C++ or Java for Android devices. (Java is the preferred development platform for Android.) For Windows RT you would use .NET for Metro-style applications, with either VB or C# as the primary language. This makes it a bit difficult to develop software that runs on all three platforms, but several parties have created compilers that will produce platform-dependent executables from platform-independent code. Unfortunately, working with XML parsers still differs on all these platforms, and those third-party compilers need to wrap their parsers around the built-in parsers of the underlying platform. That makes them a bit slow.

Since the number of operating systems has risen as the market keeps getting more and more new devices, it becomes more difficult to maintain a single standard that’s supported by all those systems. And the XML standard is quite complex, so the different parsers might not all support the same things. In that regard, JSON is much simpler, since it is just JavaScript object literal notation. And that notation happens to be similar to the literal syntax of C++, C#, Java and Objective-C. The only difference with these languages is that JSON puts the field names between quotes too, which you can’t do inside these languages.
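That quoting rule is easy to demonstrate with Python's `json` module (the sample object is invented):

```python
import json

# Valid JSON: field names must be double-quoted strings.
ok = json.loads('{"name": "Alice"}')

# Unquoted keys are fine in a JavaScript object literal,
# but a strict JSON parser rejects them.
try:
    json.loads('{name: "Alice"}')
    rejected = False
except json.JSONDecodeError:
    rejected = True
```

That strictness is a feature: because JSON is a tiny, fixed grammar, every platform's parser behaves the same way, which is much harder to guarantee for full XML.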

So, XML is becoming less attractive because it requires too much work to use. JSON makes data serialization simpler and is less bulky. Especially now that developers are focusing more on web applications and apps for specific devices, the use of XML is in decline in favor of JSON and other solutions. But there’s one more reason why XML is in decline. And this is something within the .NET framework that’s called LINQ.

LINQ was introduced as a separate library with .NET version 3.5 and has become popular since then. Basically, LINQ allows you to take data in a structured object and use simple queries to extract data from it, or to execute transformations on that data. This is similar to XPath and XSLT, but now it’s part of your development language, allowing you more choice in the functions that you can apply to the data. This is especially important for date fields, since XML doesn’t work well with date formats. LINQ actually makes extracting data from object trees quite easy and can be used on an XML document once you’ve read that document into memory in a proper XDocument or XmlDocument object. Thus, the need for XSLT to transform data has disappeared, since you can do the same in C#, VB, F# or Oxygene.
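A rough Python analogue of such a LINQ-to-XML query, using a list comprehension in place of LINQ syntax. The orders document and the date filter are invented for illustration; the point is that filtering on a real date value inside the query is trivial here, where XSLT would struggle:

```python
import xml.etree.ElementTree as ET
from datetime import date

doc = ET.fromstring("""
<orders>
  <order date="2014-01-15"><total>100</total></order>
  <order date="2014-03-02"><total>250</total></order>
</orders>
""")

# Query the in-memory tree: filter by a parsed date, project the totals.
recent_totals = [
    int(o.findtext("total"))
    for o in doc.findall("order")
    if date.fromisoformat(o.get("date")) >= date(2014, 2, 1)
]
```

In C# the equivalent would be a LINQ query over an `XDocument`; either way, the transformation lives in the general-purpose language instead of in a separate stylesheet.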

The result is that .NET developers don’t have to learn much about XML anymore. Their .NET knowledge combined with LINQ is more than enough. Since .NET also supports serialization to and from XML formats, it’s quite easy to read and write XML files in .NET. You can import an existing XSD file into your .NET application and have it converted to code, but since most XML data starts out as objects that are later serialized to XML, you will often see that developers just define the objects, add attributes telling whether the object and its fields map to elements or attributes, and let the serialization library use these object definitions to convert to and from XML. Thus, knowledge of XML schemas is no longer a requirement.
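This object-first approach can be sketched in Python with the standard library. The `Person` class and `to_xml` helper are invented for the example; real .NET code would use attribute-decorated classes and `XmlSerializer` instead, but the idea of the class definition driving the XML shape is the same:

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int

def to_xml(person):
    """Serialize the object: 'age' becomes an attribute,
    'name' a child element, mirroring the attribute/element
    choices a .NET developer would annotate on the class."""
    elem = ET.Element("Person", attrib={"age": str(person.age)})
    name = ET.SubElement(elem, "Name")
    name.text = person.name
    return ET.tostring(elem, encoding="unicode")

xml_text = to_xml(Person("Alice", 30))
```

The developer never writes a schema; the class is the schema, and the XML is just an output format.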

Because .NET development made deep XML knowledge almost obsolete, the popularity of XML is in decline. It’s still used quite often, but the knowledge you need to do practical things with XML-specific tools is disappearing. And similar things are happening on other platforms: LINQ-like query libraries have appeared for Java and PHP too. As a result, those environments can work on structured objects instead of raw XML data. Thus, XML is only needed when the data has to be sent to some other process, and even then, other formats might be chosen.

In fact, many developers are less concerned about the data format that’s used for inter-process communication. The system handles this for them, and they just use a specific serialization library that does the bulk of the work. XML isn’t really declining, but fewer developers need knowledge of the XML format, since development tools offer nice wrappers that allow these developers to use XML without even realizing they’re using it. It’s not XML that’s in decline. It’s the knowledge about XML that is in decline…

Motivating developers…

One of the biggest problems for software developers is finding the proper motivation to sit behind a screen for 8 hours per day, designing and developing new code and new projects. It’s generally boring work that requires a lot of mental effort. And the reward tends to be just more of the same work the next day, and the day after. Creating new code or fixing existing code is like working on an assembly line in a factory, placing a lid on a pot which someone else will close, over and over and over.

But developing code is a mental job, unlike adding lids to pots. During physical jobs, your mind can wander around to what you’re going to do in the weekend, what’s on television or whatever else you have on your mind. A mental job makes that very difficult since you can’t think about your last holiday while also thinking about how to solve this bug. And thus developers have a much more complex job than those at the assembly line. A job that causes a lot of mental fatigue. (And sitting so long behind a screen is also a physical challenge.)

Three things will generally motivate people. Three basic things, actually, that humans have in common with most animals. We all like a good night of sleep, we all like to eat good food and we’re all more or less interested in sex. Three things that will apply for almost anyone. Three things that an employer might help with.

First of all, the sleep. Developers can be very busy both at home and at work with their jobs. Many of them have a personal interest in their own job and can spend many hours at home learning, playing or even doing some personal work at their own computers. Thus, a developer might start at 8:30 and work until 17:00. The trip home, dinner and meet and greet with the family will take some time but around 19:30 the developer will be back online on Facebook and other social media, play some online games or study new things. This might go on until well past midnight before they go to bed. Some 6 hours of sleep afterwards, they get up again, have breakfast, read the morning paper and go back to work again.

But a job that is mentally challenging requires more than 6 hours of sleep per day. So you might want to tell your employees to take good care of themselves if you notice they’re up past midnight. You need them well rested, or else they’re less productive. Even though those developers might do a great job, they could improve even more if they got their eight hours of sleep every day. And as an employer you can help by allowing employees to visit social sites during work hours, since it helps them relax. It lowers the need to check those sites while they’re at home. The distraction of e.g. Facebook might even improve their mental skills because it relaxes the mind.

The second motivation is food. Employers should consider providing free lunches to their employees, preferably shared all together in a meeting room or even a dining room. Have someone do groceries at the local supermarket to get bread, spread, cheese, butter, milk, sodas and other drinks and snacks. While it might seem a waste of the money spent on those groceries, the shared meal will increase morale, allow employees to have all kinds of discussions with one another and improve team building. It also makes sure everyone has lunch at the same moment, so they will all be back at work at the same time again.

Developers tend to have lunch between 11:30 and 14:00, and if they have to get their own lunch, it’s not unlikely for them to just go to the local supermarket themselves or to bring lunch from home. When they go shopping for lunch, they’re unavailable during that time. Of course, lunch time is their own time, but if you need them, you don’t want to wait until they’re back from the supermarket. Another problem is that those employees will start storing food at work, in their desks or wherever else they can store it. This could attract mice, and I don’t mean computer mice but the live, walking, eating kind.

If an employer provides the lunch and other snacks, this also means there’s a central storage area for food products. This storage is easier to keep clean than the desks of developers. Besides, those developers now know their food requirements are covered during work hours, so they feel more comfortable.

The third motivation is sex. And here, employers have to be extra careful, because this is a very sensitive subject. For example, a developer might spend some time on dating websites or even porn sites. As with social websites, a small distraction often helps during mental work, but where a social website might take two minutes to read a post and respond, a dating website takes far more time to process the profiles of possible dating partners. A porn site will also be distracting for too long and might put the developer in the wrong mood.

The situation at home might also be problematic. An employee might be dealing with a divorce, which will impact their sex life. It also puts them back into the world of dating and thus interferes with their nightlife a bit more. During such a period they will be less productive, simply because they have too much of their personal life on their minds. And not much can be done to help them, because they need to find a way to stabilize their personal life again. Do consider sending the employee to a proper counselor for help, though.

Single developers might be a good option, though. They are already used to a life of being single and thus will be less distracted by their dates. Still, if they’re young, their single status might change, and when that happens, it can have an impact on their jobs. But that impact might even be an improvement, because their partner might actually force them to go to bed sooner, thus fulfilling the sleep motivation.

Married developers who also have children might be the best option since their family lives will require them to live a very regular life. The care for their children will force this regularity. But the well-being of those children might cause the occasional distractions too. For example, when a child gets sick, the developer needs someone to care for the child at home. And they might want to work at home a few days a week to take care of their children.

As an employer, you can’t deal with the sex lives of your employees at work. Those things are private. However, it can be helpful for employees if they can spend more time at home, in a private area, if they have certain needs in this regard. Allowing them to work at home would give them some more options. Since they don’t need to travel to work, they have more time available. If they decide to visit a dating site for half an hour, they could just work half an hour longer and no one would even know about it. If their child is sick, they can take care of them and still work too.

In conclusion, make sure your employees sleep well, give them free lunches and other snacks at the workplace and allow them to work at home for their personal needs. This all will help to make them more productive and allow them to improve themselves.

To Agile/Scrum or not?

The Internet is full of buzzwords that are used to make things sound more colorful than they are. Today’s buzzword seems to be “Cloud solutions”, and it sounded so new a few years ago that many people applied the term to whatever they were doing, simply to be part of the new revolution. Not realizing that the Cloud is nothing more than a subset of websites and web services. And web services are a subset of the thin client/server technologies of over a decade ago. (Cross-breeding client/server with the Web will do that.) It’s just how things evolve, and once in a while a new buzzword needs to be created, so marketers are already working on the next buzzword that should make clear the Cloud is obsolete. Simply because new products need to be sold.

Still, the software development world hasn’t been quiet either. In the past, a project would be completed through a series of steps. It would start with an idea that would be turned into a concept, and this concept would include all requirements for the project. Designers would then come up with some basic principles and additional planning. When they were done, the implementation would start, which included writing all the code and integrating the project into existing products. It would then be tested, and once the tests were satisfactory, the whole project could be deployed and maintenance would begin.

If the project had problems in one of these steps, they would often have to go back one step. (Or more, on rare occasions.) This principle is called the “Waterfall model”, and its drawback is that every step can take weeks to finish. It generally means you can only release updates twice per year. Not very popular these days.

So, new ideas were needed to make it possible to create updates more often. It started with the Agile Manifesto in 2001 and it has become a very popular method these days. Most groups of developers will have heard about it and have started implementing its principles. Well, more or less…

Agile has just four basic rules to keep in mind:

Individuals and interactions over processes and tools.
Working software over comprehensive documentation.
Customer collaboration over contract negotiation.
Responding to change over following a plan.

That’s basically the whole idea. And it sounds simple, since it makes clear what is important in the process. Agile focuses a lot on teamwork and tries to keep every team member involved. Make sure every member is comfortable with the process and, basically, talk a lot with one another. People tend to forget it, but communication is a key element between people.

Of course, whatever you publish should work, and work well enough so users don’t complain about crashing applications or lost data. You might be missing features that customers would like, but that should not be the main focus of the whole process. Keep it working and keep the customer happy.

Of course, since you’re dealing with customers, you will need to know what they actually want. It’s fine if the CEO decided that the project needs methods X and Y to be implemented but if all customers tell you they want methods A or B implemented, then either the CEO has to change his mind or the company should start looking for a new CEO.

And keep in mind that things change, and sometimes change really fast. It’s hard to predict what next year will bring us, even online. Development systems get new updates, new plug-ins and new possibilities, and you need to keep up to get the most out of the tools available.

So, where do things go wrong?

Well, companies tend to violate these principles quite easily. And I’ve seen enough projects fail because of this, causing major damage or even bankrupt companies simply because the company failed at Agile. Failure can be devastating with Agile, since you’re developing at high speeds. And we all know, the faster you go, the harder you can fall…

Most problems with Agile start with management. Especially the older managers tend to live in the past or don’t understand the whole process. Many Scrum sprints are disrupted because management needs one or more developers from that sprint for some other task. I’ve seen sprints disrupted because a main programmer was also responsible for maintaining a couple of web servers, and during the sprint one of those servers broke down. Since fixing it had priority, his tasks for that sprint could not be finished in time, and unfortunately, other tasks depended on this task being ready.

Of course, the solution would be that another team member took over this task, but it did not fit the process that the company had set up. This task was for a major component that was under control by just one developer. Thus, he could not be replaced because it disturbed the process. (Because another developer might have slightly different ideas about doing some implementations.)

Fortunately, this only meant a delay of a few weeks and we had plenty of time before we needed to publish the new product. We’d just have to hurry a bit more…

Agile also tends to fail when teams don’t work well together. Another company had several teams all working on the same project. And unfortunately, the project wasn’t nicely divided in pieces so each team had its own part. No, all teams worked on all the code, all the pieces. And this, of course, spells trouble.

When you have multiple teams working on the same code, you will often need an extra step to merge the code. This is not a problem if one team worked on part A and the other on part B. It does become a problem when both teams worked on part C and wrote code that overlaps. Things go fine when you test just the code of one team, but after the merge you need to test it all over again, so the whole process gets delayed by one more sprint just to test the merged code. And it still leaves plenty of chances for bugs to slip through testing. Especially with manual testing, when the tester has already tested process X a dozen times for both teams and now has to test it again for the merged code. They might decide to just skip it, since they’ve seen it work dozens of times before, so what could go wrong?

As it turns out, each team would do its own merge of the code with the main branch. Then they would build the main branch and tell the testers. Thus, while the testers were busy testing the main branch that team 1 provided, team 2 was also merging and would notify them again a few days later. The result is basically that all tests had to be redone, so days of testing were wasted. Team 3 would follow after this, again wasting days of testing. Team 1 would then decide to include a small bugfix and again, testing would have to start from the beginning, all over again.

With automated testing, this is not a problem. You would have thousands of tests that should pass, and after every update to the main branch, those tests would run from beginning to end. Computers don’t complain. However, some tests are done manually, and the people who execute those tests will be really annoyed if they have to do the same test over and over with every new build. It would be better if they just tried to automate their manual tests, but that doesn’t always happen. So, occasionally they decide that they’ve tested part X often enough and it never failed, so why should it fail the next time?

Well, because team 1 and team 2 wrote code that conflicts with one another and that code is in part X. The testers skip it, thus the customer will notice the bug. Painful!…

There are, of course, more problems. I’ve seen a small company that had a nice, exclusive contract with a very big company. Let’s call them company Small and company Big. Company Small had created a product that company Big really liked, so they asked for an exclusive version of it, with features that company Big would choose. And this would be a contract worth tens of millions for company Small and its ten employees.

And things would have gone fine if company Small had just focused on delivering what company Big wanted, and on delivering in time, instead of deciding to continue working on its own products. But no, other things were more important, and the customer would just get what company Small made, with some minor adjustments. And the CEO was quite happy with this progress. That is, until the customer noticed that their wishes weren’t being heard. All company Big was supposed to do was sign the contract and pay the bill. And once things were done, they would just have to accept what was given to them. So company Big found another company willing to do the same project and just dumped company Small. End of contract and thus end of income, since company Small worked exclusively for the bigger company. And within five months, company Small went tits-up, bankrupt. Why? Because they did not listen to the customer; they did not keep them happy.

And another problem is the fact that companies respond very slowly to changes. I’ve worked for companies that used development tools that were 5 years old, simply because they did not want to upgrade. I still see the occasional job posting where companies ask for developers skilled with Visual Studio 2008, while there are three newer versions available already. (Versions 2010, 2012 and 2013.) In 2003 I was still working on a 16-bit project that was meant to be used on Windows 3.1 and up, simply because one single user still used an old Windows 3.11 system. At least, we thought they did, because no one ever asked them if they had upgraded. And that customer never told us they had indeed upgraded, and didn’t think of asking for a 32-bit version…

I’ve seen management hang on to a certain solution even though there’s plenty of evidence that newer options are available. I’ve developed software on 32-bit systems with 2 GB of memory when 64-bit systems were available and had up to 8 GB of memory, plus more speed. I had to use a single-monitor system on a PC that had options for multiple monitors plus we had extra monitors available, but management considered it a waste. The world is changing and many systems now easily support two or more monitors but some companies don’t want to follow.

So, what is Agile anyways? It’s a method to quickly respond to changes and desires of customers with a well-informed team that feels committed to the task and to deliver something the customer wants. (And customers want something they can use and which works…)

Would there be a reason not to use Agile? Actually, yes. It’s not a silver bullet or golden axe that you can use to solve anything. It’s a mindset that everyone in the team should follow. One single member in the team can disrupt the whole process. One manager who is still used to “the old ways” can devastate whole sprints. When Agile fails, it can fail quite hard. And if you lack the reserves, failure at Agile can break your company.

Agile also works better for larger projects, with reasonably big teams. A small project with one team of three members is actually too small to fully implement the Agile way of working, although it can use some parts of it. For such a small team, planning tends to be a bit more difficult, especially if team members aren’t always available for the daily Scrum meetings. When you’re that small, it’s just better to meet when everyone is available and discuss the next steps. No strict deadlines, since the planning is too complex. What matters is that goals are set and an estimate is made of when things will be finished. Whenever the team meets, they can then decide whether the estimate is still correct or needs to be adjusted.

Another problem can be the specialists who are part of the team. Say, for example, that you have a PHP project that needs to communicate with a mainframe and some code written in COBOL. The team might have hundreds of PHP developers, but chances are that none of them know anything about COBOL. So you need a COBOL specialist. And basically, he alone carries the tasks of maintaining the mainframe side of the project. You can make him part of the Scrum meetings, but since he has to do his part all by himself, he doesn’t have much use for the other team members. So again, just decide on a specific goal and estimate when it should be finished. Get regular updates to allow adjustments, and let the COBOL developer do his work.

The specialist can become even more troublesome if you have to interact with a project that another company is creating. If you do things correctly, you and the other company would discuss a generic interface for the interaction between both projects. You would then both build a stub for the other company to use for testing. This stub just has to offer some dummy information, but it should be usable.

When both companies have the stubs they need, they can each work on their own part. They will have to keep each other informed if parts of the interface need to change or if rules change about the data that can be provided. Preferably, this is done by providing a new stub. Both teams have just one goal: providing all the required methods that are part of the stubs. And when parts are fully implemented, they can provide the other company with new stubs that already contain some working parts.

Still, when two companies have to work together this way, they have to think small. Don’t create a stub with thousands of methods for all the things you want to add during the next 5 years. Start small. Just add things to the stub that you want to finish for the next sprint. Repeat adding things per sprint and communicate with the other company about what they’re going to add next. You don’t have to work on the same methods of the stubs anyway. One company might start working on the GUI part that allows users to enter name, address and phone number while the other works on storing employment data and import/export management. The stubs should just provide dummy methods for those parts that aren’t implemented yet. Each company should develop the parts that they consider the most important, although both should be aware that everything is finished only when all stub methods are implemented.

Agile is just a mindset. If used properly, it can be very powerful. However, do keep in mind that not all of Agile might be practical for your own situation. Agile requires a lot of time for meetings with developers, with customers and with management. Everyone needs to be involved and everyone needs to be available for those meetings. Scrum becomes more difficult if not all team members are available on all five workdays of the week. And worst of all, team members will have to prepare for the meetings. Even for the daily meetings, since they have to keep track of their own progress.

Don’t be afraid to implement just part of the whole Agile/Scrum principle. It is made to hybridise with other methods. Use the methods; don’t let a method force itself upon you.

The FBI in Lithuania wants to pay me 15 million dollars…




I do love some of the spam messages I receive. Especially when the spammers try to pretend they’re the FBI or some other important organisation that wants to pay me a few million. I can’t really imagine that some people are stupid enough to fall for this. Then again, if they send 5 billion of these messages, the chance is quite big that they’ll find an idiot or two willing to fall for it.

Those people must be even more brain-dead than the spammers… This is not a very expensive scam. They just ask for 420 USD instead of thousands of dollars. A payment for the ownership papers or whatever. And they tell me to stop being in contact with the other scammers, which is very good advice.

So? Well, it starts with Mrs. Maria Barnett from Canada. The address seems real, although it has been misused by plenty of other spammers. The address is actually used by an organisation whose domain name is registered by Joseph Sanusi. Too bad that name sounds a bit suspicious, since there’s someone in Nigeria with the same name. (The governor of the Central Bank of Nigeria.) He is 75 and I don’t think he’s the spammer, so someone else either has the same name or they’re faking things even more. The domain name is registered but doesn’t seem to be linked to any site or server, because it’s pending deletion.

Then they refer to Mr. Fred Walters of the FBI. Fred helped Maria to get her money from some Nigerian bank, and she even got a lot more. He even showed her a list of other beneficiaries, and my name was on the list, so I am eligible to get lots of money too. All I have to do is contact Fred on the email address of Steve Reed in Lithuania, who seems to work at a Lithuanian website. I don’t really understand the language but Google Translate does. It seems to be an online book store. A strange place for the FBI. I would expect the CIA in that place instead.

Maria herself seems to work for Shaw, a Canadian internet shop. They sell televisions, phones and other stuff. So we have two shops in two different countries that are somehow related through some victim of a Nigerian 419 scam and an FBI agent.

Now, the email headers, visible at the bottom, show some more interesting connections. For example, I notice another domain name, one more strand in the spiderweb of the scammers. That domain is pending deletion too. The IP address seems to be down as well, so it’s likely the scammers have already been taken down.

But this spam message just shows how dumb the spammers make their requests, and yet people keep falling for it. If the story was more logical and the email addresses and domain names had actually been more real, then I could understand why people fall for this. But this?

Delivered-To: ********@********.***
Received: by with SMTP id w9csp17960igz;
        Sat, 1 Feb 2014 05:42:38 -0800 (PST)
X-Received: by with SMTP id p11mr1777051igx.19.1391262158192;
        Sat, 01 Feb 2014 05:42:38 -0800 (PST)
Return-Path: <>
Received: from ([])
        by with ESMTPS id x1si3519252igl.27.2014.
        for <********@********.***>
        (version=TLSv1 cipher=RC4-SHA bits=128/128);
        Sat, 01 Feb 2014 05:42:38 -0800 (PST)
Received-SPF: softfail ( domain of transitioning does not designate as permitted sender) client-ip=;
       spf=softfail ( domain of transitioning does not designate as permitted sender)
Received: from User (unknown [])
    by (Postfix) with ESMTP id 02525A7FA30B;
    Sat,  1 Feb 2014 06:57:03 -0500 (EST)
Reply-To: <>
From: "Mrs. Maria Barnett"<>
Subject: Make Sure You Read Now.  
Date: Sat, 1 Feb 2014 06:57:10 -0500
MIME-Version: 1.0
Content-Type: text/html;
Content-Transfer-Encoding: 7bit
X-Priority: 3
X-MSMail-Priority: Normal
X-Mailer: Microsoft Outlook Express 6.00.2600.0000
X-MimeOLE: Produced By Microsoft MimeOLE V6.00.2600.0000
Message-Id: <>
To: undisclosed-recipients:;

One more spammer caught…

Well, it seems that a post about spam attracts other spammers. Fortunately, you can also report spammers who try to spam through comments at SpamKlacht. And if the spammer, or the company mentioned by the spammer, is located in the Netherlands, they can take action against them.

So, let’s display part of the report at the end of this post that I’ve received from SpamKlacht, which happens to be written in Dutch. (Sorry, but maybe Google Translate can help?)

In short, a French website has posted a Dutch message on a blog that’s mostly written in English. It’s likely that some servers have been hacked and misused to send this kind of spam. These spammers know that forum and blog spam is harder to trace and stop than regular spam by email. They also know that many blogs and forums don’t have very good systems against this kind of spam, although WordPress does an incredible job in stopping them.

What’s more interesting is that this message doesn’t contain an email address, phone number or even a URL to their own site. Most likely, that link would be their own site or that of one of their members. It’s not really helping them much, unless people like me decide to look for them by using Google.

What actually happens is that the spammers are smart. They just pick up random texts from the Internet, in this case the About page from Euromovers, shorten some of the paragraphs and use the text as their comment, hoping it somehow makes enough sense for the forum or blog administrators to let it pass. They know that if an administrator approves one spam message, it’s likely that the spammer’s account becomes whitelisted and is thus allowed to post more comments. When that happens, the spammer will flood the blog or forum with spam.

With WordPress, it’s actually a practical way to bypass the spam filters. Fortunately, even though my site operates under a Dutch domain name, its main language is English. As a result, I tend to consider comments in Dutch a bit suspicious. But I also learned to just trust its spam filter, which hasn’t failed me yet.

The report from SpamKlacht:

U heeft een spam-melding geplaatst op, een website van de Autoriteit Consument & Markt. Dit document geeft een samenvatting van uw melding.

Spamklacht gemeld op  : 20-01-2014 09:43
Uw gegevens
Naam  : W.A. ten Brink
Adres  : xxxxxxxxxx
Postcode / plaats  : xxxx xx Amsterdam
Telefoonnummer  : xxxxxxxxxx
Gegevens van het mogelijke spambericht
Bericht ontvangen per  : Social Media, namelijk
Ontvangen op datum / tijd  : 19-01-2014 13:53
Ontvangen op adres  : Spamfilter heeft het tegengehouden.
Ontvangen van adres  : Verhuisbedrijf Euromovers uit Vlaardingen
Genoemd adres  :
Onderwerp  : Het betreft een bericht dat in mijn spamfilter van WordPress terecht is gekomen. Het bestaat uit drie delen, te weten de auteur, het bericht en een URL naar het bericht waar de spammer het probeerde te plaatsen.

[Author start]
[Author eind]

[Bericht start]
…… Verhuisbedrijf Euromovers uit VlaardingenVerhuisbedrijf
Euromovers uit Vlaardingen maakt deel uit van
het internationale netwerk van Euromovers International.
Dit netwerk bestaat uit hoog gekwalificeerde en betrouwbare
verhuisondernemingen in geheel Europa, de VS, Rusland, China, Australië
en Nieuw Zeeland. In Nederland is elk…….Bent u opzoek naar een professioneel
verhuisbedrijf dat werkt met ervaren verhuizers, professionele materialen, zelf vervoer
op maat regelt en werkt met een goede motivatie aan elke klus?
Kies dan voor de Verhuisbeweging, hét ideale verhuisbedrijf van Rotterdam en
omstreken. Wij zijn een erkent verhuisbedrijf dat zich door de jaren heen
heeft bewezen als betrouwbare en professionele verhuizer, daarom hebben wij ook een schadeverzekering gekregen, dus mocht er eventueel schade oplopen tijdens het verhuizen, geen punt!
Onze verzekering dekt de schade en betaald het aan u uit!
[Bericht eind]

One more spammer: Adobe!

I like to use email aliases for every online subscription and registration I have to fill out. I like this because it allows me to recognise if companies are going to spam me or not. I also make sure that any checkbox for extra mails that is checked will be unchecked. Unfortunately, not all companies care about that.

One of them is Adobe, well known for its PDF reader, but I also happen to use Adobe Lightroom, which requires an online registration. I had to fill it in, or I would not be able to use the software properly. Okay, so I did. And I used an alias.

Today, I received an unreadable email, because the images inside are blocked by my mail reader. Adobe seems to have given or sold my address to a party that likes to spam many people with all kinds of garbage. I think they’re trying to sell me a DVD box in this message, but I’m not sure and don’t want to know. Viewing those images would mean that my mail reader has to contact their servers with a special code, and that code would validate my address.
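That special code is usually a tracking pixel: a tiny image whose URL contains a token unique to each recipient, so fetching the image tells the sender the address is live. Here’s a minimal Python sketch of the idea; the domain `mailer.example` and the hashing scheme are purely hypothetical, not taken from this particular spam message:

```python
import hashlib

# Hypothetical sketch of a per-recipient tracking pixel; the domain
# "mailer.example" and the token scheme are made up for illustration.
def tracking_pixel(address: str, secret: str = "mailer-secret") -> str:
    # Derive a token that only the mailer can map back to the address.
    token = hashlib.sha256((secret + address).encode()).hexdigest()[:16]
    # A 1x1 image: when the mail reader loads it, the request to the
    # sender's server confirms this exact address opened the mail.
    return f'<img src="https://mailer.example/open/{token}.gif" width="1" height="1">'

print(tracking_pixel("victim@example.org"))
```

This is why blocking remote images in a mail reader matters: the spam only becomes “readable” at the cost of confirming your address.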

I have reported it to SpamKlacht and I hope they will take action against this spammer and against Adobe. Adobe is just as guilty for not keeping my address safe. They violated my privacy by sharing that address with others.

I will show the headers of this email, though, and I hope most spam filters will pick this up and add this spammer to the blacklist. They should blacklist Adobe too, in my opinion, because this pisses me off! I expect some small internet companies to leak my address, but Adobe is supposed to be a serious, big international company. They just don’t care about their customers, that much is clear…

Delivered-To: xxxxxxxx@xxxxxxxx
Received: by with SMTP id bh4csp113728igc;
        Mon, 13 Jan 2014 00:38:24 -0800 (PST)
X-Received: by with SMTP id gc2mr1505781wjb.75.1389602303789;
        Mon, 13 Jan 2014 00:38:23 -0800 (PST)
Return-Path: <>
Received: from ( [])
        by with ESMTP id md15si7043232wic.62.2014.
        for <xxxxxxxx@xxxxxxxx>;
        Mon, 13 Jan 2014 00:38:23 -0800 (PST)
Received-SPF: pass ( domain of designates as permitted sender) client-ip=;
       spf=pass ( domain of designates as permitted sender);
       dmarc=pass (p=REJECT dis=NONE)
Received: from localhost (localhost [])
    by (Postfix) with ESMTP id 16895163B348
    for <xxxxxxxx@xxxxxxxx>; Mon, 13 Jan 2014 09:38:23 +0100 (CET)
DKIM-Signature: v=1; a=rsa-sha1; c=relaxed/relaxed;;
    s=default; t=1389602303; bh=Z5MpxKWITtojtkQ1ghnUMKSgLY4=;
MIME-Version: 1.0
Content-Type: multipart/alternative;
From: "Welkomstgeschenken Kies een Tablet" <>
Subject: U ontvangt de complete Penoza DVD box
List-Unsubscribe: ,<>
X-Slip-uID: 2011425
X-Slip-active: N
X-BeverlyMail-Recipient: xxxxxxxx@xxxxxxxx
To: xxxxxxxx@xxxxxxxx
Date: Mon, 13 Jan 2014 08:38:23 +0000
X-BeverlyMail-MTA: 74
Message-ID: <>


And thus ends the year 2013. A year that held a lot of changes for me. My former employer had financial problems, resulting in me and a few colleagues becoming former employees. I had an accident in which I injured my back, which is still troubling me. And my previous computer started having trouble and its hard disk crashed.

But there was also some good news: I bought an iPad and I replaced my desktop with a new, expensive Alienware laptop.

I found a new job but unfortunately my back caused too much trouble so I had to quit again. For now, I just have to wait until my back is healed again.

I started fitness, to train the muscles in my back and I managed to lose some weight. I’ve changed my diet and am drinking more water and much less cola.

I’m walking more with my dogs, have started to read some books and am studying some interesting topics. I’m spending more time on electronics too, just to expand my knowledge and to better understand what is really happening when I’m writing code. Because code is still abstract, while the electrons moving through hardware are the real, physical world.

Amazing how electrons translate the keys I press on my keyboard into letters on the screen. It seems so simple but I’m starting to become awed by the whole complexity behind it all. I always knew it involved quite a few parts but I start to realize how many parts are actually involved in this all.

I’ve created more artwork in Poser and Vue and I have to say that my skills are improving. Still not very good, but I’m happy enough about it all. Still, when looking at the images I’ve created this year, I have to admit it’s less than in earlier years. Better quality, higher resolutions, but fewer images. And the amount of clothes in my art also went down.

I’ve done a few new things, quit a few other things and when I look back, I realize that this has been a very busy year for me.

And now I wonder what next year will bring. Next year, which happens to be about 15 minutes in the future. There’s one thing I do know, listening to all the fireworks outside… It will start with a bang…

To all who read this:

Happy New Year!

One week of spam…

Yesterday, I posted about comment spam in blogs. Today, I’m going to mention a few topics of spam messages I’ve received in just one week. To begin, I’ve received an email from the “Microsoft Partner Awareness Team”, which doesn’t seem to have a Microsoft mail account but some address in Nicaragua. The topic is “Confirm Receipt” and in it they tell me that they celebrate some 30th anniversary and, as a result, this team is giving away £1,864,000.00 GBP to six lucky recipients. I’m one of them and need to reply with name, address, telephone number, email address and nationality. A nice example of phishing.

Next, a message about Canadian Pharmacy Online, where I don’t need prescriptions. Well, I don’t need these drugs either.

And a message from “WhatsApp Messaging Service” notifying me about a new voicemail, even though I don’t have a WhatsApp account for this specific email address. Since the sender is from Russia, I’m not interested in listening. Even though they’ve sent me this message twice…

The next one is a very good one, since it’s from the Google+ Team and uses a Google Mail address. Seems legit, doesn’t it? Too bad Google Mail happens to be the same as GMail, so the spammer is using this free service to pretend to be Google. The attached PDF promises £950,000 to me as an award and all I have to do is fill in a form with name, address, telephone number, nationality, birth date, gender, occupation and email address. Definitely phishing!

Of course, most phishing emails will promise huge rewards to people, like the one I’ve received from Italy. Some investors have 375 million euros which they want to give away. These huge amounts just make it very clear that it’s fake.

Then some more pharmacy messages and other offers for all kinds of medicines and certain ‘blue pills’. Of course, this kind of spam is also very popular, apparently because one in a million people still decide to buy their drugs this way…

But there are more ways than offering money or selling drugs. I also received a spam message with a pretty woman in bikini. Her name is Valeriya and she lives in Russia and is rather shy at first. And she wants to be pen pals with me. Oh, my… Dating spam! Another trick to get people to offer personal details or even to trick them into sending money to this pretty girl. Or maybe just a fat guy who pretends to be a pretty girl, since that’s more common. Still, even if this girl was real, chances are that she’s just out to steal your wallet and everything else you have. By the way, Irina also wants to chat with me. She enjoys hiking and pottery.

Then an email in the German language offering me a method to win at roulette in some online casinos. Ah, the old gambling site spam. Fits with the other spam message which is written in Dutch and offers me a chance to win the jackpot. They even promise me 100 euro as a bonus when I subscribe. Or the one where they’ll give me 20 free lottery tickets while they claim I’ve officially subscribed to their mailing lists in the past. (Which I never did, since the specific account that received the spam isn’t used to subscribe to anything.)

Then some message which advises me which stocks I should buy on the stock market, since they’re about to become valuable. Sure, for the person who is selling them right now! If plenty of people start bidding, the price will go up from nearly worthless to a few pennies per stock. If they then manage to sell a million stocks, it’s easy money with a huge profit, in a way that’s mostly legal.

And sometimes you receive an email that looks just a bit gibberish, yet makes you curious. People tend to reply to those kinds of messages, asking the sender what’s going on here and what they meant by this message. And thus they confirm their email address is correct. And since many people add a signature to their emails, the sender will get to know a bit more about the recipient. If the recipient happens to work for some company and the company adds signatures, then the spammer might have enough information to pretend he’s that employee!

The emails from “USA TODAY News” are also interesting. Sent from some unrelated address, they provide me information about losing weight. Apparently I’ve subscribed to their newsletter too (NOT!) and I can unsubscribe and thus confirm the correctness of my email address. Strangely enough, the unsubscribe link points to a Russian website. USA Today seems to be in Russia?

In short, I have three email accounts and an infinite number of aliases on my own domain, plus a few other domains. I also have two old GMail accounts that I barely use, but in total I receive about 20 spam messages per day over all accounts, which Google nicely detects and filters for me. They’re annoying, but Google takes much of the annoyance away. Handy, because I also receive about 60 to 100 legitimate emails per day, mostly from mailing lists.

All these spam messages were easily detected by Google, and you can wonder if spam is really as profitable as it seems. But the magic of big numbers is in the spammers’ favor. If they send one million messages and only one percent read the message, it’s still read by ten thousand people. If only one percent of those respond with some information, the spammers have collected the information of 100 people. And if one percent of those fall for their traps and the spammers earn a few thousand euros each time, they’ve probably made a nice profit.
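That funnel is easy to sketch in a few lines of Python; the one-percent rates are the illustrative figures from the paragraph above, not measured data:

```python
# Spam economics: the "magic of big numbers" with 1% at every step.
# The rates are illustrative assumptions, not real-world measurements.
messages_sent = 1_000_000
readers = int(messages_sent * 0.01)   # 1% read the message: 10,000
responders = int(readers * 0.01)      # 1% of those respond: 100
victims = int(responders * 0.01)      # 1% of those pay up: 1

# Even one victim per million messages can cover the near-zero cost
# of sending, which is why the numbers favor the spammers.
print(readers, responders, victims)
```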

Basically, people should not respond to spam. They should recognise what spam looks like, which is why I’ve written this post. Do not even open spam just to check the contents, since your mail reader might already provide spammers with some information. I am a trained professional and I know what I’m doing when I check spam. My browser is set up in a secure way, my antivirus software is always up-to-date, I am really careful with spam messages and I avoid mail readers that might send information back to the sender. Then again, I have more than 20 years of experience dealing with malware, viruses and spam. Don’t expect that you can safely do what even someone with 20 years of experience tries to avoid! I think education is important, but I would have preferred to throw away all those messages without even a single look!

And another stupid spammer…

Many people complain about all the spam in their mailboxes, but when you’re running a blog, a forum or even a simple contact page where visitors can leave messages, you can still receive spam in other forms. With Facebook and Twitter, for example, you might get invitations from people you don’t even know. With LinkedIn, this is a bit more difficult, but people still attempt to connect to you so they can make all kinds of “interesting” offers.

But today I’ve received a comment spam on my post called “Dealing with deadlines” and it started like this:

{I have|I’ve} been {surfing|browsing} online more than {three|3|2|4} hours today,
yet I never found any interesting article like yours.
{It’s|It is} pretty worth enough for me. {In
my opinion|Personally|In my view},if all {webmasters|site owners|website owners|web
owners} and bloggers made good content as you did, the {internet|net|web} will be {much more|a
lot more} useful than ever before.|
I {couldn’t|could not} {resist|refrain from} commenting.
{Very well|Perfectly|Well|Exceptionally well} written!|
{I will|I’ll} {right away|immediately} {take hold of|grab|clutch|grasp|seize|snatch} your {rss|rssfeed} as I {can not|can’t} {in finding|find|to find}
your {email|e-mail} subscription {link|hyperlink} or {newsletter|e-newsletter} service.
Do {you have|you’ve} any? {Please|Kindly} {allow|permit|let} me {realize|recognize|understand|recognise|know}
{so that|in order that} I {may just|may|could} subscribe.

Well, that’s an interesting comment. (Full text here…) Basically, this is a script file that’s used by spammers to create random comments for blogs and forums. Normally, spammers will just use a selection of words and sentences from these script files to generate something a visitor might have written. The many variants make it harder to detect as spam. Unless you’re given the master script, of course, like this stupid spammer has done.

If I allowed this message, someone with a Canadian IP address would be able to add more comment spam on my blog and might even flood it with spam once they got their first approval. Of course, the spammer also used an email account from the German provider Freenet, which has been misused many times by spammers. Freenet has taken steps to prevent spammers from sending mass emails, but that doesn’t stop spammers from posting comment spam like this.

Also interesting is the fact that the spammer added a link (to the main site, not the spammer’s blog) which happens to be some blog on the site of an Indian company called “Six Sigma”. I wonder if this company even knows about this blog, which is written in French. I guess they don’t know about it, but that their DNS information has been hijacked. Or maybe their servers are hacked.

So, what I like to do is visit RobTex to collect more information about what I’ve found. So far, it’s an interesting international spammer: mail in Germany, spamming from Canada, with a web server that’s owned by a company in India. RobTex tells me the shared host they use for the site is Enzu in the USA, which provides cloud services and more. They also use the DNS services of GoDaddy, which does confuse me a bit. Why not use the DNS servers of Enzu?

Well, some further research tells me why. While Six Sigma uses GoDaddy as their host, the spammers have instead used Enzu to create their own website, which makes them appear legitimate. They’ve also moved the regular site to Enzu, and are probably redirecting visitors from there to the original website. (Or Six Sigma is supporting the spammer, which is also an option. I just don’t want to accuse them of this crime.) When I visit the Six Sigma website, it does seem as if someone has taken control of their site. Much of it looks disabled, as if the hacker is just misusing the site for their own purposes. It looks like the site was taken over two days ago, yet they still haven’t detected the hack. I hope they will be able to fix this fast, though.

Of course, there’s an even bigger risk here. Since the spammer seems to have hijacked their home site, he can mount a man-in-the-middle attack. Every customer who enters their credentials to log in will be handing those credentials to the hacker too. This is a serious thing. Spammers often try to do more than just send spam. They will try to collect more information to allow them to hack even more accounts.

There are a few things here that worry me. First of all, this Indian company doesn’t seem to realize their site is hacked. GoDaddy, which is supposed to be their host, isn’t hosting their main site. And Enzu doesn’t seem to realize that they’re hosting a site for an Indian company that uses the French language for a blog that seems filled with random articles from French/Canadian news sites. You could wonder whether hosting companies should check if strange things are happening to the accounts of their customers.

Yeah, I think you can partly blame hosting companies for all the spam on the Internet, simply because they’re not proactive when suspicious changes are made to the accounts of their clients. If hosting companies don’t take more care in selecting their clients, don’t validate account changes and don’t even tell their customers when their accounts seem to be hacked, then spam will just continue to cause problems.

Continue reading