When OpenOffice went to the Apache Software Foundation, I started writing a post on this topic that I never got around to finishing and publishing.
My colleague Rob Weir’s post on Ending the Symphony Fork prompted me to post it now, though.
I should say that I no longer have anything to do with what IBM does with ODF and anything related. I changed positions within IBM in the fall of 2010 and now focus on other things, such as Linked Data, which I may talk about in some other post.
In fact, I’m now so out of touch with the folks involved with ODF that I only learned about OpenOffice going to the Apache Software Foundation when the news hit the wire. I had no knowledge of what was going on and have no insights as to what led to it. Similarly, I only learned after the fact about IBM deciding to merge Symphony with OpenOffice.
So, if anyone wants to blame me for speaking as a person on IBM’s payroll, I’m not even going to bother responding. This is entirely my personal opinion and I haven’t even talked about it with anyone within IBM.
But let me say quite bluntly what Rob is only hinting at: It’s time for LibreOffice to rejoin OpenOffice.
LibreOffice started from a sizable portion of the OpenOffice.org community being tired of Oracle’s control and apparent lack of interest in making it a more open community. I certainly understand that. But now that this problem is solved, what does anyone have to gain from keeping the fork alive?? Seriously.
While forks in the open source world can be a tremendous way of shaking things up, they can also be very damaging. In this case, I think it’s a waste of resources and energy to keep this going. Instead of competing with each other the LibreOffice and OpenOffice communities should get together to fight their common and real competitor.
I know a certain level of competition can be healthy but I’m tired of seeing open source communities fight with each other to their own detriment.
I know the fork was painful and people still harbor a lot of resentment toward one another, but they need to get over that. They need to realize they would do themselves and everyone else a real service by putting all this behind them and uniting. LibreOffice should declare victory and join forces!
Presentation documents have become an essential part of our work life and our mailboxes are typically filled and even clogged with messages containing presentations. Some of these attachments are unnecessarily big and there is something very simple you can do to make them smaller. Here is what I found.
I was working on a presentation which had already gone through the hands of a couple of my colleagues when I noticed it contained 40 master slides, or templates, while only 3 of them were used and most of the others were just duplicates.
From what I understand this typically happens when you paste in slides copied from other presentations. What usually happens then is that the software pastes the slide in along with its template to ensure it remains unchanged. Given how common it is to build presentations by pulling slides from different sources, this can quickly lead to a situation with many templates, many of them identical.
I went on to delete all the useless copies in my document and saved the file, only to discover that its size had gone from 1.6MB to a mere 800KB. Even though disk space and bandwidth are getting cheaper every day, I think anybody will agree that a 50% size reduction remains significant!
So, here you have it. Tip of the day: to make your presentation file smaller and avoid clogging your colleagues’ mailboxes unnecessarily, check the master view of your presentation and consolidate your templates.
Of course the actual size reduction you’ll get depends on your file. In this case the presentation contained about 20 slides, with only 3 of them including graphics.
For what it’s worth, I experimented both with ODF and PPT using Lotus Symphony, as well as with PPT using Microsoft PowerPoint 2003, and the results were similar in all cases.
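Incidentally, since an ODF presentation is just a zip archive, you can check how many master slides a file carries without even opening it in an editor. Here is a minimal sketch in Python; the element and attribute names reflect my understanding of the ODF schema, so treat it as illustrative rather than authoritative:

# List the master slides ("templates") declared in an ODF presentation.
# An .odp file is a zip archive; the master pages are declared in styles.xml.
import sys
import zipfile
import xml.etree.ElementTree as ET

STYLE_NS = "urn:oasis:names:tc:opendocument:xmlns:style:1.0"

def master_page_names(odp_path):
    with zipfile.ZipFile(odp_path) as odp:
        root = ET.fromstring(odp.read("styles.xml"))
    # Match on the local element name to avoid hard-coding every namespace.
    masters = [el for el in root.iter() if el.tag.endswith("}master-page")]
    return [el.get("{%s}name" % STYLE_NS) for el in masters]

if __name__ == "__main__":
    names = master_page_names(sys.argv[1])
    print("%d master slides: %s" % (len(names), ", ".join(names)))

Running it on the file before and after the cleanup is a quick way to confirm that the unused templates are really gone.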
Just to clarify my post on the cost of wifi and wifi security: I’m not advising anyone to turn security off just because it may be significantly slowing their connection. For one thing, I didn’t.
Just like the old saying “safety first” goes, “security first” ought to prevail here. Indeed, there are several reasons why you should bear the cost of security and keep it on no matter what.
If you need more speed, like I do for my media center, the solution for now is to use a cable and avoid wifi altogether. For a media center that’s not so bad given that I don’t really need to move it around; it’s just that there are already so many cables that one fewer would have been nice…
In the future, the upcoming 802.11n wifi standard should alleviate that problem by providing faster speed all around. You can actually already get products that support draft versions of the spec but I prefer waiting for the technology to stabilize.
The intent of my post was merely to highlight something that doesn’t seem to be getting much coverage on the web and which I think people should be aware of.
Also, I should note that both devices – the router and your computer – play a role in this. So, the loss in speed doesn’t necessarily come only from the router. The wifi card in your computer may be just as guilty. To figure this out you’d have to run tests with different computers, which I haven’t done (yet).
I break a long silence to write about something I just found out about wifi and wifi security. Admittedly it may not be an earth-shattering discovery, but having searched for information on the subject it doesn’t seem to be covered much, so it seems worth a blog post (plus, for once, I can give this higher priority than everything else).
There is a lot of information out there on how to set up your home wifi and secure it. However, little is said about what this will cost you in terms of lost speed.
I did some tests over the weekend and here is what I found, using speedtest.net, with a cable internet connection:
Directly connected to the cable modem (no router, no wifi): ~23Mbps download
Connected via cable through my router (“Belkin Wireless G Plus Router”), no wifi: ~17Mbps download. Gasp, that’s a 25% loss right there. I’m no network expert so I don’t know if that’s normal but I sure didn’t expect to lose that much just going through the router. But that’s actually nothing. Read on.
Connected via wifi through my router, with an open connection, no security: ~14Mbps download. Ouch. Here goes another 18%. Unfortunately that’s not even close to being the end of it.
Connected via wifi through my router, with security set to WPA-PSK TKIP: ~8Mbps download. Wow! That’s yet another 42% loss just for turning the security on, which every website out there says you MUST do.
The loss due to the security setting motivated me to run tests against the various security options my router supports. It turns out that all WPA options and 128-bit WEP basically lead to the same poor results.
Setting security to 64-bit WEP is the only security option that doesn’t severely impact performance: ~13Mbps.
Sad state of affairs!
WEP is known to be very weak and easy to break in minutes by a knowledgeable hacker, and a 64-bit key is obviously that much faster to break than a 128-bit one.
So here you have it. The choice is between fast and unsecured or secured and slow. Stuck between a rock and a hard place.
Obviously results will vary depending on the router you use but here is the rub: when shopping for routers I find very little or no information on the impact of turning security on. Most product claims are for optimal circumstances, as in “up to xxx”, or relative, as in “10x faster than xxx”. This is of no help in determining what performance you will actually get.
One thing that plays a role in the performance you get is the CPU your router is equipped with. Yet, from what I’ve seen, this is not a piece of information that is readily available.
To make matters worse, from what I’ve seen, websites such as CNET don’t highlight that aspect either. So, you’re pretty much on your own to figure it out.
Beware. Run some tests and see for yourself what you get.
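If you want to run this kind of measurement repeatedly, say to compare wired vs. wireless or security on vs. off, it can also be scripted rather than clicking through speedtest.net by hand. Here is a rough sketch in Python; it assumes the third-party speedtest-cli package, which is not what I used for the numbers above, so consider it illustrative only:

# Rough sketch: measure the download speed a few times and report the average.
# Assumes the third-party "speedtest-cli" package (pip install speedtest-cli).
import speedtest

def average_download_mbps(runs=3):
    st = speedtest.Speedtest()
    st.get_best_server()  # pick the closest test server
    samples = []
    for _ in range(runs):
        bits_per_second = st.download()
        samples.append(bits_per_second / 1000000.0)  # convert to Mbps
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print("Average download: %.1f Mbps" % average_download_mbps())

Run it once per configuration (wired, open wifi, wifi with WPA, and so on) and the relative losses become obvious.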
I’ve thought about posting here several times over the past several months and I even have several drafts that never saw the light of day, but this just keeps getting pushed too far down my priority list to happen. However, I have to react to the buzz I’m discovering on my return from a week off.
Indeed, it is with quite a bit of astonishment that I read about Alex Brown’s frustration over Microsoft’s lack of interest in implementing ISO/IEC 29500 (OOXML). In the burgeoning comment section following his post, Alex writes:
> > The outcome that many had predicted, yet you insisted would not occur
>
> Oh? I don’t recall making predictions about Microsoft’s behaviour? URL please!
Well, let me give you a link to a prediction I made! In my post What Microsoft’s track record tells us about OOXML’s future of March 25, 2008 I wrote:
> They can, and I predict will, ignore all these additions which are optional and stick to what they have. The only reason they were added was to remove reasons for National Bodies to vote against OOXML.
So, here we are. Two years later, Microsoft has done exactly that and Alex Brown is finally seeing the light.
One can only hope that the standards community will have at least learned a lesson from this sad story: a mere standardization process simply cannot take control away from a vendor that has a monopoly and isn’t willing to give it up.
The leaked update of the European Interoperability Framework (EIF) document is generating a lot of noise, and for good reason. It takes back what could be considered one of the most advanced features of the previous version: its insistence on the use of open standards.
In particular, the new document contains the following puzzling piece instead:
> interoperability can also be obtained without openness, for example via homogeneity of the ICT systems, which implies that all partners use, or agree to use, the same solution to implement a European Public Service.
I don’t know about you but, to me, this statement simply makes no sense. And I wonder to whom it could truly make sense.
Indeed, interoperability is defined on Wikipedia as “a property referring to the ability of diverse systems and organizations to work together”. That seems about right to me.
So, how could “homogeneity” possibly qualify as a way of obtaining “interoperability”? Aren’t “homogeneity” and “diverse” opposed to each other?
Saying that “interoperability can be obtained [...] via homogeneity” is equivalent to saying that diverse systems and organizations can work together via homogeneity. Or in other words that diversity can be dealt with via homogeneity. This doesn’t make sense, does it?
The only way to make sense of this is obviously to read it as saying that one can actually avoid the need for interoperability by adopting a homogeneous system or solution. That is true actually. And that’s something many organizations have tried before. But everybody has learned by now that this is a losing proposition. It just doesn’t work.
It may work on a short-term basis, but in the long term it never does, because the world is fundamentally heterogeneous. And resistance in this domain is futile. One way or another, heterogeneity will eventually kick in. Some of the most common sources of heterogeneity in IT come simply from mergers and acquisitions, which happen all the time.
Furthermore, isn’t the whole point of the European Interoperability Framework about enabling heterogeneity? Isn’t it all about providing choice? So, why would the EU endorse the notion of having everybody select one specific solution or system? Isn’t that in total contradiction with its very goal?
Why would the EU promote the use of one specific system or solution that would bind governments and their constituents to a specific vendor rather than allowing diversity and choice? I seriously wonder.
And one has to wonder who stands to gain from such an idea… For sure, anyone who has a monopoly or quasi-monopoly would love it. Do you know anyone?
I seriously hope the EU realizes how misguided this move was and takes it back.
Especially because this flies in the face of the current trend in favor of open standards and open source that has recently made Europe so interesting in the field of standards. This is what has led several other countries, such as Japan, to reach out to Europe to discuss standards-related policy issues. It’d be a shame to kill that momentum.