Category — Science & Technology
An Electric Car (at least part of the time…)
There comes a time when older vehicles start to become rather expensive to keep running. With both our main vehicle, a 2001 Freelander, and our second car, a 2001 Focus that was a gift from friends, having had expensive or potentially expensive problems recently (and the Freelander has very nearly done the equivalent of going to the Moon), we thought it was time to consider something rather newer.
As we are trying to become rather greener in our lifestyles, an electric vehicle would be the ideal. But frankly, as it stands today, we can’t get the range from a ‘pure’ electric vehicle to do the sort of things we need to do (which includes a 200-mile round-trip once a week in my case, and more occasional long-distance trips, for example to Scotland). So the obvious thing to do was to look at hybrids. There is no way I could consider buying one new (and in fact I haven’t bought a new car since the 1970s, when someone wrote it off for me a few weeks after I bought it. I have this funny idea about not adding any new cars to the road…).
But what kind of hybrid? The obvious was one of the Toyota models. They’re built in the UK as far as I know, and they have a reputation for excellent build quality. But again, even a second-hand Prius was rather more than I had in mind pricewise. The next one down was a used Auris hybrid, and a very nice-looking car it is. A friend who knows the car said it behaved very well and was actually rather nippy.
However, although the Auris delivers good fuel efficiency – somewhere in the 75 mpg range, I believe – it, like its bedfellows, is never strictly an “electric vehicle”: the wheels are driven by a combination of internal combustion engine (ICE) and electric motors, so you can never turn the ICE off. But while we needed a car that could do longer journeys (I would ultimately like to get us down to one car if at all possible), a lot of our driving is around Cambridgeshire and environs. That meant that another type of hybrid was actually better suited to our requirements: a PHEV (Plug-in Hybrid Electric Vehicle).
In a PHEV, the wheels are always driven by electric motors. This is a Good Thing as the drive train is much simpler (and thus, one hopes, more reliable) and much more efficient than all that engine-and-gearbox stuff. And you just put your foot down and go. The vehicle is powered by batteries, and you recharge them by plugging it in. But, and it’s an important and positive ‘but’, when the batteries are exhausted, an on-board ICE kicks in, driving a generator to continue powering the drive for as long as there is fuel available, essentially turning it into the equivalent of a diesel-electric locomotive – a ‘series-hybrid’ if you like (though by some definitions, a ‘hybrid’ has to have both systems able to drive the wheels). And because the ICE is only running a generator, it can always run at its most efficient speed, which saves an enormous amount of fuel to begin with. Overall, you get the benefits of an electric vehicle – no fossil fuels are used as long as you don’t exceed the electric-only range; and it’s quiet, powerful and extremely efficient – without the range anxiety. And when you are driving on the ICE, you get superb fuel efficiency.
There are not very many of these kinds of vehicles around in the UK. Discounting the new Mitsubishi Outlander PHEV version and the BMW i3, both of which are well outside our price range, you’re left with two: the Chevrolet Volt and the Vauxhall Ampera. Chevrolet and Vauxhall are, of course, both General Motors, and these are basically the same vehicle, the Volt being the original, released in MY 2011. The Ampera is the Europeanised version of the Volt. GM don’t use the term ‘hybrid’ for the vehicle: they prefer E‑REV, or ‘Extended Range Electric Vehicle’.
Chevy is being wound down in the UK. And while Volts have been very successful in the US (and remain so – a new version comes out next year), neither variant did tremendously well in Europe, despite the Ampera winning a bunch of awards including Car of the Year in 2012, the year it came out here: there are about 6,000 on the road. It seems likely that this is because they were rather expensive when new – up in the fairly-large-BMW bracket while being a mid-sized reasonably luxurious hatchback. So I was expecting this to be out of range too… but not so! Although they have held their value pretty well, I was able to find a couple of 2012 Amperas – one not too far away – that we could actually afford. And following a test drive, we went for it. Previously owned by the dealership owner’s wife, it has been very well looked after; and it’s a very cool-looking Summit White.
I studied the forums and other information sources thoroughly before purchase, and as far as I could discover, it is one of the most reliable vehicles GM has ever produced: a known small risk of battery fire was fixed before the vehicles were even made for Europe; and while there is a known issue with a rather important bearing, only about 1–2% of vehicles have it fail and the problem and its solution are well-documented. According to a cleantech-oriented friend in the US, the Volt owners she knows are very pleased with their purchase.
The vehicle is extremely pleasant to drive, smooth and quiet, and even when the petrol engine finally kicks in, it’s still smooth and quiet and the performance (which includes its rather impressive acceleration) is virtually unimpaired. The literature quotes the pure-electric range as “25–50 miles” – and that’s exactly what you get, depending on driving style and whether you have the heating on or not. On my first drive I got 48.8 miles out of the battery. The next day, leaving early on a cold morning, it went down to a mere 36 (tip: ‘pre-condition’ the driving compartment before leaving, while it’s still plugged in, which you can set it to do automatically).
The vehicle keeps a record of lifetime fuel efficiency. When I bought it, it was 110 mpg (with 35,000 miles on the clock). I now have it up to 111. And indeed, as I expected, trips around Cambridgeshire can be made entirely on battery power – and if I can charge the car while the solar panels are outputting significantly more than we’re using, that operation is essentially free. Even on my weekly 200-mile round trip I managed over 90 mpg, thanks to being able to charge the car at my destination (where the Director of Marketing has a Tesla and is happy to share his charger) as well as at home. This knocks spots off a conventional Hybrid Synergy system. The car is learning what mileage I get from the batteries. When I first charged it, it estimated my battery range as 26 miles. It now thinks I’ll get 46. And that’s pretty much what I get.
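The arithmetic behind those big mpg numbers is simple enough: petrol is only burned for the miles beyond what the battery covers, so the effective mpg is total miles divided by the petrol actually used. A quick sketch in plain JavaScript (the 40 mpg range-extender figure and the electric-mile split are my illustrative assumptions, not measured values):

```javascript
// Effective mpg for a plug-in hybrid: only the miles beyond the electric
// range burn petrol. The ICE mpg and electric-mile figures below are
// illustrative assumptions, not measurements from the car.
function effectiveMpg(totalMiles, electricMiles, iceMpg) {
  const petrolMiles = Math.max(totalMiles - electricMiles, 0);
  if (petrolMiles === 0) return Infinity; // an all-electric trip uses no fuel at all
  const gallonsUsed = petrolMiles / iceMpg;
  return totalMiles / gallonsUsed;
}

// 200-mile round trip, charging at both ends for ~110 electric miles,
// assuming ~40 mpg when the range extender is running:
console.log(effectiveMpg(200, 110, 40).toFixed(0)); // "89"
```

That lands in the same region as the 90+ mpg reported above, and shows why a destination charge makes such a difference: every electric mile drops straight out of the denominator.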
It made sense to have a car charger fitted to the wall next to the driveway, rather than stick a cable out of the window, and there is a Government OLEV subsidy scheme that pays for a good chunk of the installation of a charger. I got mine (left) from ChargeMaster PLC in Luton, who were great to deal with – and having proposed a date, they actually came a couple of weeks early thanks to a cancellation. Charging the car from flat using the supplied EVSE (Electric Vehicle Supply Equipment), which plugs into a standard domestic socket, takes about 6 hours at around 11A charging current. However if you have a charger installed, you can charge in about 4 hours at 16A.
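As a rough sanity check on those charging times, the time to charge is approximately the energy to be replaced divided by the power drawn, with an allowance for losses. A sketch in plain JavaScript (the ~10.4 kWh usable capacity, 230 V mains and 85% efficiency are my assumptions for illustration, not official GM figures):

```javascript
// Rough charging-time estimate: hours = energy needed / (wall power * efficiency).
// The capacity, voltage and efficiency figures are assumptions, not GM specs.
function chargeHours(usableKWh, volts, amps, efficiency) {
  const powerKW = (volts * amps) / 1000;     // power drawn from the wall
  return usableKWh / (powerKW * efficiency); // hours to replace the energy
}

const USABLE_KWH = 10.4; // assumed usable portion of the Ampera's battery pack
console.log(chargeHours(USABLE_KWH, 230, 11, 0.85).toFixed(1)); // "4.8" on the portable EVSE
console.log(chargeHours(USABLE_KWH, 230, 16, 0.85).toFixed(1)); // "3.3" on a 16A wall charger
```

The real-world times quoted above run a little longer than this naive estimate, presumably because charging tapers off as the battery approaches full and ancillary loads run throughout.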
The Volt/Ampera has what is called a Type 1 (or J1772) connector (right), a fairly compact latching plug that goes into the left front of the vehicle. However most of the chargers you find in the wild in Europe are equipped with what are called Type 2, or Mennekes connectors (left). It made sense, therefore, to get a cable from one to the other so I can charge the vehicle at a public charging point at the destination (there is rather less point charging ‘on the road’ as the charging rate is only about 16 miles an hour, and that’s what the ICE is for!). Having this cable in the back of the car, it made sense to have a Type 2 socket on the home charger instead of the more usual tethered Type 1; and while I was at it, I future-proofed myself by getting a 30A charger in case friends with a Tesla call round or we upgrade down the line.
I would note that, when it comes to public charging sites, although there are quite a lot of them (more all the time, and many will take a Type 2 plug), they all belong to different networks that generally don’t have exchange agreements. As a result you may find you need a pack of RFID cards from the common networks, waving the right one over the charger to unlock it. In fact 85% of charging is carried out at home, and as I note, I won’t normally be plugging in at motorway services; but I still want to be able to use a public charger at the end of a long journey, so getting those cards (several of which are free) is probably worth doing.
(Main photo: General Motors/Vauxhall)
April 26, 2015
Solar panels — a year on
We had wanted to install solar panels for years — in my case decades, since I was involved with the “Alternative Technology” magazine Undercurrents in the 1970s. In the past, a solar PV system had simply been too expensive (friends down the street paid £15,000 for their system just a few years ago), but we’d been watching prices fall until, by the middle of 2014, they looked to have reached an affordable level.
We interviewed four companies and it quickly became evident that the height of the roof wouldn’t allow the conventional 16 panels in two rows “portrait” style that is common for a 4kWp system – they would have to be mounted too close to the top and bottom of the roof (you need 500mm clearance all round — otherwise the array can be less stable in high winds). We could, however, manage two rows of six, “landscape” style. The companies we talked to varied in the amount of work they put into specifying the installation, and I regard actually getting up into the loft and taking real measurements as an indicator that an installer is worth considering.
The limitation of 12 panels immediately made the choice a relatively simple one. We needed high-efficiency panels, and the SunPower design, it was easy to see from the data sheets, was superior in engineering terms: the panels are more efficient, and they have a sturdy backplane system with no buses running down the front, making them less prone to damage; and if a cell does get damaged, it doesn’t take out the whole row, or worse. Crucially, it enabled us to install a system that would deliver a little under 4kWp from just 12 panels. Perfect! Two of the four companies had offered us SunPower panels. One was an enormous supplier in the Midlands that I would in fact recommend to anyone looking for a commercial installation, but they were rather expensive — significantly more than any of the others.
We selected our supplier, Solarworks of Lavenham in Suffolk, who have been installing renewable systems since 1983. Just a couple of weeks later the scaffolding arrived and while it was set up, Solarworks fitted the inverter – an ABB “Uno” single-phase model – and associated switchgear in the closet under the stairs (see picture left — note the black rotary switch bottom right, which is a proper DC isolator on the input path from the panels — which were still to be hooked up when this picture was taken). Above the AC isolator on the left is the Generation Meter. The next day, they installed the mounting rails on the roof. Because our panels were to be mounted horizontally, the rails were vertical and each of the 12 was attached to a different rafter, giving exceptional strength.
The following day, the panels went up, and as soon as they were connected, by mid-late afternoon – in two strings of six each – the inverter was indicating that we were generating 3.6kW of electricity. And the story has continued, with the system regularly generating more kWh than we use in an average day. This year, we saw the output exceed 3.7kW as early as March! (Which surprised me in fact, as you would have thought there would be losses between the 3.9kWp nominal panels and the inverter.) The installation, just after completion, is shown above.
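That above-nominal March reading is less surprising than it first looks: panels are rated at a 25°C cell temperature, and their output rises as they get colder. A rough sketch of the effect in plain JavaScript (the −0.3%/°C coefficient is a typical crystalline-silicon value I am assuming, not the SunPower datasheet figure):

```javascript
// Panel power vs cell temperature: rated at 25 degC standard test conditions,
// output rises when the cells run colder. The temperature coefficient here is
// an assumed typical value, not taken from the SunPower datasheet.
function panelOutputKW(ratedKWp, cellTempC, coeffPerC = -0.003) {
  return ratedKWp * (1 + coeffPerC * (cellTempC - 25));
}

console.log(panelOutputKW(3.9, 25).toFixed(2)); // "3.90" at standard test conditions
console.log(panelOutputKW(3.9, 5).toFixed(2));  // "4.13" on a cold, bright spring day
```

So on a clear, chilly March day the cold cells can more than make up for the wiring and inverter losses, which would explain seeing better than the nameplate rating.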
We’re very pleased with the results and would recommend both SunPower panels and Solarworks as an installer.
We subsequently had our old Ferranti rotating-disc import meter replaced so that it wouldn’t go backwards. The latter sounds like a cool thing but actually isn’t, because you are already being paid for the electricity you are exporting and the electricity supplier can claim it back retrospectively; plus I wanted a modern meter with an LED indicator on to which I could strap a counter for metering.
The metering system I installed came from Geo (Green Energy Options) in Cambridge. It measures the power output from the panels (via the flashing light on the Generation Meter), the amount imported from the Grid (via the flashing light on the new Import Meter), and the raw current flow in or out of the building (from a clip around the main power input cable), and calculates a range of data from those raw inputs. Very nice. On the display shown here, the blue curve represents the output from the panels (quite good for an overcast day, I think) and the orange is the amount of energy we’re using – these values are shown numerically in the centre left of the display. The little blue arrows at the bottom show we are exporting electricity, and the little green waveform above the wattage displays indicates that we have enough “free” power to run a major appliance such as a washing machine or dishwasher, without effectively paying for it; and on the right is our electricity usage so far today and how much our income from generation and our spend have been. The system is connected to the Internet so you can remotely monitor system performance via the Web.
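Meters of this kind usually expose their reading as an impulse LED, commonly 1000 flashes per kWh (that rate is my assumption here; check the meter’s faceplate). The instantaneous power then falls straight out of the gap between flashes, roughly like this (plain JavaScript sketch, function names mine):

```javascript
// Instantaneous power from a meter's impulse LED: at N impulses per kWh,
// each flash represents 1/N kWh, so power (kW) = 3600 / (N * seconds between
// flashes). 1000 imp/kWh is a common rate but an assumption - check your meter.
function powerKW(impulsesPerKWh, secondsBetweenFlashes) {
  return 3600 / (impulsesPerKWh * secondsBetweenFlashes);
}

console.log(powerKW(1000, 3.6)); // 1  (1 kW: one flash every 3.6 seconds)
console.log(powerKW(1000, 1.2)); // 3  (3 kW: flashing three times as fast)
```

A monitor clipped over the flashing light simply timestamps each pulse and does this sum continuously, which is presumably how the Geo unit derives the generation and import curves from the two meters.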
Our electricity supplier is Ecotricity, and setting up for their Microtricity scheme to receive the Feed-in Tariff (FiT) was simple to do. Now they are often banking with me, and have had to revise my electricity payments down significantly as a result.
Having had the panels installed for almost a year, it looks as if we are running somewhat ahead of schedule as far as these panels paying for themselves is concerned.
April 25, 2015
Changes at the top of the page
More or less since we moved this site from being a conventional static web site to a WordPress environment based around the Thesis meta-theme, the header image has been randomly selected: each time you visited the site, you would see a different image.
This is actually very easy to do — there is a tutorial here — and it’s worked well. However, the other day I thought it would be rather neat for the header to consist of essentially a slideshow of the available images, gently crossfading.
One of the neat things about WordPress is that there are a great many plugins out there, many of them free, that you can find to do things like this. For the Radio Riel site, for example, I used a plugin called the Smart Slideshow Widget (it replaced a rather fiddly Flash slideshow that I used on the old RR site). The widget appears in the left sidebar to display a continuous and randomly-rotating set of logos for the station’s sponsors. However, this system only provides a widget: you can’t use it for a header image.
Go and search WordPress plugins for slideshows and you will find a great many, but most of them are a lot cleverer than I wanted. I just wanted to be able to put a set of images in a folder and have them display for a set period and crossfade over a set time. I definitely wanted to avoid Flash (not that I’d have needed help there: I already have the tools to create Flash slideshows, but Flash is… a pain). That left JavaScript as the way to do the crossfading, and jQuery (already running on this site) or one of the other libraries will do that and lots more.
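The core of such a crossfade is really just timing arithmetic: given the elapsed time, a hold period and a fade duration, work out which image is on top and how opaque the incoming one should be. A minimal sketch of that logic in plain JavaScript (the function and field names are mine; a real slideshow would apply the computed opacity to stacked images each frame, e.g. via jQuery’s .css()):

```javascript
// Crossfade scheduling: each image holds for `holdMs`, then fades into the
// next over `fadeMs`. Returns which image is current, which is incoming,
// and the incoming image's opacity (0 during the hold, ramping 0..1 in the fade).
function crossfadeState(elapsedMs, imageCount, holdMs, fadeMs) {
  const slotMs = holdMs + fadeMs;                       // one full hold-plus-fade cycle
  const index = Math.floor(elapsedMs / slotMs) % imageCount;
  const within = elapsedMs % slotMs;                    // position inside this cycle
  const fadeProgress = within < holdMs ? 0 : (within - holdMs) / fadeMs;
  return { current: index, next: (index + 1) % imageCount, nextOpacity: fadeProgress };
}

// 4 images, 5s hold, 1s fade: halfway through the first fade,
// image 1 is being faded in over image 0 at 50% opacity.
console.log(crossfadeState(5500, 4, 5000, 1000)); // { current: 0, next: 1, nextOpacity: 0.5 }
```

Driving this from a timer (setInterval or requestAnimationFrame) and setting the two images’ opacities each tick gives you the gentle rotation; plugins like the one described below wrap exactly this sort of loop in a configurable package.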
Many of the slideshow systems used the standard WordPress media upload system, which again was rather more than I needed. I started to install one of them and noticed that it messed with the Featured Image feature of WordPress. This rather warned me off, as I am already messing with the Featured Image capability to get it to work with Thesis — I’m using WordPress Featured Image for Thesis Theme from Thesistut, and I didn’t want to mess that up. And anyway, it was more complicated than I needed.
I finally found what I was looking for in the form of Cimy Header Image Rotator from Marco Cimmino. The plugin gives you some useful display options, such as a “Ken Burns Effect” instead of a simple crossfade — that’s the rostrum camera effect used extensively by film-maker Ken Burns, notably in his landmark series The Civil War (and now apparently a requirement for any PBS documentary), where rather than simply display a static image, you gently zoom in on it. The plugin also lets you include a caption and a link — either for all the images or for each.
The plugin has an upload folder (and you can actually upload additional files right in the plugin, which was a bonus) and once you’ve defined the parameters you get a bit of code to copy and paste into the appropriate Thesis hook (if you’re using Thesis), in my case the hook after the header. I replaced the existing random static header code with the new piece and it worked straight away.
Of course I wanted to tweak it a bit. I needed to change the size and position of the header image panel a little, which was easy, and wanted to alter the way the captions were displayed. The default is a little black lozenge (aka a round-cornered box) at the centre bottom of the image with the text in white. I tried dropping the lozenge and using a CSS drop shadow in black behind the text to pop the caption out of the image, whether it was dark or light, but the drop shadow wasn’t strong enough to do the job on really pale backgrounds. Eventually I lowered the caption to just under the image and made it black.
Trickier (and currently unsolved): I wanted to move the caption to be ranged left rather than centred. Unfortunately, the caption is positioned with respect to the entire browser window, and I have the page centred in the window so that if you enlarge the window you get more air around the page equally on either side. This is fine if the caption is centred, but if it isn’t, you can’t define the location of the caption relative to the image. I will sort this out another day. In the meantime, enjoy the pics.
October 19, 2012
Pitfalls of Facebook Page Tabs
I recently had occasion to create a couple of Facebook Apps for a client, to be accessed from tabs under the main timeline image on a business page.
There are plenty of tutorials around on how to do this, but I found a couple of pitfalls that don’t seem to be mentioned in the writeups I’ve seen.
Facebook Page Tabs
One of the many things that has changed about Facebook recently is the way that tabs on Business Pages are handled — a change that took effect in April this year.
To begin with, you can no longer set up a tab as a landing page so non-fans who arrive at the page see it automatically: tabs are simply listed with thumbnail images under the header image. And while the content was previously limited to 520px width, you can now choose 810px wide as an alternative.
I was creating the very simplest of Facebook Apps: ones that simply call what is essentially a web page that gets embedded in a Facebook page with the Facebook equivalent of an iframe. The basic tutorial on this provided by Facebook can be found here.
To do this, you log into Facebook as a Developer (https://developers.facebook.com) using your usual credentials. Just to the right of centre in the blue Facebook strip at the top of the page you’ll see “Apps”. Click this link and it shows you any apps you have created previously, and there’s a button at top right called “Create New App”.
As described in the tutorials, you need to give your new app a Display Name that appears beneath it on the Page; a unique Namespace; and a support contact email address. Then from the pop-up menu “Category”, choose “Apps for Pages”. For this simple app, that’s all you need in the top section. The tricky bit is next.
Selecting how your app integrates with Facebook
You now select how your app integrates with Facebook. Selecting “Page Tab” is obvious. You give the Page Tab a name, and supply secure and non-secure URLs. These point at the web location of the HTML content you want inserted on the Facebook page. Providing a secure URL (i.e. one accessed via https:) is mandatory. It doesn’t seem to matter if the two URLs point to the same place, but I would suggest it’s a lot easier if they do and your server is set up to serve both secure and non-secure content.
It may not be obligatory, but it turns out to be simplest if the URLs point to a directory in which the desired content is the default page: in other words, call your page “index.html” or whatever your server is set up to serve as default, and put it in a directory, for example one named after the application. So you might have a URL like “http://myserver.com/facebook/welcome/” — in other words, the URL points at a directory. This is what you want to enter into the Page Tab URL slot: with the forward-slash on the end.
You don’t need to provide a page tab edit URL. What you probably will want to include is a 111 x 74px image to appear in the box under the header on the main Facebook Page.
In addition, select the width of the inserted HTML material. You can use the original 520px width or the new 810px. Remember that whichever you choose, the HTML will be displayed on a blank white page with a Facebook blue strip at the top and little else: it should work visually in this environment. It turns out that the width is not quite as straightforward as it appears, as will be seen below.
On the face of it, you’re now done as far as the Facebook side of the setup is concerned. Wrong. There is another step you need to take, and that is to click the check-mark next to “App on Facebook”. This asks you to enter a Canvas URL and a Secure Canvas URL. Enter exactly the same URLs as you used above. This step is not obvious (don’t you just need to select Page Tab? No.), but if you don’t do it, you will get an error 191 when you try to add the App to a page. I have not seen this documented anywhere: have you? And if you look up Error 191, you’ll find that absolutely everyone gets this error, and that nobody has suggested that filling in the “App on Facebook” section is the solution, or what should go in there.
Once the above has been completed, you can go off and create your mini-pages that will be stored at the URLs above and will be displayed in an iframe on the Facebook page when visitors click on your app tab.
You can create the HTML in your favourite editor: I use Dreamweaver, but you can even create it in the Post Editor in WordPress, then click the HTML tab in the Editor and copy the code out, save it in a file with the right name, and upload it to the server.
In my case I had a 520px-wide x 775px-high image, and I simply placed it centred on the page and created a local image map to make it clickable, with different parts of the image taking visitors to different URLs on the client’s main web site. The client’s format involves black backgrounds, and the image had this, so I set the page background colour to black too, just in case. This proved to be inadvisable.
Adding the App to a Page
Once the page content is in position on your server, you are essentially done, and you can go back to Facebook and enter the special URL that allows you to select which of the Pages you administer the App should be added to. Why this isn’t easier is beyond me. A button would have been nice. The URL is:
https://www.facebook.com/dialog/pagetab?app_id=YOUR_APP_ID&next=YOUR_URL
You get the App ID from the Settings page of your App, in the top section. The “Your URL” is the tricky bit. It’s the Canvas URL, and this is why it needs to be there. If it isn’t, then whatever you put in there, you’ll get the 191 error, which seems to relate to where the app redirects you to when you click on it.
All being well, though, you will get a nice little page that allows you to choose one of the pages you administer and add the App to the Page. You can then go to the Page and move the order of the tabs around to suit you (the “Photos” tab has to be first, for some reason, but you can move the others by swapping them around).
When 520px is actually 512
If you created the page the way I described above, with an image and imagemap, you will notice straight away that it hasn’t quite worked. The same is true if you sliced your image into bits, each with its own link, and positioned them with a table. You will (probably; I assume it wasn’t just me) see horizontal and vertical scroll bars. WTF?
By simply reducing the width one pixel at a time I discovered that both scroll bars go away if you set the width to 512px. Nowhere have I seen this documented, so I would be fascinated to know under what circumstances this appears. So I trimmed the image a little so that it was 512px wide and lo! It worked!
And now you run into another little aesthetic issue. You’ll recall that I had my image on a black background. So I look at my image on the Facebook page and I notice a black strip down the left-hand side. It turns out that this strip is 8px wide. Hmmm. The image has to be 512 or it shows scroll bars, but it is placed in a 520px-wide space. Odd. Not only that, the image is ranged right in the space, even though the HTML centres it on the page. This appears to be the case whatever image positioning you use.
As a result, I had to remove the black background from the page, instead setting it to white, the background colour of the Facebook page. Nobody will notice that the image is 8px to the right.
June 15, 2012
Christmas(ish) At Beamish
Whenever I’m in the NE of England, I try to get over to Beamish - “The Living Museum of the North”. It’s a wonderful place built around a road/tramway loop on which run vintage buses and trams.
On this occasion (20 November) I was up for the weekend to go to Lumiere in Durham, so nipping over was a chance I couldn’t miss. It was foggy on leaving Durham but approaching Beamish the sun came out and it was gorgeously sunny until the drive home, when the fog closed in again.
Different sites around the tramway loop recreate different eras, each created from buildings that have been lovingly transplanted from their original sites: the Town, for example, is Edwardian, with a Bank, a gorgeous Masonic Hall (rebuilt with the help of the Masons, apparently), a Co-Op department store, sweet shop/factory and lots more. It also has an adjacent Steam Railway and station and a steam-powered fairground.
The Pit Village is perhaps somewhat earlier, and features a colliery and a relatively new addition: a coal-fired fish & chip shop that uses beef dripping to cook with, resulting in utterly tasty meals that you have to queue for twenty minutes or so to get, it’s so popular. Yet another area, Pockerley, is more Georgian, with a Waggonway that features steam locos from the earliest times and Pockerley Old Hall. I’ve talked about Beamish before, here.
From this time of year until Christmas itself, Beamish is having a series of Christmas weekends, including Santa’s Grotto somewhere over by Pockerley I think, complete with snow, an ice-rink in the Colliery Village (above), and decorations up in the Town.
I had to pop into some of the terraced houses, several of which contain businesses, such as a solicitor’s and a dentist — the torture chamber itself is shown below. In those days you would have had the option of (unregulated) nitrous oxide (with a fair risk of death) or cocaine as anaesthetics, the latter effectively removing your short-term memory, so things hurt but you didn’t remember it (rather like intravenous Valium it would appear, which I always loved as an adjunct to dental operations).
Another house included period Christmas decorations in the front room.
Across the street is a little park, with a bandstand, and there was the Murton Colliery Band preparing to play some suitably seasonal music, which they proceeded to do beautifully.
Here’s some video of extracts from their programme:
The band was formed as the Murton Gospel Temperance Blue Ribbon Army Band in 1884, and players were requested to wear a blue ribbon on the second button of their waistcoats. They became the Murton Colliery Band in 1895. When the colliery closed, the band became self-supporting — and it still is today. They’re also one of the few remaining bands to continue to call themselves a ‘Colliery Band’, and they still proudly march through the village during the Durham Miners’ Gala and on Armistice Day. I don’t know about you, but brass band music and Christmas do seem to go together rather well.
There was time for a good wander around and trips on some of the trams — including a 1930s enclosed double-decker Blackpool tram, which is technically a little late for their re-creations but very impressive — and I had some good chats with the tramway staff, noticing that they wore the archetypal “wheel and magnet” emblem of British Electric Traction (later to become the parent, surprisingly, of Rediffusion Television) on their caps. The shop at Beamish should sell those cap badges — I would have bought at least one.
Finally it was time to head off on the 3+ hour drive home, and soon after getting back on the A1 the fog closed in, and it ended up taking a good deal longer than that. But it was a great day out.
November 23, 2011
75 Years of BBC Television
Wednesday 2nd November saw the 75th anniversary of the opening of the BBC Television Service.
To commemorate the event, the BBC held a special celebration at Alexandra Palace, where the Service opened.
Originally, the intention was to hold a special Open Day on the 2nd, at which members of the public would be able to visit the studios and see audio-visual presentations. However this was eventually moved to November 5–6, leaving only an internal BBC event happening on the actual day.
I managed to obtain an invitation, for which my thanks to the ebullient Robert Seatter, head of BBC History, and technology journalist Bill Thompson.
The invitation said “3:45 for 4pm” and as a result I found myself in the Alexandra Palace Tower end car park well in time for the off, giving some time to take in the views over the city, experience the continual wind and enjoy some dramatic skies over this “Palace of the People” located at the highest point in North London.
When the BBC decided on Ally Pally as the site for the new BBC Television Service in the wake of the Selsdon Report in 1936, the place was already decaying somewhat. It’s a process that has continued since BBC Television left here several decades ago, and although the team fronting the Trust that runs the site today is incredibly, and impressively, enthusiastic and upbeat, there is no way it can be other than an uphill struggle in these austere times. But you can’t say they aren’t trying hard, and I wish them every success.
The BBC still maintains active offices in the block under the mast. But instead of entering through the doors there, adjacent to the GLC blue commemorative plaque on the wall, we were motioned into an entrance along to the left, up a metal ramp and into what had originally been the Transmitter Hall. It may be noted that this was probably not the first, but possibly the last, time that anyone had the bright idea of placing a pair of powerful VHF transmitters and a pigging great set of transmitting antennae right next to a set of television studios full of sensitive equipment.
Inside, the room had been decorated with panels against the walls, each carrying information and images of some aspect of Ally Pally TV history, and a free-standing photo display of historical images, mainly provided by the Alexandra Palace Television Society. A jazz quartet played suitable 1930s style music; servers glided among the assembled invitees dispensing water, orange juice or Prosecco.
We had the chance to mingle and chat, and I was very pleased to meet TV cook Zena Skinner, who probably coined the phrase “Here’s one I made earlier” — though in her case she really had made it earlier, herself; I also met Professor Jean Seaton, the BBC’s Official Historian and Professor of Media History at the University of Westminster; and talked briefly to John Trenouth, Technology Adviser to the BBC Collection, whom I met during his time at what is now the National Media Museum in Bradford.
In the centre of the room, a make-up table and lights were set up, where various young women were being made up using the colours required by the Baird System.
When the BBC Television Service was established, the Government required two television systems to be used. On the one hand was the all-electronic Marconi-EMI system, which offered 405 lines, and on the other was the Baird electromechanical system which delivered 240-line television. Early on, it became evident that the Marconi-EMI system was significantly superior, but it had been Baird who had tirelessly promoted television as a concept, and lobbied the GPO over licensing and the Government to legislate for a Television Service. Baird highlighted the fact that his was a British invention – though it could equally legitimately be claimed that the Marconi-EMI system was British. Almost certainly the Government decision, a typical British compromise, was made at least in part to avoid suggestions that they were turning down a British innovation: the decision mandated the use of both systems on an alternating basis for six months, after which a choice was to be made between the two. The problems experienced with the technological dead-end of the Baird mechanical scanning system resulted in the decision, in favour of Marconi-EMI, being made after just three months.
Baird Television actually used two systems. The fundamental feature of both was a “flying spot scanner” in which, almost completely counter-intuitively, the scene was scanned with a spot of light and photocells collected the light reflected from the subject. The “Spotlight Studio” used nothing more than this; the Intermediate Film Technique used a conventional film camera, exposed film from which was then passed immediately through developer and highly poisonous cyanide-based fixer (particularly nasty when it got loose), then scanned with a flying spot actually under water. The flying spot scanner was very sensitive to red light, so if you were appearing in the Spotlight Studio, you needed the special make-up: black lipstick, blue eye-shadow and a pale white face. Very neo-Goth. You checked it by looking through a red gel.
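The logic of the red-gel check can be illustrated with a toy tonal simulation (a sketch; the sensor weighting and RGB values below are purely illustrative, not measured from Baird equipment). A red-dominant sensor renders ordinary red lipstick unexpectedly light, which is why black was substituted:

```python
# Toy simulation of how a red-sensitive flying-spot photocell renders colours.
# The sensor response is modelled as weighting the red channel heavily;
# the weights and RGB values below are purely illustrative.

def red_sensitive_tone(rgb, weights=(0.8, 0.15, 0.05)):
    """Return a 0-255 grey level as 'seen' by a red-dominant sensor."""
    r, g, b = rgb
    wr, wg, wb = weights
    return round(wr * r + wg * g + wb * b)

samples = [
    ("ordinary red lipstick", (200, 30, 30)),   # reads surprisingly light
    ("black lipstick", (20, 20, 20)),           # reads dark, as intended
    ("pale white face", (230, 210, 200)),       # reads near-white
]

for name, rgb in samples:
    print(f"{name}: grey level {red_sensitive_tone(rgb)}")
```

Looking at your make-up through a red gel approximates the same thing optically: you see roughly the tonal rendering the scanner will produce.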
This was the make-up that was being applied to the young ladies at Ally Pally on the 2nd. Apparently the idea had originally been that BBC London would be sending a crew up to cover the party, but they had pulled out and the job was left to an enthusiastic team from BBC News School Report.
Meanwhile, we were treated to welcoming presentations: by the PR gentleman from the AP team, and from Robert Seatter, who encouraged us to relinquish our glasses and proceed upstairs to Studio A.
There were two main studios at Ally Pally originally, one above the other. Studio A was the Marconi-EMI studio, while directly above it was the Baird studio, Studio B. You can’t go into B today, because it’s riddled with asbestos and things are likely to fall on your head. But Studio A is accessible. At one end of the room is a tableau representing the production of the magazine programme Picture Page, which ran from 1936–39 and 1946–52 and was initially presented by Joan Miller.
Around the room were assembled old TV sets, and the exhibits in the room itself included an EMItron camera, whose lid John Trenouth of the National Media Museum in Bradford kindly removed so we could have a look at the innards (sans tube).
In Studio A we were treated to a couple of brief audio-visual presentations, the first assembled mainly from clips from the film documentary Television Comes To London, which was made to tell the BBC Television Service story in 1936. Rebecca Kane, the MD of Alexandra Palace Trading Ltd, introduced Michael Aspel, a newsreader at AP during the period when BBC Television News was based here, to cut the cake.
And what a cake it was: made in the form of an old bakelite television with a picture of Alexandra Palace on the screen, deliciously thick icing and succulent innards. Very nice.
After that, we all wandered around Studio A and chatted to each other. I got into an amusing discussion about the way in which the Television Service closed down at the start of the Second World War, on September 1st, 1939 – about which a number of myths have arisen (the main one perpetuated in Alan Yentob’s Imagine documentary, re-shown on Wednesday) – see The Edit that Rewrote History on the Transdiffusion Baird site, which includes a number of articles on television prior to 1955.
And then we gradually sloped off home.
See also:
The birth of television: the “Baird” microsite at Transdiffusion
75 years on from BBC television’s technology battle — a nice piece by John Trenouth
BBC Celebrates 75 Years of TV — Nick Higham visits Alexandra Palace
November 5, 2011 Comments Off on 75 Years of BBC Television
Nuclear Power You Can Trust?
Having been involved in the environmental movement in one way or another since the 1970s, I’ve always been in the “anti-nuclear” camp.
Indeed, I think I was the first person to create an English version of the famous “Atomkraft? Nein Danke” logo – for the cover of an edition of Undercurrents magazine – a magazine that was into renewables (mainly of the DIY variety) before a lot of people. (You can read some copies of it here.)
Of course there are plenty of reasons to be wary of nuclear power – of the current variety at least.
- There’s the question of energy security: Uranium doesn’t come from here; we have to import it, or reprocess other people’s. So although I gather there might be deposits off the British coast, it doesn’t seem at this point to help decouple us from potential problems with dependence on overseas sources.
- There’s the problem of nuclear waste disposal, though some people (James Lovelock for example) are convinced that this can be done safely and permanently.
- Nuclear power as we currently do it is absurdly inefficient. What you do is let nuclear fission heat some water and then pass it through turbines. It’s just like a conventional power station, except you heat the water differently. The efficiency is significantly less than 50% – typical plants convert only about a third of the heat to electricity. Whatever happened to innovative direct conversion technologies like MHD (MagnetoHydroDynamics), where, for example, you can run a plasma back and forth in a magnetic field and pull electricity directly off the plasma, in a kind of fluid dynamo? The Soviets had some pilot plants generating several megawatts. What happened?
- And there’s the risk of disastrous accidents, like Chernobyl, Three Mile Island and now Fukushima, which can potentially spread significant amounts of irradiated material over a wide area, with potential health effects like increased long-term cancer risk and other problems beyond the direct effects of radiation poisoning.
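The efficiency point above can be made concrete with a quick Carnot-limit estimate (a sketch; the temperatures are typical published figures for a pressurised-water reactor, my assumption rather than anything from this article):

```python
# Rough check on the claim that steam-cycle nuclear plants fall well short
# of 50% efficiency. Temperatures are typical published figures for a
# pressurised-water reactor, converted to kelvin.

T_HOT = 285 + 273.15    # approximate reactor coolant/steam temperature, K
T_COLD = 30 + 273.15    # approximate cooling-water temperature, K

# Carnot efficiency is the thermodynamic ceiling for any heat engine
# operating between these two temperatures.
carnot_limit = 1 - T_COLD / T_HOT
print(f"Carnot ceiling: {carnot_limit:.0%}")
# Real plants achieve well below the ceiling: roughly a third of the
# heat becomes electricity.
```

So even before practical losses, the steam cycle caps out in the mid-40s per cent; actual plants land around 33%.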
Counter to the last of these, there’s the fact that remarkably few people have actually been affected by radiation from nuclear power plants. Many, many fewer than have been killed or injured by coal-mining accidents and other fossil-fuel-related disasters. If Germany was as sensitive to risks to life from bacteria as it is from nuclear power, it would have closed down the organic food industry by now. But instead, it’s closing down its nuclear plants, which, as far as I know, have not caused any deaths at all, unlike the contaminated beansprouts.
But of course, it’s never as simple as that.
The fact is that right now we need low-carbon energy sources, and quickly, to combat the threat of anthropogenic (human-created) global warming (AGW). There is no doubt about the threat of AGW, and I’m not going to entertain discussion about it here. Sorry.
Much as I am in favour of renewables, and much as I like the sight of elegant, virtually silent wind turbines dotting the landscape (and I would as happily have some in the field behind my house as James Lovelock would have a nuclear waste storage facility behind his), the fact is that renewables are almost certainly not enough, and we need something more to replace our ageing and horrifyingly destructive carbon-spewing fossil-fuel powered generating stations. Nuclear is the obvious option, so after years of taking an anti-nuclear stance, I am changing my mind. And in doing so find myself aligned with people like George Monbiot and Professor Lovelock.
In my opinion, even if we did no better in the international nuclear power industry than we have done to date, any threat to human life from nuclear power, past, present and future, is as nothing compared to the billions whose lives are, and will be, threatened by AGW over the 50–100 years ahead.
I will be a little controversial and say that in my personal view (and I am not a nuclear power expert, so may be wrong), the current level of nuclear power technology is much safer than the chain that ends in a conventional fossil-fuel-driven power station. That, to me, is not the question.
Instead, the question is, can we trust anyone to build, maintain and operate nuclear power stations safely?
You could argue that by and large, the answer to that question is yes. Nuclear power as it is practised today is in fact extremely safe compared with fossil-fuel generation. But there is a bit of a knife edge here. Fundamentally, however intrinsically safe the current technology is, the fact is that I do not trust for-profit corporations to do the job properly. I am not even sure I trust governments. They will always be looking to cut corners and save money, time or whatever else, and the result will be a greatly increased risk. Take a look at this:
This is the segment on nuclear power from Adam Curtis’s Pandora’s Box series on some misuses of scientific research. I’m a big fan of Curtis’s work (although I have some issues with his latest series, All Watched Over By Machines of Loving Grace) and I think the above is spot on.
So, I think the technology of current nuclear power is fine in theory, but we are going to screw it up in practice. How can we have our cake and eat it? What we need is a method of nuclear power generation that you can’t screw up [very easily].
The answer just might be hinted at in this article from, of all places, The Mail On Sunday, a paper I would never have thought I’d find myself recommending in, er, a month of Sundays. It’s also recommended by the climate-sceptic Global Warming Policy Foundation. Talk about strange bedfellows….
The piece is about the “Electron Model of Many Applications”, or EMMA. Here’s the article. Research into this technology is going on in Cheshire and it might just provide the key to one method of using Thorium in a reactor to generate electricity – assuming the UK government continues funding the research properly, which I doubt. Here’s the beginning of the piece:
“Imagine a safe, clean nuclear reactor that used a fuel that was hugely abundant, produced only minute quantities of radioactive waste and was almost impossible to adapt to make weapons. It sounds too good to be true, but this isn’t science fiction. This is what lies in store if we harness the power of a silvery metal found in river sands, soil and granite rock the world over: thorium.
One ton of thorium can produce as much energy as 200 tons of uranium, or 3.5 million tons of coal, and the thorium deposits that have already been identified would meet the entire world’s energy needs for at least 10,000 years. Unlike uranium, it’s easy and cheap to refine, and it’s far less toxic. Happily, it produces energy without producing any carbon dioxide: so an economy that ran on thorium power would have virtually no carbon footprint.
Better still, a thorium reactor would be incapable of having a meltdown, and would generate only 0.6 per cent of the radioactive waste of a conventional nuclear plant. It could even be adapted to ‘burn’ existing, stockpiled uranium waste in its core, thus enormously reducing its radioactive half-life and toxicity.…”
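A rough sanity check of the quoted figures is possible (a sketch, assuming complete fission at roughly 200 MeV per atom and 24 MJ/kg for hard coal, both round numbers of my own, not from the article):

```python
# Back-of-the-envelope check of the quoted claim that one ton of thorium
# matches 3.5 million tons of coal. Physical constants are standard;
# 24 MJ/kg is a typical energy density for hard coal (my assumption),
# and full burn-up of the thorium is an idealisation.

AVOGADRO = 6.022e23
MEV_TO_J = 1.602e-13           # joules per MeV
ENERGY_PER_FISSION_MEV = 200   # roughly, for the thorium / U-233 cycle
THORIUM_MOLAR_MASS = 232.0     # g/mol

atoms_per_tonne = 1e6 / THORIUM_MOLAR_MASS * AVOGADRO
thorium_joules = atoms_per_tonne * ENERGY_PER_FISSION_MEV * MEV_TO_J

coal_joules = 3.5e9 * 24e6     # 3.5 million tonnes at 24 MJ/kg

print(f"thorium: {thorium_joules:.2e} J, coal: {coal_joules:.2e} J")
print(f"ratio: {thorium_joules / coal_joules:.2f}")
```

On these assumptions the two come out within a few per cent of each other, so the coal comparison at least is not journalistic hyperbole.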
It seems to me that this technology could answer many, if not all, of the environmental concerns about the acceptability of nuclear power. Of course I want to read the full report that is apparently soon to be published, and no technology comes without drawbacks (or unintended consequences for that matter), but preliminary accounts, like the one above, seem to offer promise.
For more on other possible uses of Thorium for power generation, see this Wikipedia article. You’ll see it’s not entirely problem-free – but then nothing is.
*Header image from MensPulpMags.com
June 21, 2011 Comments Off on Nuclear Power You Can Trust?
Re-learning basic life skills
I remember clearly one of the first pieces of really useful information I ever got from the World Wide Web.
It was back, probably, in the early-to-mid 1990s, when I was essentially coding HTML by hand, as one had to do. The previous year, I’d completed a demonstration of what a magazine I was working on at the time might look like on the web as a method of international electronic distribution instead of sending PageMaker files to various locations via AppleLink, and the client had liked it. I was interested in finding out how to make it, and other sites, look better.
I stumbled upon the web site of a designer and digital typographer. My memory suggests (though I could be wrong about this) that he was David Siegel, the designer of the Tekton font, who was demonstrating techniques for making your web pages look halfway decent from a design point of view, long before the advent of CSS and other web layout tools. That would make this in 1994 — I designed my first web site the previous year. Siegel went on to write the best-seller Creating Killer Websites.
In those days, the idea of the web was that it carried information, and that information had a structure and hierarchy — different levels of headings, text and so on — and as long as you identified those structural elements accordingly, that was all you did: the viewer decided what the fonts were and what the page actually looked like.
But it’s not web site design I’m talking about today. On one of his pages, I found a really fascinating set of illustrations. They were solely there to show how you could lay them out, but they were on the subject of how to tie your shoelaces.
Now you wouldn’t think there was a lot to learn about tying your shoelaces. It’s a life skill we learn really early. We also, I suspect, learn it essentially the same way. The page noted that the problem with this was that shoelaces, especially those round-section nylon ones, tended to come undone very easily. The diagrams showed a better way, that stopped this from happening. In a nutshell, what you do is instead of going once round and through, you go twice round and through. It’s not necessary to go into any finer details, as you’ll discover in a moment.
I immediately tried this, of course, and it worked! And that’s how I’ve tied my shoelaces ever since. Well, until the other day.
Back in 1994, I really never thought that I would be re-learning how to tie my shoelaces. But I am all in favour of learning new things — even if that means un-learning old things. So at the age of 43 or so, I learned this basic life skill all over again, and used it all the time for the next seventeen years or so.
The method he described has some issues, I should point out. The big one is that if you are unlucky in how you pull an end to undo them, you can end up with a very complex knot that can take a while to untie. This, of course, will happen when you are in a hurry, or in the dark. But the benefit of the technique outweighed the downside.
Then the other day, I was getting to know the shiny black new Boxee Box I acquired. I’ve had Boxee on the little Mac Mini connected to the TV as a media centre type computer for ages but never used it that much. But with the Boxee Box it all becomes much more accessible and, give or take a few bugs which I am sure will get fixed over time, it’s a very impressive piece of kit.
One of the main ways of accessing content with Boxee is Apps, and one of them is for TED Talks. TED stands for Technology, Entertainment and Design. It’s a non-profit that holds two international conferences a year where some amazing speakers talk about some amazing things — you can learn more about them here. Their slogan is “Ideas worth spreading”. It’s where I first heard about the company Better Place, for example, and their amazingly sensible idea of having swappable electric car batteries so you don’t have to sit around while they charge (you can see the video here).
On the front page of the Boxee TED app is a set of panels promoting a selection of talks. One of them was from Terry Moore and it’s called How To Tie Your Shoes. I wondered immediately if he was showing what I might call “Siegel’s technique”. Well, he’s not. He’s showing you a new way of doing it that also doesn’t come undone — and doesn’t have the risk of knotting. It’s in fact both simpler and better. In essence, instead of going once round anticlockwise, you go once round clockwise, and get a stronger form of the knot (note that if you’re left-handed you may already be doing this). But don’t let me say any more: just watch the video. It’s only 3 minutes.
[Video: Terry Moore, “How To Tie Your Shoes”, TED 2005]
There are in fact loads of ways of tying your shoelaces. This web site suggests at least 18 possible knots and also describes the technique discussed above.
June 19, 2011 Comments Off on Re-learning basic life skills
A sad day for virtual Frank Lloyd Wright fans
The Frank Lloyd Wright Virtual Museum in Second Life is widely regarded not only as a wonderful revivification of the legacy of America’s greatest architect, but as one of the major points of interest in Second Life and one held in high regard by architects and those of an artistic bent, many of whom are drawn to virtual worlds.
The FLWVM contains fascinating exhibits on the life and works of Frank Lloyd Wright, 3D virtual reconstructions of his key buildings, and much more, and it’s hosted by knowledgeable and helpful staff. For the last year or so there has been a licensing agreement in place between FLWVM and the Frank Lloyd Wright Foundation, the organisation that controls Frank Lloyd Wright’s legacy.
One of the Foundation’s goals is to “Preserve the works, ideas, and innovative spirit of Frank Lloyd Wright for the benefit of all generations” – one of the things that the FLWVM definitely does. I was therefore very much saddened and surprised by the Foundation’s recently announced decision not only to terminate its licensing agreement with Virtual Museums, Inc, who run the FLWVM, but also to issue a Cease and Desist order effectively requiring them to close forthwith. The Virtual Museum will therefore close on December 10 unless something happens to change that.
You can read more about the story surrounding this decision here in Prim Perfect Magazine’s blog, and the letter sent to supporters of the FLWVM by the Chair of Virtual Museums, Inc, Ethan Westland.
As a result of that decision, I was moved to write the following email to the Foundation via their contact email address, info[at]franklloydwright.org. If you agree with me, you might want to do the same.
I was saddened to hear today of the imminent closure of the Frank Lloyd Wright Virtual Museum in the virtual world of Second Life as a result of your Foundation withdrawing its existing licensing agreement with Virtual Museums Inc and apparent decision not to renew it.
I was involved in a TV programme about the virtual museum some months ago and was exceptionally impressed at the work they have been doing promoting the work and legacy of America’s greatest architect in new areas of technology. It seemed to me at the time (the show went out just as the original licensing agreement was being signed) that the licensing arrangement was a perfect idea in that it enabled the Foundation’s work and goals, and an awareness of the work of this great man, to be extended into new realms with health and vigour.
I am thus extremely disappointed that the Foundation has decided to take the measures, not only of failing to renegotiate the licensing agreement or some other mutually beneficial agreement allowing the Virtual Museum to continue, but with the additional step of issuing a Cease and Desist order effectively causing the Museum to close immediately.
From what I have heard about this decision, it appears to me that the Foundation has been labouring under the misunderstanding that as a result of the licensing agreement, the FLWVM somehow assumed responsibility not only for its own creations based on copyright designs and content owned by the Foundation, but also those of completely unconnected third parties. I note this as a result of the fact that the Cease and Desist order was apparently sent to the Virtual Museum and not to Linden Lab, the creators of Second Life; nor did it take the form of a DMCA take-down order addressed to Linden Lab – the usual course of action in the case of perceived copyright infringements in the virtual world.
I would strongly urge the Foundation to reconsider its action in this case and consider instead re-opening negotiations with Virtual Museums Inc with a view to reaching a further mutually-beneficial licensing arrangement that would allow the Frank Lloyd Wright Virtual Museum – widely regarded as a prime example of the great possibilities of virtual worlds in promoting art, culture and design – to continue operating, contributing so effectively as it does to the legacy of this great man.
If you’re a Second Life resident and you want to visit the Museum before it closes on 10 December, this link will teleport you there.
December 3, 2010 Comments Off on A sad day for virtual Frank Lloyd Wright fans
When does ‘Skepticism’ become dogma?
For some considerable time, I’ve been a staunch follower of those, like Richard Dawkins, who oppose established religions and favour an evidence-based approach to our understanding of the world. Indeed, I think religion has caused more death, pain and suffering in the world than almost anything else and we would all be much better off without religious privilege.
I am actually more concerned with opposition to religion than I am with atheism. As far as I’m concerned, of course there isn’t any ‘evidence’ for God; thus God is hardly amenable to the scientific method and is purely a matter of personal belief. And tempting though it might be to think otherwise, my view is that people should be free to believe whatever they like as long as it doesn’t restrict my ability to do the same. Having studied a little occultism in my time, I know that beliefs are very powerful things.
They are very powerful, too, in areas that are more amenable to scientific enquiry, such as in the case of homœopathy. I am quite certain in my own mind that homœopathy is to be deprecated, and that “there’s nothing in it” in physical terms. The idea that water can contain the “memory” of specific substances, but not all the other substances that have passed through it at one time or another since the dawn of time (and still contain that even when the water is removed) seems ridiculous to me on a physical level.
On what we might call a “magical” level, however, it’s fine because belief systems are very powerful indeed and should not be underestimated. The scientific name for this particular magic, in the case of homœopathy, is “the placebo effect”, and it can literally work wonders. The fact is, however, that there really is nothing else to it, and for the National Health Service in the UK to spend money on placebos when it could spend it on medications that have been proved to have an objective effect, I find absurd. It is also absurd that vast amounts of money can be made by various companies selling “homœopathic” remedies that have nothing in them. (The real challenge as far as I am concerned is how do we harness the undeniable power of the placebo effect without being dishonest and unethical. However, this is not the purpose of this article.)
I am wholeheartedly behind the “skeptics”, therefore, when they pile in on topics like homœopathy, snake-oil “alternative” or “complementary” remedies of one kind or another and other examples of heinous woo, like “bomb detectors” based on dowsing (poorly-understood dowsing, not properly implemented at that, though I doubt that made any difference) that appear to quite literally kill people.
I’m in the audio field and nothing annoys me more than tales of special rocks or wooden coathangers that, when placed on top of audio components or in your listening room respectively, will allegedly make them sound better. I do not believe that electrons must pass through a cable in one direction only, or that they have to be “flushed out” from time to time by applying DC to them. Nor that speaker cables need to rest on ceramic pylons. In particular, I believe that digital audio does you no harm and even if it did, “applied kinesiology” would not tell you anything about it. And so on.
I am also firmly on the side of science when it comes to anthropogenic global warming. Indeed, there really isn’t an opposing view on this of any merit in the scientific community, and not because anyone is discouraged from looking or any of those other ‘denialist’ accusations, but because alternative theories just don’t have the evidence behind them. This is an example of one of those topics (like creationism) where balanced coverage ought to reflect the scientific consensus, and opposing arguments not simply be given equal time. Equal time is not balance: it represents bias towards the view deprecated by those best-placed to know, as I have noted elsewhere.
“Alternative medicine” is important, because you are messing with people’s lives. I have lost more than one friend because they were persuaded to take woo remedies instead of getting proper treatment. The aforementioned “applied kinesiology” when used to “detect” allergies, for example, might be deadly. As far as I am concerned, there’s a name for “alternative” or “complementary” medicine that works: it’s called “medicine”. And you find out if it works via clinical trials, systematic reviews of results published in peer-reviewed journals and the rest of the panoply of the scientific method as applied to medications. Homœopathy generally fails on these tests, for example, and its occasional successes seem to rely more on “bedside manner” and other placebo-related effects than anything else. Yes, I am aware that “big pharma” pulls tricks on what appears in the journals and so on, but I am also aware that “big alternative pharma” is at least as duplicitous (and big) and two wrongs don’t make a right.
However, I get rather more uneasy when “skepticism” approaches science’s boundary areas. (I am really not sure what the argument is for calling it “skepticism”, by the way: as far as I am concerned it’s simply a US preferred spelling that’s — as often is the case — closer to its classical origin than the way we spell it in Britain. I find the answer given in this article rather weak.)
Parapsychology is a particular case in point. Over the years I have largely overcome my initial dislike of James Randi’s assumptions that unknown things are automatically the result of fakery because he and his associates (see the James Randi Educational Foundation site) are so on the money about so many things, and excellent at exposing the charlatans who are out to make a dishonest buck. But today the attitude there, and in many other skeptic environments, seems to me to be that the paranormal is a con and thus any proper scientific study of it is equally at best not worthwhile and at worst a con too. I am sure a great deal of “popular” parapsychology indeed is. But all of it? Proper “scientific” parapsychology? I tend to think not. You could say exactly the same about psychology, for example, not to mention other “softer” sciences like economics. But few people do.
As far as I am concerned, parapsychology is a real and valid area of scientific research. I am lucky enough to be acquainted with two people with PhDs in the field, and although they came to rather different conclusions about it (and I believe do not get on with each other), their work and my own study of publications in the field over some years suggest to me that it really is worth proper research. I am also aware that there have been dubious pieces of work in the field over the past century — as there have been in a great many areas of scientific discovery — and the odd bad apple is not a good reason to denigrate an entire field.
The big problem in parapsychology, it seems to me, is this: when the paranormal first began to be studied scientifically over a century ago, the big question was, “Do psychic powers and/or phenomena actually exist?”, and the answer today, as it was then, is, “We simply don’t know”. That must be a rather depressing conclusion for parapsychologists: that their field hasn’t got anywhere since the foundation of the Society for Psychical Research in 1882.
Susan Blackmore (whom I recall, hopefully correctly, as being responsible for the above observation) is no longer working in the field (today she works in consciousness studies), but her account of her experiences in parapsychology, In Search of the Light, is definitely worth a read.
I would be very surprised if she was of the opinion that the paranormal was a scam and that everyone working in the field was to be vilified and treated as a charlatan. As far as I recall, her last word on the answer to the Big Question of parapsychology was indeed “We don’t know” — despite the fact that she encountered her own share of dubious research during the time she was involved. Parapsychology research inevitably involves a lot of statistics, and occasionally people fiddle the numbers. I seem to recall that the odd astronomer and medical researcher has been known to do this too, but the result has not been to deprecate astronomy or medical research. Instead you simply tackle the perpetrators, who are in a tiny minority.
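As a minimal sketch of the kind of statistics the field leans on, consider a card-guessing experiment where chance success is 1 in 5 (as with Zener cards); the trial counts below are invented for illustration:

```python
# Exact one-sided binomial test for a card-guessing experiment where the
# chance of a correct guess is 1 in 5 (as with Zener cards). The numbers
# of trials and hits below are invented for illustration.
from math import comb

def binomial_p_value(hits, trials, p_chance=0.2):
    """P(at least `hits` successes in `trials` guesses by chance alone)."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(hits, trials + 1))

# 30 hits out of 100 where chance predicts 20:
p = binomial_p_value(hits=30, trials=100)
print(f"p-value for 30/100 hits: {p:.4f}")
```

A p-value like this is the start of an argument, not the end of one, which is exactly why honest handling of the numbers matters so much in this field.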
Thus I find it annoying, to say the least, when “skeptics” take the position that we know the paranormal doesn’t exist and that it’s all charlatanism. It’s simply not the case: we do not know that. It isn’t even that there’s no evidence of psychic phenomena: it’s that the evidence is inconclusive. That is not the same as saying it doesn’t exist. There is perhaps an argument for looking at what is most likely to move the field forward from the current situation of what might appear to the lay observer to be an impasse, but I am sure parapsychologists have plenty of ideas in that subject.
There are other areas, and other people working on the fringes of science, who have not been treated particularly well, and, I think, undeservedly so. It’s been suggested that Dr Rupert Sheldrake was dishonestly treated in the making of Richard Dawkins’ series Enemies of Reason. Lynne McTaggart, author of The Field and The Intention Experiment, who may be known to many people via the film What the Bleep…, has been taken to task by Ben Goldacre as a result of what she claims was an error by someone else, followed by unwarranted criticism.
Now, I have a lot of time for Ben Goldacre. I put up a video of his excellent presentation at last year’s OpenTech conference and I’ve sent him funds to support his Bad Science web site. I think that by and large he does a wonderful job. But he does seem to me to have overstepped the mark here. Equally, I have issues with interpretations of modern science — of quantum mechanics in particular, such as those of Fritjof Capra or those in What the Bleep… — that go beyond those of most reputable scientists in the field. But… I’ve never liked the Copenhagen Interpretation and prefer the Transactional Interpretation of Cramer, which is hardly mainstream, so who am I to talk?
Science has dramatically increased our knowledge of how the Universe works and without it we would be in a state worse than the Dark Ages (it’s also got us into some big trouble, but that’s not what we’re talking about here). It’s one of the tools to help us demolish superstition and especially, in my view, the dangerous, destructive, evil and deadly superstition of religion.
But science does not have all the answers and never will, because there is always more to discover. In addition, science moves forward by new hypotheses being presented, and tested by experiment, that give us answers that fit the facts better than what we previously thought. The last thing it needs is to refuse to look at something because an a priori judgement (i.e. one that doesn’t involve doing any actual science) asserts that said ‘something’ doesn’t exist.
Just because you can use fakery to make something appear to exist (such as a psychic ability), it doesn’t mean that it doesn’t exist. You could use fakery to appear to send an audio message from here to the other side of town, but that doesn’t mean that telephones are impossible. It doesn’t even make them less likely. And don’t give me any of that Occam’s Razor stuff.
Occam’s Razor in essence suggests that the hypothesis embodying the fewest new assumptions is most likely to be the correct one. To most people, the idea of telepathy, particularly in association with telephone calls, is rather familiar, so the idea that you might guess correctly who is calling you on the phone via telepathy is not an unlikely hypothesis at all (let’s not get into whether it’s telepathy or clairvoyance now, thank you). That it is regarded as unlikely by scientists might result from the fact that they know more about how things work than the layperson, and thus have a better idea (public opinion is so wrong on so much science); but it could equally mean that they don’t regard it very highly because it’s not currently favoured as an explanation. In which case, how are you going to find out if it ought to be favoured if you don’t look, and say instead (without having looked) that it must be something else? There is something circular here.
The hypothesis we consider to be the most reasonable may depend on what we know, but that really isn’t sufficient. To re-wire a previous analogy: if, during the 19th century, I told you I could transmit a sound message instantaneously from here to the other side of town, would the idea that I might be using a new, currently unheard-of invention called the telephone be the hypothesis embodying the fewest new assumptions? I don’t think so. It would, however, have been the correct one.
It seems to me that in parapsychology, as in other “fringe” areas, you need to prove things to a far higher standard than you would in more conventional fields, and this Occam’s Razor thing is the reason. If ordinary scientific standards of proof held for parapsychology, there would be no question that psychic phenomena exist. However, because the claims made are extraordinary, the proof must be extraordinarily rigorous too. I am not entirely sure that this attitude is justified, especially when it seems as if special efforts are made to ensure it stays that way. It becomes a self-fulfilling prophecy. Extraordinary to whom? To people who have already made up their minds. If the evidence is inconclusive (which I believe to be the case in parapsychology) rather than non-existent, then what’s required is better, more rigorous experimentation, not no experiments at all.
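As an aside, it may help to see what the “ordinary standard of proof” actually looks like in a guessing experiment of the telephone-telepathy kind. The numbers below are my own illustration, not taken from any actual study: with four possible callers, chance expectation is 25% correct guesses, and an exact binomial test tells you how surprising a given hit count would be if only chance were at work.

```python
from math import comb

def binom_tail(n: int, k: int, p: float) -> float:
    """One-tailed p-value: the probability of scoring k or more hits
    in n trials when each trial succeeds by chance with probability p."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Hypothetical telephone-telepathy run: four possible callers,
# so chance expectation is p = 0.25 (25 hits expected in 100 trials).
p_chance = 0.25
print(binom_tail(100, 40, p_chance))  # 40/100 hits: well below 0.01, hard to put down to chance
print(binom_tail(100, 27, p_chance))  # 27/100 hits: entirely consistent with chance guessing
```

The point is that the test itself is neutral: whether a p-value of 0.01 counts as “proof” or is waved away as needing a still-smaller threshold is exactly the question of evidential standards discussed above.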
There’s an interesting discussion between Dr Sheldrake and Dr Richard Wiseman which mentions this topic on the Skeptiko website. And again, interestingly, Dr Sheldrake appears to encounter a rather unhelpful attitude to open investigation from Dr Wiseman, the latter again being someone I normally have a great deal of time for. It really pisses me off when people I regard highly seem to me to “let the side down” in this way (Dawkins, Goldacre, Wiseman, I mean you).
We really need to be careful about this stuff. We do need to be open to new ideas and not entertain a fixed, inflexible view of the way the Universe works: that way lies scientism, a perversion of science into dogma that is as far from the scientific method as is religion. We need to be searching for the truth, not trying to score a point (I hate it in politicians: I hate it in scientists). We need to avoid setting arbitrarily high hurdles for proof just because we don’t like what someone is attempting to prove: the reasoning behind such apparent evidential prejudice has to be sound and transparent.
Here’s Sheldrake on “Skepticism”:
“Healthy skepticism plays an important part in science, and stimulates research and critical thinking. Healthy skeptics are open-minded and interested in evidence. By contrast, dogmatic skeptics are committed to the belief that “paranormal” phenomena are impossible, or at least so improbable as to merit no serious attention. Hence any evidence for such phenomena must be illusory.”
Now don’t get me wrong: most of the time I’m with the “skeptics” — even if they can’t spell. But what I would not like to see is for the word “skeptic” to become synonymous with what McTaggart calls “Bullyboy Science”. Instead I would advise true “sceptics” to do their best to avoid dogma and keep an open mind.
An interesting response to the apparent overenthusiasm in the skeptic camp is the establishment of the web site Skeptical Investigation, which attempts to redress the balance somewhat. It has five sections covering “investigating Skeptics”, “Controversies”, “Open-minded Research”, “Scientific Objectivity” and “Resources”. I by no means go along with everything on the site, but it is very much worthy of study. Approach it with an open mind, wontcha.
Further reading:
September 18, 2010: When does ‘Skepticism’ become dogma?