Sunday, November 30, 2008
In the meantime, we called a national plumbing chain (I won't mention their name) and they sent someone out who gave us a free estimate of $1300, but who knows whether they would have put a good water heater in for that price? In any event, that's way, way too much, so they were dismissed.
Saturday came around and I decided that it was just too likely that the water heater had to be replaced. It was 11 years old, and I strongly suspected that it wasn't the 12 year warranty kind. One thing you can take to the bank is that an N year warranty water heater will last for about N years + 1 day before giving out. So I decided to attempt to save some money by at least starting the demolition of the old unit myself. It was pretty easy, and I managed to get the old unit out all by myself. The gas and water connections came off with a wrench. The T&P relief valve had a sweated connection that I needed to desolder before I could unscrew the rest of it. Lastly, the chimney was held on with a couple of sheet metal screws.
We went to Home Depot and bought a 12 year warranty 40 gallon natural gas heater. It was about $550 or so, but the extra-cheap 6 year warranty units were about $400, so I think that's pretty clearly money well spent.
In retrospect, I probably could have installed the new unit myself. The new unit is a little taller than the old one was, because between then and now the building code has an added safety requirement - a special sealed combustion chamber that has a spring-loaded door held open by a thermal fuse. The idea is that if the burner area overheats, the door will spring shut cutting off the combustion air flow, choking off the fire. I had Gus install the new water heater mostly because of the fear that the size difference was going to make a difference between it being easy and being hard. The worry was that the water connections were going to need to be moved, which would have involved tearing out some sheetrock in the back of the water heater alcove. But there are flexible copper pipes and there was enough flex left for the new heater to fit. The only other work needed was to trim the chimney to fit and to re-plumb the T&P relief piping.
The last advice Gus had was to keep the Home Depot receipt, because Home Depot has a reputation for doing anything they can to low-ball you on any warranty claims. I stapled it to the door of the water heater closet. He also said we should drain it once a year, but that doesn't mean emptying it all the way - just opening the drain valve and pouring off a gallon or two is enough.
Thursday, November 27, 2008
Wednesday, November 26, 2008
My hobby is building small airplanes and one of my favorites is a Davis DA-2A, winner of the Outstanding New Design contest in 1966, the same year my Oldsmobile (and my current Thunderbird convertible) was built. That little Davis can teach us a lot about cars.
I didn't build my DA-2A, but I am rebuilding it right now and know it intimately. My Davis is an all-aluminum two-seater with an 85-horsepower engine. The engine was built in 1946, the plane in 1982, and the whole thing cost under $4,000 at the time, though today I have more than that invested in the instrument panel alone. The plane weighs 625 lbs. empty, 1125 lbs. loaded, has a top speed of 140 miles per hour and can travel about 600 miles on its 24-gallon fuel tank.
Why can't I buy a car like that?
Um, because a fender-bender would kill everyone, stupid!
Cars used to be made like that airplane of yours. Then Ralph Nader wrote a little book called "Unsafe At Any Speed," and the fit hit the shan.
The threat models for airplanes and cars are entirely different, and they require entirely different priorities. Cars need to be crashworthy, because crashes are very, very frequent compared to airplanes.
Those who can't, teach. Those too stupid to even teach, commentate.
Sunday, November 23, 2008
Saturday, November 22, 2008
It's just not quite enough to display
So 10 watts of output power, or 50 watts of ERP, doesn't get you much, I guess.
Friday, November 21, 2008
I've talked about 8VSB quite a bit on this blog. It is 8-level Vestigial SideBand modulation. Reduced to its most basic description, you take a carrier and amplitude modulate it with a stepped waveform that has 8 legal amplitude levels. The result is a tremendously wide double-sideband signal. You then pass that signal through a Nyquist filter that reduces the signal down to 6 MHz of bandwidth, and that's what 8VSB is. In actual fact, most modern 8VSB modulators don't work that way; they directly synthesize the equivalent waveform, but the net result is the same. The symbol rate of 8VSB is about 10.76 megabaud. And this is fundamentally why 8VSB is vulnerable to problems with multipath. There are nearly 11 million symbols per second, meaning that the symbol interval is only about 93 nanoseconds long. In general, the shorter the symbol interval is, the tighter the tolerances are.
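A quick sanity check of the 8VSB numbers, sketched in a few lines of Python (the 10.76 Msym/s symbol rate is the published ATSC figure; the rest is derived from it):

```python
SYMBOL_RATE = 10.76e6    # 8VSB symbols per second (ATSC)
BITS_PER_SYMBOL = 3      # 8 levels -> log2(8) = 3 bits

symbol_interval_ns = 1e9 / SYMBOL_RATE                   # time per symbol
raw_bitrate_mbps = SYMBOL_RATE * BITS_PER_SYMBOL / 1e6   # before FEC overhead

print(f"symbol interval: {symbol_interval_ns:.1f} ns")   # about 92.9 ns
print(f"raw bit rate:    {raw_bitrate_mbps:.2f} Mbit/s")
```

The FEC overhead (trellis coding, Reed-Solomon, and sync) trims that raw 32.28 Mbit/s down to the familiar ~19.39 Mbit/s payload.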
QAM stands for Quadrature Amplitude Modulation. With QAM, you take a carrier and amplitude modulate it and phase modulate it at the same time. It's thus a combination of AM and PM that happen simultaneously. As with 8VSB, there is a sampling instant when the receiver must determine where in two dimensions the signal lies. In general, a "square" QAM constellation with 2^n points (that's 2^(n/2) points along each axis) yields n bits per baud. The more complex the constellation, the lower the symbol rate can go for the same bit rate. But the more complex the constellation, the higher your requirements for S/N become so that you can distinguish the different points from each other. Most cable companies run QAM-256, which is quite a complex constellation - each point encodes 8 bits of data. They can get away with about 38 Mbit/s in a 6 MHz channel because of the lack of multipath and the high S/N ratio typical of cable delivery. You can run QAM over the air, but you would typically do so with only a 16 point constellation, which would net you a much lower channel bit rate than QAM-256, all else being equal. That said, if you were to attempt QAM-16 at ATSC's 19 Mbit/s data rate, the baud rate would be lower than 8VSB's because each baud encodes 4 bits of data instead of only 3.
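The constellation-size tradeoff is easy to tabulate. Here's a small sketch (the 19.39 Mbit/s payload figure is ATSC's; FEC overhead is ignored, so this is illustrative only):

```python
import math

PAYLOAD_BPS = 19.39e6   # ATSC net payload; FEC overhead ignored here

for name, points in [("QAM-16", 16), ("QAM-64", 64), ("QAM-256", 256)]:
    bits = int(math.log2(points))   # bits carried by each symbol
    baud = PAYLOAD_BPS / bits       # symbol rate needed for the payload
    print(f"{name}: {bits} bits/baud -> {baud / 1e6:.2f} Mbaud")
```

The output shows the point made above: QAM-16 needs about 4.85 Mbaud to carry the payload, while QAM-256 gets by with about 2.42 Mbaud, at the cost of a much tougher S/N requirement.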
You might ask about 8VSB and its "constellation." With 8VSB only the amplitude of the signal is used to encode information. By coercing the waveform into a narrow bandwidth, we must give up any semblance of control over the signal's phase. As a result, when plotted on a constellation display, 8VSB's constellation consists of 8 vertical lines. Thus, each baud contributes 3 bits of information, which is why the baud rate of 8VSB must be so high. In theory, you could reduce both the baud rate of 8VSB and the width of the Nyquist filter at the same time, but doing so would make for a mode that wouldn't be compliant with ATSC specifications. By contrast, the DVB specifications incorporate many different modulation schemes, going all the way from 5 to 8 MHz wide.
The third method for delivering bulk digital data over RF is OFDM. OFDM stands for Orthogonal Frequency-Division Multiplexing. OFDM is a divide-and-conquer scheme. The incoming datastream is divided amongst a number of relatively closely spaced RF channels, each of which is either sent with traditional Phase Shift Keying or perhaps a low constellation QAM mode (PSK, however, at its heart is simply a variant of QAM - without any information on the amplitude axis). Because each individual carrier only has a small portion of the data, the resulting baud rate for each carrier is quite low. It does, however, complicate the receiver quite a bit, because it has to be able to receive and decode a number of digital streams in parallel. This is actually less of a big deal than it might seem, however, since it is relatively routine for multiple receivers to operate within a single chip. For example, there exist single-chip multichannel GPS receivers. And, of course, multiple methods exist to synthesize an OFDM "fugue," so to speak. The big downside of OFDM, however, is in its transmitter linearity and overhead requirements (particularly because of its much higher peak-to-average ratio), and its increased S/N requirements for receivers.
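To make the divide-and-conquer point concrete, here's a sketch using DVB-T 2K-mode-like numbers (1705 active carriers, QPSK on each; guard intervals and FEC are ignored, so the figures are purely illustrative):

```python
TOTAL_BPS = 19.39e6   # aggregate payload to deliver
CARRIERS = 1705       # DVB-T 2K mode active-carrier count (illustrative)
BITS_PER_SYMBOL = 2   # QPSK on each carrier

per_carrier_bps = TOTAL_BPS / CARRIERS         # each carrier's share
per_carrier_baud = per_carrier_bps / BITS_PER_SYMBOL

print(f"per-carrier rate: {per_carrier_bps / 1e3:.1f} kbit/s")
print(f"per-carrier baud: {per_carrier_baud / 1e3:.2f} kbaud")
```

Each carrier ends up running at only a few kilobaud, which is why OFDM symbols are so long and why multipath echoes that would wreck a 10-megabaud signal barely matter here.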
Tuesday, November 18, 2008
It's actually two amps.
The big one only has about 37 dB of gain. My thinking at the time was that its output at maximum input would not be sufficient for my needs. But now, my thinking has changed. If I feed the thing the maximum power from the exciter of 12 dBm average, the output power would be 79 watts average. Well, as we now know, the peak-to-average ratio of ATSC is 5 dB (at least it is 98% of the time), so that's actually 250 watts peak, which is probably too much (we'll have to get the thing on a scope to see for sure).
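The dBm arithmetic above, sketched out (the 5 dB peak-to-average figure is the one assumed in this post):

```python
def dbm_to_watts(dbm):
    """Convert a power level in dBm to watts."""
    return 10 ** (dbm / 10) / 1000.0

EXCITER_DBM = 12.0   # exciter maximum, average power
GAIN_DB = 37.0       # the big amp's gain
PAR_DB = 5.0         # assumed ATSC peak-to-average ratio (98% of the time)

avg_dbm = EXCITER_DBM + GAIN_DB
print(f"average: {dbm_to_watts(avg_dbm):.0f} W")            # 79 W
print(f"peak:    {dbm_to_watts(avg_dbm + PAR_DB):.0f} W")   # 251 W
```

So 12 dBm of drive into 37 dB of gain lands right at the ~79 W average / ~250 W peak figures in the text.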
I bought that amp, plus a companion amp designed to boost the signal from -2 dBm to up to 5 watts. But I think, in retrospect, that given what I've learned that I'll probably just run the big amp by itself.
The first stage amp does have the advantage of having a power control pin that can be used to reduce the gain, but the exciter also has software output power control.
One thing that the big amp lacks is a PTT line (that is, something that can key the bias voltage for the modules on and off). This is something I'll likely need to add myself. The kind folks at the factory said they'd be happy to help with that when I got to that point.
It also lacks a cover. I'll need to get some sheet metal bent somewhere to fix that.
First step, however, will be trying it out on the air. But that will have to wait for a 28 volt power supply.
I'd like to publicly thank Ken and Lance - you two know who you are. I'd mention the company name, but I am not sure whether they would appreciate the publicity or not. If they would like a plug, I'll edit this post and add one happily. They are very nice people to deal with.
There is a block of M-LMS allocation that ends at 909.75 MHz. If the last 750 kHz of this grant are in use, then I will have to change from 909-915 to 910-916 MHz. Fortunately, this is still a reasonable choice. On the TVC-9S, it's channel 2 instead of channel 1.
The only issue is the NARCC band plan for 33 cm, such as it exists. That band plan has ATV at 909-915 and a digital allocation at 915-917.
This doesn't really square all that well with the reality on the ground, given that the range between 904 and 909.75 MHz is potentially part of the M-LMS block A allocation.
The NARCC plan also has an ATV channel on 922-928 MHz, which means it overlays the repeater output band at 927-928 MHz. That doesn't make sense.
The Mt. Diablo ATV folks have a repeater output from 918-924 MHz. This makes sense.
What would make sense in addition would be for NARCC to move the digital sub band from 915-917 to 916-918 MHz, move the lower ATV channel from 909-915 to 910-916 MHz, and recognize the reality that M-LMS probably denies us access to 904-910 MHz.
What am I going to do?
Well, it rather depends on whether there are current digital users between 915 and 916 MHz here in the Bay Area and/or the M-LMS license holder is actively using 909.0-909.75 MHz.
I can only hope and pray that both aren't true. If they are, then I will have no choice but to operate on 910-916 (since I can't interfere with M-LMS) or share 918-924 with Mt. Diablo. Both of those options would likely suck.
Monday, November 17, 2008
97.305(c) lists on a per-band basis the parts of 97.307(f) that apply. 97.307(f)(8) applies to all amateur allocations above 51 MHz (except for the 219-220 MHz 1.25m sub-band). 97.307(f)(8) says,
(8) A RTTY or data emission having designators with A, B, C, D, E, F, G, H, J or R as the first symbol; 1, 2, 7 or 9 as the second symbol; and D or W as the third symbol is also authorized.
The emission designator for ATSC DTV is C7W. That is, it is a vestigial sideband modulation (C), two or more digital channels (7), and a combination of different information (W). That means that, according to 97.307(f)(8) ATSC is allowed. Of course the lowest band where even a single ATSC channel would actually fit is the 70 cm band, so it is unusable on 6m, 2m and 1.25m.
The full designator for ATSC is 5M38C7WWT - the 5M38 indicates that the bandwidth is 5.38 MHz (though the channel is 6 MHz wide, there are 300 kHz of guard band at each end of the channel). The last W indicates a combination of video and audio and the last T indicates that the multiplex is via time-division.
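As a toy illustration of how these designators decompose, here's a small Python sketch. The lookup tables are deliberately abridged to just the symbols discussed above, so it's nowhere near a complete ITU decoder:

```python
# Abridged symbol tables - only the designator pieces mentioned above.
MODULATION = {"C": "vestigial sideband"}
SIGNAL_NATURE = {"7": "two or more channels of digital information"}
INFO_TYPE = {"W": "combination of information types"}

def bandwidth_hz(code):
    """Decode a necessary-bandwidth prefix. The multiplier letter sits
    at the decimal point: '5M38' -> 5.38 MHz."""
    for letter, mult in (("H", 1.0), ("K", 1e3), ("M", 1e6), ("G", 1e9)):
        if letter in code:
            return float(code.replace(letter, ".")) * mult
    raise ValueError("no multiplier letter in %r" % code)

def describe(designator):
    """Describe a three-symbol classification like 'C7W'."""
    return (MODULATION.get(designator[0], "?"),
            SIGNAL_NATURE.get(designator[1], "?"),
            INFO_TYPE.get(designator[2], "?"))

print(bandwidth_hz("5M38"))
print(describe("C7W"))
```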
As a starting point, there's this chart from PC Electronics (go to the end of the 2nd page).
To use this chart, you pick the diagonal line closest to the ERP of the transmitter. In the case of my test next Saturday, I'll be using 50 watts ERP, so you'd pick the 20 watt line. Next, you need to determine the delta between the real ERP and the power of the diagonal line in question. In my case, 50 watts is 3.9 dB up, so you can round that up to 4 dB. Next, take the gain of your receive antenna. Let's say, just for the sake of argument, that you have a typical 5 dBd mobile whip antenna. Add 4 dB to 5 dB and you get 9 dB. Now, follow the 20W line on the chart over to where it touches 9 dB. The answer is about 18 miles. That's about how far Redwood City is from Mt. San Bruno. So if I were transmitting analog ATV with 50 watts of power (remember, that's *peak* power at sync), then you'd get a P5 picture in Redwood City, assuming that your mobile whip antenna had a line of sight to the mountain.
Of course, as we've discussed, power in ATSC is measured as average power, not peak. The broadcasters are using about 7 dB less power to cover the same service area, but that's peak NTSC vs average ATSC. In fact, the true PEP of ATSC is 7 dB above its average - which is coincidentally the average delta around here between the analog and digital broadcasters. So I could say, with a straight face, that my 50 watts of ERP is actually 250 watts peak ERP. If you were to run that same exercise on 250 watts instead of 50 watts, you'd find that the range is off the chart - meaning that in real terms the range is limited by topography first. Is that really going to be the case with amateur ATSC? I rather doubt it. But one of the things we need to do is establish the ATSC equivalent to that chart - that is, to determine exactly what our coverage is with a given power level. And if we get a reception report from Blossom Hill or Los Gatos, well, that'll be a pleasant surprise.
Yes, it's the wrong band, but you can compare coverage for 900 MHz by simply subtracting 6 dB from the transmit ERP and using the same chart, as the accompanying text says. It looks like the new target average power output from the 33 cm transmitter will be 75 watts. That's 1.2 dB down from 100 watts. Add that to 6 dB for comparing 70cm to 33cm and you wind up with almost exactly the same number as the KP-20's antenna gain. So the 100W line on that chart will show range for P5 LOS DX with the receive antenna gain being the number you look up on the left side axis. So if you also use a KP-20, you'll find that you should be able to receive from about 30 miles away. This does, however, presume quite a lot: That P5 DX is equivalent to ATSC DX despite any part 15 QRM on the band.
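The band-to-band adjustment above is just dB arithmetic. Here's a quick sketch, where the 6 dB band penalty and the roughly-7-dB KP-20 gain are the figures assumed in the text:

```python
import math

def db(ratio):
    """Express a power ratio in decibels."""
    return 10 * math.log10(ratio)

power_delta = db(75 / 100)      # running 75 W instead of the 100 W line
band_penalty = -6.0             # chart is drawn for 70 cm; 33 cm costs ~6 dB
net_offset = power_delta + band_penalty

print(f"power delta: {power_delta:.2f} dB")
print(f"net offset:  {net_offset:.2f} dB")
```

The net offset comes out around -7.2 dB, which is close enough to the KP-20's gain that the two roughly cancel, letting you read the 100W line directly as described.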
Once again, to receive, you'll need either a computer tuner like an HD Homerun that can tune 420-426 MHz without help or you'll need a standard ATSC TV or tuner connected to a downconverter like the PC Electronics TVC-4S (tuned to channel #1). If you try tuning it in, please be sure to drop a comment regardless of whether you got anything or not. Part of the purpose of these tests is to see how far it can go.
I'll see if I can borrow a 2M HT to take with me. If I can find one, I'll hang out on WB6OQS/R (146.76(-), PL 151.44).
I won't need any help up on the mountain - I'd rather people help by trying to receive the signal. Even if you can't actually decode it, if you can look for it with a spectrum analyzer, that would help. If you attempt to look for it with an analog TV, you'll just see white noise that's difficult (if not impossible) to distinguish from the noise floor.
Aside from PC Electronics, you can find downconverters at North Country Radio, but theirs is tuned with a potentiometer, rather than a PLL, so I'm not sure if frequency stability would be an issue for an 8VSB tuner.
If you are using an HD Homerun, you can use hdhomerun_config to set /tuner0/channel to 8vsb:423000000. No downconverter necessary.
In short, the peak-to-average ratio for COFDM is at least 2 dB greater, and the receiver needs about 4 dB more signal to receive. In other words, you need to pump out 6 dB more signal for the same coverage - that's 4 times more power. The article also mentions that COFDM is more sensitive to QRM than 8VSB. 8VSB used to have a harder time with multipath, but modern receiver chipsets have come a long way in ghost cancellation and have largely caught up with their COFDM counterparts.
The reason we have a different standard of broadcasting than the Europeans do is due to the very different nature of broadcasting in Europe. In general, in the different countries in Europe, they have a relatively small number of broadcasters that have nationwide coverage. It would be as if NBC, CBS, ABC, Fox and PBS each had one channel that was replicated across the entire country with no local programming of any kind. The broadcasters in a particular country serve the whole country with multiple transmitters, obviously. With the advent of DVB-T, design decisions were made (by picking COFDM) that allow for multiple transmitters to operate on the same channel without interfering with each other - a so-called single frequency network (it's not really a single frequency as much as a single channel, but that's just a minor detail). Here in the US, each individual broadcaster has a single transmitter (yes, some of them have translators, but those are special cases), and those single transmitters cover a much, much wider area than the individual transmitters in Europe. For US broadcasters, the higher peak-to-average penalty and the reduced range of power of COFDM would have been a bitter pill to swallow - particularly given the fact that broadcasters here actually pay for the spectrum rather than getting handouts from the government. Given the different reality of how broadcasting works here, it's no surprise that the FCC chose 8VSB.
I mean, to my mind, it's still moot: If the North American market had thousands and thousands of COFDM receivers, then that's what I'd be planning to use. The decision to go with 8VSB in the broadcasting universe was, for us amateurs at least, serendipitous.
This puts the impending arrival of the 200W amp in a different perspective. 200/3 = 66 watts, which is likely what I'll be able to get out of the thing and still be reasonably clean. It may have more headroom above 200W. If it could make it to 250W without clipping, then that would be 83 watts, but I still think I'd be happy with 66. That's 330 watts ERP.
Sunday, November 16, 2008
I was also able to use his wattmeter to confirm the output of the 70 cm amplifier on ATSC, though the issue there is that not all of the power coming out of the amp is actually useful signal, if you're not careful.
My experience with ATSC makes me wonder whether part of that 17 watts is out-of-channel crap that isn't contributing anything useful to my receive signal at the repeater. But this is FM TV we're talking about. Even a class C amp should be fine.
I also am not getting a P5 picture back down from the repeater. I'd say it's P4.5 or so. I suppose some of the imperfection I'm seeing is on the downlink side. I am, however, happy to just get in.
Driving along Skyline from highway 17 to Page Mill Road is no better. But it looks like there is a road running up to the summit of Black Mountain. The Longley-Rice analysis for 50 watts ERP of 423 MHz from up there looks pretty decent. Maybe that will be a good place for the test transmission(s).
In these views, the reference level is actually 20 db higher than listed, because there is a 20 dB attenuator on the input to the SA.
That's the output of the amplifier at minimum driver power - which results in about 6-7 watts of channel average power output. The skirts that are visible at about -25 dBm were present in the output from the exciter, but were slightly lower (maybe 5 dB) relative to the main signal.
That's the output at one notch up from the minimum - which is about 10-12 watts of average power or so, but it's a bit uglier.
Saturday, November 15, 2008
I assembled the amp without any changes in bias (they include some extra resistors you can use to reduce the bias voltage if you want), and without the load resistor on the input (to maximize the sensitivity). The amp is unbelievably sensitive in this configuration. Turning the power up past 4 (on a scale of 1 to 15) saturates the amp (that is, starts increasing the out-of-channel skirts without appreciably boosting the amplitude of the main signal). But so far as I can tell, I'm still getting about 10 watts of channel average power, or 20W PEP (remember, the peak-to-average ratio for 8VSB is 3 dB).
Even at minimum power, the out-of-channel skirts are a bit higher than I would like to see, but it's nothing that a mask filter can't fix, if I were going to run on 70 cm a non-trivial percentage of the time.
Tuesday, November 11, 2008
Monday, November 10, 2008
To just go up on top of a hill and transmit DTV is one thing, but again, the goal here is a repeater.
Pieces still needed:
The final amplifier. I have a line on a 100W model, which would be a boost compared to the DEMI 70W (out of which I'll be fortunate to get 40W and still have it be linear). 100W into the KP-20 I have would be half a kilowatt ERP! From Mt. Chual, that would almost certainly blanket the South Bay nicely.
But it will need a 28 VDC power supply, and I'll then need a 28-12 volt DC-DC converter (@ 1A) to power the rest of the gear.
I'll be needing to place another order with SR-Systems - this time for a 4:1 TS MUX, another MPEG encoder and a DVB-T receiver for 426-431 MHz.
A 2 meter NBFM receiver for a 2 meter talk-in / control channel
A tower camera
A 1.2 GHz FM TV receiver
A Diamond X-6000 receive antenna and a 3 port duplexer (2m, 70 cm, 23 cm).
And lastly, I will need to build a repeater controller. For this, I'll use an Intuitive Circuits OSD generator, a Javelin Stamp, a DTMF decoder (for control) and a 567 wired as a horizontal sync detector (to detect video input from the FM receiver). I'll use the Javelin's UART class to bit-bang serial output to the OSD, another output pin as a PTT control line (which really will just key the amplifier - the exciter will free-run so as to reduce latency), and two pins as video-good input signals - one from the 567 for the analog video input, and the other from the DVB-T receiver.
The whole thing will probably draw about 600 watts of AC power while transmitting.
Sunday, November 9, 2008
The difficult part is that it requires a 28 volt power supply. DuraComm makes one, but then I'll need a DC-DC converter to come up with the 12 volts to run everything else.
Saturday, November 8, 2008
I hooked the transmitter up to the UHF antenna on the corner of the garage (A Diamond V2000, which has 8.4 dB gain on 430 MHz). I measured the transmitter as best I could on the SA and it seems to be generating about 5 dBm or so. With some feedline loss, I'll assume that the net gain of the antenna system is 8 dB, so that gives us a total of 13 dBm - 20 mW ERP.
I put the 1/4 wave vertical on the van, grabbed the 70 cm downconverter and the Insignia TV and went for a drive.... to the end of the block. Google Earth says that over flat ground, I got a maximum range of about 190 meters. I probably could have shouted that far.
So the distance record for amateur ATSC TV at the moment stands at just shy of two football fields.
The amplifier is coming.... someday....
Friday, November 7, 2008
Some receivers that can't process MPEG audio themselves pass the audio out to an S/PDIF port, which gives you a second chance to decode it. This assumes, of course, that your tuner has one of those. Coupon-eligible boxes won't.
In general, tuners intended to work with computers will likely work perfectly, as they simply extract the transport stream from the 8VSB channel and expect the computer to decode it in software. Videolan should have no trouble, so long as it supports your device. Software that comes with the tuners should generally work as well. One big exception is EyeTV. When I've manually tuned the HDHomerun to the transmitter output, EyeTV has simply said that nothing was there. It's possible that EyeTV is insisting on finding the PSIP tables, but I haven't really dug into the problem to figure out what's wrong. I've simply used hdhomerun_config (downloadable from Silicon Dust) to capture the stream and MPEG Streamclip to convert it.
Unless it's otherwise mentioned, all of the tuners listed will need a downconverter for any ham band.
Silicon Dust HD Homerun: works perfectly, and can tune 70cm frequencies without a downconverter.
Samsung SIR-T451: Video good, no audio. I haven't tried the S/PDIF port.
Insignia NS-7HTV LCD TV: Video and audio both work.
Samsung LN-T4069F LCD TV: Video and audio both work.
Insignia NS-DXA1: Video good, no audio.
Obviously, I can't really ask people for reports with their own receivers yet, since I can only put out a couple of mW of power. But once I can get a significant amount of power out, I'd like to hear from owners of as many different receivers as possible to expand on this list.
Thursday, November 6, 2008
So a little edit. Subtract 9 MHz from all of the 33 cm frequencies I've mentioned so far. :)
I got a PC Electronics TVC-9S 33 cm ATV downconverter yesterday and this evening I was able to conduct a complete end-to-end test at 918 MHz. I got a little rubber-duck 900 MHz antenna for the downconverter and fed the transmitter into a Comet KP-20. I had to turn the transmitter power down to minimum, but doing so allowed the HD Homerun to capture the stream without any errors. Not only that, but I was able to plug the downconverter into my little Insignia portable LCD digital TV, and that little TV was not only able to show the picture, but the sound worked too!
In fact, the comparisons are pretty apt. The country was in bad shape, both at home and abroad, at the end of Jimmy Carter's term as president. The country voted for change with both feet and elected Ronald Reagan. He presided over the biggest peacetime economic expansion the country has ever known, and over the demise of Soviet communism. Well, at the end of the Bush presidency, the country's in the shitter again, and the country has voted for change once again with both feet and elected Barack Obama. It will be interesting to see where we are 4 and 8 years hence in comparison to where we were in 1984 and 1988 relative to 1980.
I think either way it'll be telling. If things do improve, then it proves that the republicans and democrats really are highpopalorum and lowpopahighram, just like Huey P. Long said they were. If things don't improve, then I'll look back and say, "See? They said that if I voted for John McCain, the country would still be in the shitter and they were right: I voted for John McCain, and the country is still in the shitter." (with apologies to the old saying about Goldwater and the Vietnam war).
Wednesday, November 5, 2008
I found a PDF that has an explanation.
A spectrum analyzer is a radio receiver, whose output is hooked up to the Y axis of an oscilloscope. The X axis is a sawtooth wave, like a normal oscilloscope, but in addition to sweeping the X axis of the display, it also adjusts the frequency of the receiver.
There's a bit more to it than that, however.
One of the characteristics of the radio receiver that's part of a spectrum analyzer is its "Resolution Bandwidth." This is the bandwidth of the receiver. That is, in order to plot an amplitude value on the display, how wide of a swath of the RF spectrum do you sample in order to determine what that amplitude is?
The resolution bandwidth of a SA is tied to the span of the display. The larger the span, the higher the resolution bandwidth, which means the wider a signal has to be to actually show up. At the same time, the smaller the resolution bandwidth, the slower the sweep has to be in order to display the finer grained samples.
If the signal you're watching fits inside of the resolution bandwidth, then the entire signal contributes to the amplitude that shows up on the display. But if the signal is wider than the resolution bandwidth, then at any one given moment, you're only plotting a fraction of the signal's power on the display.
It turns out that if you know the 3 dB bandwidth of the signal (for ATSC it's 5.83 MHz), and the RBW of the SA, then you can calculate a correction factor: 10 * log (RBW / Signal BW) dB is how far down your SA will show the signal. That is, you subtract that number (which is probably negative) from the displayed power level to obtain the real one.
I took a picture the other day that showed my transmitter's signal at the 2nd division down from the top, and the reference level was -4 dBm, with 10 dB per division. That's a signal strength of -14 dBm. But the RBW was 100 kHz, so the correction factor is -17.65 dB, which actually put my output power at +3.65 dBm, or about 2.3 mW.
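For reference, the arithmetic above can be checked with a few lines of Python, using the post's figures (5.83 MHz signal bandwidth, 100 kHz RBW, -14 dBm displayed):

```python
import math

SIGNAL_BW = 5.83e6   # 3 dB signal bandwidth used above for ATSC
RBW = 100e3          # analyzer resolution bandwidth

# How far down the analyzer displays a signal wider than its RBW.
correction_db = 10 * math.log10(RBW / SIGNAL_BW)

displayed_dbm = -14.0
actual_dbm = displayed_dbm - correction_db    # subtracting a negative adds
actual_mw = 10 ** (actual_dbm / 10)

print(f"correction: {correction_db:.2f} dB")
print(f"actual:     {actual_dbm:.2f} dBm ({actual_mw:.2f} mW)")
```

This reproduces the numbers in the text: a correction factor of about -17.66 dB, putting the actual output power around +3.66 dBm, or roughly 2.3 mW.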
Sunday, November 2, 2008
February 14th, 2009 at high noon, I will be transmitting ATSC (or as close as I can get to it) with as much power as I can muster at 918-924 MHz from the summit of Mt. San Bruno. If equipment and time permit, I will also attempt to transmit on 420-426 MHz. This date in particular was chosen because it is the last Saturday before Feb 17th, the date of the broadcast TV analog shut-down.
The plan is to generate about 150 watts of ERP at 918 MHz, and if I can obtain an amp in time, perhaps as much as 100 watts of ERP at 420 MHz.
The 70 cm transmissions should be receivable using a stock Silicon Dust HD HomeRun. You may need to direct it with the hdhomerun_config command to receive 8vsb at 423 MHz, since that's not a standard broadcast frequency.
The 33 cm transmissions will likely require use of a downconverter in front of an ATSC tuner. The PC Electronics TVC-9S set for channel 4 should work just fine combined with an ATSC tuner set for channel 3.
Those using computer connected tuners should have no trouble decoding the transport stream, while those using TV sets or consumer grade tuners may be limited to video only, as the audio will be MPEG audio rather than Dolby AC3 (which is required by the ATSC spec).
Please spread the news as far and as wide as possible. I'm setting this date this far in advance in the hopes of it becoming a highly anticipated event in the amateur community.
I've gone ahead and ordered one of those MMICs. I'm not confident enough of my surface mount soldering techniques to work on a kilo-Euro worth of equipment, but I have some hopes that I'll be able to enlist someone in the hardware lab at the office to help me.
I also hope that the hardware lab can spare the ferrite bead. I think it's silly to try and order a single quantity surface mount component mail order. I bought a 20 dB attenuator in the same order as the single MMIC just to make the shipping costs worthwhile. With the attenuator, I should be able to run the output of my amp directly into the SA for power measurements and stuff (the SA can take up to 30 dBm - 1 watt - of input power).
Saturday, November 1, 2008
Well, it is conceivable that someone could tune this gear up on UHF TV channel 16, but it would be a shocking violation of FCC rules to do so. Nevertheless, if someone were to attempt something so foolish with a tenth of a milliwatt, and attempt to receive it with, say, a Samsung SIR-T451, they'd find that it actually would show a picture! Unfortunately, the audio wouldn't work, since it's MPEG 2 audio instead of Dolby AC3, but the picture would decode properly despite the lack of proper PSIP and despite the audio being the wrong codec.
For what that's worth.
I was able to use MPEG Streamclip to export the video and audio from my first captured ATSC transmission this afternoon and upload the result to YouTube!
I suspect that there are still some tweaks necessary for full HDHomeRun compatibility - for one thing, I haven't gotten EyeTV to actually bring up the video yet. That probably has to do with the PSIP tables (or lack thereof?). I suspect a firmware upgrade may be forthcoming. :)
The output is -10 dBm, which is less than I had hoped. I will need to boost that, somehow, before feeding it into a DEMI amp. But even so, it was enough to set it running at 420-426 MHz, feed it to my UHF vertical and receive it on the broadcast TV system at -60 dBm! Not only that, but the HDHomeRun actually was able to save the transport stream!
That, unfortunately, is where the good news stops. I don't have the software tools handy to decode that transport stream, probably because I didn't make any attempt to filter the transport stream by program ID or anything.
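For the record, filtering a raw capture down to a single program isn't black magic: transport stream packets are a fixed 188 bytes, begin with a 0x47 sync byte, and carry a 13-bit PID in the next two bytes. A minimal sketch of a PID tally and filter (the function names are mine, not from any particular tool):

```python
from collections import Counter

TS_PACKET = 188  # MPEG transport stream packets are a fixed 188 bytes
SYNC = 0x47      # every packet begins with this sync byte

def count_pids(data: bytes) -> Counter:
    """Tally how many packets carry each 13-bit program ID (PID)."""
    pids = Counter()
    for i in range(0, len(data) - TS_PACKET + 1, TS_PACKET):
        pkt = data[i:i + TS_PACKET]
        if pkt[0] != SYNC:
            continue  # lost sync; a robust tool would resynchronize here
        pids[((pkt[1] & 0x1F) << 8) | pkt[2]] += 1
    return pids

def filter_pid(data: bytes, keep: int) -> bytes:
    """Return only the packets belonging to one PID."""
    out = bytearray()
    for i in range(0, len(data) - TS_PACKET + 1, TS_PACKET):
        pkt = data[i:i + TS_PACKET]
        if pkt[0] == SYNC and ((pkt[1] & 0x1F) << 8) | pkt[2] == keep:
            out += pkt
    return bytes(out)
```

Run count_pids() over a capture and the video and audio PIDs stand out as the heavy hitters; everything else is tables and padding.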
Advances in the state of the art in any field obsolete prior technology, and with that technology go the specialized techniques that were developed to optimize it. Case in point: the iambic keyer. It represents the state of the art in optimizing the transmission of Morse code. Morse code is an obsolete technology, relegated largely to the amateur radio bands nowadays. With the advent of satellite-based search and rescue beacon technology, the last non-amateur use for Morse code (namely the maritime service) has gone by the wayside. Nobody but amateurs, therefore, has a use for an iambic keyer, and I think it won't be too long before you'll need to go to a museum to see one - the furthest development of a technological cul-de-sac.
Thus it is with film. When we think of early films, we think of black-and-white silent films. We think of that because we're used to television, and before color television, TV was itself monochromatic (black-and-white is a misnomer: both TV and film offer a continuous greyscale). But, as I discovered last night, early cinema was not monochromatic. Early TV was monochrome because the actual color that the viewer saw depended on the color of the phosphors that were built into his own TV set. Thus, everybody saw exactly one color - which tried to be as close to a neutral grey as possible.
This was not the case for film, however. While the actual photographic process was greyscale, the film stock itself could be tinted. Within the single film we saw last night, I counted at least 3 different film-stock tints. These differing tints were used by the film's creators to change the tone of the scenes. This is something that was impossible for television before the advent of full color broadcasting in the 1960s. When you saw a film on TV, it was greyscale, period (unless you put colored films or other such trickery in front of the tube).
Not only that, but certain scenes in the film we saw last night were actually in full Technicolor! Color photography was in its infancy in the 1920s. It was nightmarishly expensive, but it could be done. In addition, it was possible for much less money to highlight a single color - the Phantom's red cape, for instance - in a particular scene. This was also done in the film we saw last night.
In addition, it was not unheard of for some filmmakers to have certain elements of their films hand tinted. An example of this is still preserved today in the Criterion Collection edition of Jacques Tati's film Jour de Fête, where the French flag is tinted red and blue.
Lastly, before the advent of synchronized soundtracks, it was customary for the projection frame rate for films to be variable. Usually, instructions were provided to the projectionist along with the reels of film for what speed various scenes were to be shown. Sometimes the projectionist would ignore those instructions and do whatever they felt was right (or perhaps they were just lazy and set one speed at the start). Because of that, individual experiences in viewing a single film could actually vary. Essentially, there is a tradeoff between slower speeds that flicker a bit less and use less feet-per-minute of film, versus faster speeds that make fast action less blurry.
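The feet-per-minute figure is easy to work out for 35mm film, which runs 16 frames to the foot (4 perforations per frame, 64 per foot). A quick sketch:

```python
FRAMES_PER_FOOT = 16  # 35mm film: 4 perforations per frame, 64 per foot

def feet_per_minute(fps):
    """Film consumed per minute of projection at a given frame rate."""
    return fps * 60 / FRAMES_PER_FOOT

print(feet_per_minute(16))  # 60.0 -- a typical slower silent-era speed
print(feet_per_minute(24))  # 90.0 -- the later sound-era standard
```

So cranking up from 16 fps to 24 fps eats 50% more film for the same running time, which is real money when prints are being struck and shipped.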
With the advent of synchronized soundtracks, it was necessary to stick with a standardized frame rate (the industry chose 24 fps) to ensure that the pitch of the sound didn't vary, but at the same time, one of the tools used for decades to customize the performance was lost.
With modern digital video technology, resolution and frame rate are, once again, adjustable. There's nothing that would prevent someone from varying the frame rate by scene. But the problem is that it likely wouldn't do any good, since most displays simply adapt the incoming material's frame rate to the native refresh rate of the display. Making matters worse, some displays either do a lousy job of this, or fail outright when faced with non-standard refresh rates. For instance, most cartoons are animated at a maximum rate of only 12 fps (with adjacent frames of the 24 fps film being identical), and even that only during action sequences. The reason for this is the enormous cost of animation. It would make sense, therefore, to MPEG encode such cartoons at 12 fps. But this typically isn't done. Instead, the encoding is done at 24 fps and redundant, nearly empty frames are sent in the extra time.
With the advent of television and full color movies, these techniques were rendered obsolete. In the case of television, the viewer could only see monochrome anyway, and in the case of movies, full color made the other tricks unnecessary. It is only in experiencing what must be characterized as an early cinema museum performance that we in the audience were privileged to get a glimpse of the highest state of the art of early cinema.