Lee Hart and 1802-based computers vs. PICs - Proteus and BASYS

From a Lee Hart post in cosmacelf, a Yahoo! group, Sept 4 2009. Lee responded to a discussion about "first projects" using 1802 microprocessors as controllers. Ted Rossin posts in support of PICs, and John Porubek explains why he does not like PICs. Other points are discussed, such as looking back at other micro-trainers, and considering programmable devices to emulate one of several microprocessors. The focus of this note, however, is on the 1802 and Lee Hart's thoughts on it and designs around it, for use and for self-education. For the full discussion, go to the Yahoo! "groups" page, find the cosmacelf group, and look for the Sept 2009 thread "Tiny ELF, ELF2K, EELF2K, Membership Card. Which to choose...". Content here was edited and made available by Herb Johnson, last edited Feb 16 2012. Made available with permission of Lee Hart. See the Web page "http://www.retrotechnology.com/memship/memship.html" for more information on what became the "Membership Card". Also see the Web page "http://www.retrotechnology.com/memship/mem_basys.html" for more BASYS information. - Herb Johnson

-------------------------

[Lee Hart responds to a post requesting advice and direction for a first-time user of microprocessor-based controllers, after a story of using a Cosmac ELF in that way.]

When I didn't "know any better", I did a number of projects like [yours] as well. With the 1802 at first (the first computer I got to work), and later with other small computers.

Matt Biewer (of Prolog) was one of the first people to realize that a microcomputer is a crappy "computer" -- but it is a wonderful programmable *controller*. It is much easier to program, and far more versatile, than an op-amp or 555 timer or a tangle of TTL chips.

There are a few requirements to make this work. The micro's hardware and software need to be so easy to use that you can almost ignore them and simply concentrate on the problem. Like using a hammer -- you pay no attention at all to the hammer, and just focus on the nails and the boards.

With most computers, you are forced to spend so much time "screwing around" with the hardware and software that you never do get any useful work done. It's like trying to use a gold-plated ultra-high-tech robo-hammer with an LCD force gauge and laser-guided striking head. It takes you an hour to set it up and figure out how to use it, just to drive one nail.

When I was at Technical Microsystems, I designed a little box I called "Proteus" that I used for all sorts of quickie testing and measurement functions. It was an 1802 computer in a box, with all its power supplies and I/O pins brought out to a solderless breadboard socket on top. It was programmed in FORTH, which is exceptionally easy to use for quick test and measurement applications. If I got some new chip I wanted to test, or some hardware gadget I needed to exercise, or some circuit I wanted to try, I wired it up on a Proteus box. I've never found a faster way to get things going. I tried to get it made into a product, but it never happened.

Today, the Parallax BASIC Stamps are a little bit like Proteus. But they don't provide the breadboard socket or power supplies to make interfacing easy. And their "BASIC" programming language isn't really very basic at all.

[One respondent suggested a BASIC Stamp, Arduino, or similar modern product, but "he wanted to try an ELF computer" because of its simplicity.
"...give the 1802 power and a clock, and off you go."] One big difference between a [Cosmac] ELF and the BASIC stamps (and Arduino etc.) is that all these other computers require a PC to use them. The ELF is the only one that works stand-alone; all by itself. An expert that is very familiar with PCs has no problem. But for a beginner who either doesn't have a PC, or has no idea how to use it to talk to some external device, this can be a major roadblock. It leads into a confusing maze: What kind of port does my computer have (RS-232, USB, parallel), where do I get the right cable, do I have software drivers that will talk to to it, does Parallax or whoever support my particular operating system, etc. Then, since they assume you have a PC, they also assume you are familiar with using it and programming it. So, their programming languages and documentation are all written for people with these skills. It looks easy to a computer geek; but can be incomprehensible to someone who has never programmed anything in their life (which means 99% of people today). Parallax has tried harder than most to make their products accessible to beginners. Their Board of Education is pretty good; but it still comes across like a PhD in mathematics trying to teach elementary school math -- he knows so much that it's hard for him to appreciate just how *little* the kids know. "Fine; two plus two is four. What's a four?" To get started with the Board of Education, you really need a *teacher* looking over your shoulder, to explain things or show you things when you get stuck. And of course, you're going to have to go through hours and hours of studying with the Board of Education course before you "pass" and are ready to actually do the project you had in mind. > Give it power, a clock, some ram and a method for input/output and > away you go. No messing around with EPROM's just so you can interface > with it and hand off instructions. Fire up some machine language and > it understands what you want it to do. Yes, that describes the ELF! Start with Tom Pittman's "A Short Course in Programming" (on the www.cosmacelf.com website). Within the first hour, you'll have it sensing switches, making decisions, and turning lights on and off. > I think he needs a "Proteus" type machine too. Basically, that means an ELF with screw terminals and drivers so it's easier to connect big switches and heavier loads. [Ted Rossin posted about using the PIC, and Lee quoted and responded, as below:] ted_rossin wrote: > Unless you want to load your program every time you turn the board on > you will still need to deal with EPROMs or EEPROMs or battery backed > RAM. Sure; but are these so bad? Power consumption is so low that I usually left my ELF on to keep its program. Most of my early programs were so small that reloading them wasn't a big deal. [But] adding a battery backup was easy. Now there are RAM chips with the battery backup built in; no external parts at all. The 1802 can program an EPROM or EEPROM directly. My ELF computers simply wrote to it as if it were RAM (except that the EPROM was one-time write). The circuit for programming a 2716 was just a switch and three 9v batteries to provide the programming voltage, and a 555 timer to generate a 50msec HALT for each write pulse. ted_rossin wrote: > You can still program a PIC or Arduino in machine language and not > use a PC if you really want to go the primitive route. How, pray tell? I know of no way to get code into it without a PC. 
> I think he needs a "Proteus" type machine too.

Basically, that means an ELF with screw terminals and drivers, so it's easier to connect big switches and heavier loads.

[Ted Rossin posted about using the PIC, and Lee quoted and responded, as below:]

ted_rossin wrote:
> Unless you want to load your program every time you turn the board on
> you will still need to deal with EPROMs or EEPROMs or battery backed
> RAM.

Sure; but are these so bad? Power consumption is so low that I usually left my ELF on to keep its program. Most of my early programs were so small that reloading them wasn't a big deal. [But] adding a battery backup was easy. Now there are RAM chips with the battery backup built in; no external parts at all.

The 1802 can program an EPROM or EEPROM directly. My ELF computers simply wrote to it as if it were RAM (except that the EPROM was one-time write). The circuit for programming a 2716 was just a switch and three 9V batteries to provide the programming voltage, and a 555 timer to generate a 50 msec HALT for each write pulse.

ted_rossin wrote:
> You can still program a PIC or Arduino in machine language and not
> use a PC if you really want to go the primitive route.

How, pray tell? I know of no way to get code into it without a PC. There are no provisions for external memory, or a front panel. For versions that have serial downloaders, the format is so complex that a computer is required to talk to it.

> Retro computing is fun but for new designs one should really move on. A single
> chip microcontroller like a PIC 16F628A or 16F876A could not be simpler to use.
> Here are some links for those who would like to see what has happened
> over the last 30+ years.
> [microchip.com]
> http://www.tedrossin.x10hosting.com/Electronics/Pic/Pic.html

No; it is simple to *build* (one chip). It is not simple to *use* (program).

Yes, the 1802 architecture and instruction set are primitive. But in this case, primitive also means simple: simple to learn, simple to use, and simple to remember. I can still remember the entire instruction set (both the mnemonics and the binary), and I know the architecture by heart. In contrast, the PIC hurts my head; it's like a miniature version of the Byzantine x86 architecture, with all sorts of complex rules and special cases.

[Ted Rossin responds to Lee's critique:]

[About power consumption:] I've had the same 2 AA batteries powering a PIC for 5 years now. When it is not flashing LEDs to spell out words when somebody shakes it, it goes to sleep and consumes 700 nA!

[Lee wrote:]
> I know of no way to get [machine] code into [a PIC] without a PC. There
> are no provisions for external memory, or a front panel. For versions
> that have serial downloaders, the format is so complex that a computer
> is required to talk to it.

This is just not true. If you read the programming spec, the serial format is simple enough that you could use TTL parallel-to-serial shifters and a counter to send the desired commands and data (from toggle switches) to program the thing up. Since it would take so long to flip the switches between commands, there is no need for precise waits; those timings are only in the spec to speed up programming. Addressing is sequential, just like the 1802's DMA mode.

Also, since most PIC processors can self-write their flash program memory, once you program one up, it can contain code to implement a front panel. I only program a PIC once through the normal programming port, with a tiny boot loader that allows me to modify code through an (on-chip) RS-232 port. There are sources that will sell PICs pre-programmed with such code, or you can buy an evaluation board for $30 with a PIC and USB cable so you can download your own or others' boot code. Then, pull the chip out and pretend it is something else.

[Lee wrote:]
> Yes, the 1802 architecture and instruction set are primitive. But in
> this case, primitive also means simple; simple to learn, simple to use,
> and simple to remember.

I don't follow your logic, since the PIC has 35 instructions, which is way fewer than the 1802 has. I'll admit that it does have crazy new instructions like CALL and RETURN, instead of having the user write call and return functions and manipulate the program counter. [Ted listed the 14-bit PIC instruction set.]

[Ted later wrote that the PICs are very inexpensive and that supporting software hides many machine language programming issues. He also said:]

The point I was trying to make is that an 1802 is not a good choice for a new design. I just feel it is like telling someone that MSI TTL (counters, decoders, ...) is evil and that you should use transistors, diodes and resistors to solve all their problems because they are fun. So are relays, knife switches and lantern batteries, but it is not something I want to do again.
P.S. Don't get me wrong. I love the 1802 and have kept mine going since I first built it on perf board in 1977 from the Popular Electronics article. I even have a little page devoted to it: http://tedrossin.x10hosting.com/Electronics/RCA/RCA.html

[Others in cosmacelf argued the 1802 instruction set is less primitive than the PIC's. One argument posted against the PICs was that they are a "dead end" in terms of the variety of problem-solving hardware and software.]

[There was discussion of the difficulties and limitations of PIC assembly language, and of the limited C compilers for them. Also, discussion of a VHDL or programmable hardware implementation of an 1802, as part of a way to easily run several kinds of processors on one piece of hardware.]

[Lee responds:] My understanding is that the PIC hardware was originally designed to be as cheap and minimalist as possible. This led to a terrible architecture and poor instruction set. They expected the assembler and compiler to make up for it. "No stack? No problem! Just create one in software..." But as technology marched ahead, it's gotten cheaper and cheaper to bury the software and architectural limitations with more hardware. So now, the PIC is a weak CPU with tons of powerful on-chip hardware peripherals.

[As for a] VHDL implementation of an 1802 and peripherals: There's something else that I think would be even better. A CPU with a user-definable instruction set.

Way back in the dark ages when I was in college, Xerox PARC was at the leading edge of computer technology. They built a computer called the Dorado. It was the hardware that Apple saw the first GUI running on (bitmapped screen, mouse, ethernet, etc.). Xerox was programming it in Smalltalk; perhaps the first object-oriented programming language. The underlying CPU was a 16-bit processor built with dozens of TTL chips. Its instruction set was in RAM, and could be changed! For example, there were instruction sets for various popular computers of the day: Apple II, IMSAI 8080, TRS-80, etc. To run software for some other computer, the Dorado duplicated its instruction set and peripherals to *become* that computer! Thus it ran these other machines' software *directly*, at full speed, rather than needing to emulate them. If I could write VHDL, this is the CPU I'd use it to create. Then, it could become an 1802, 6502, 8080, Z80, or whatever.

[A PLD version of the 1861 by Armstrong was mentioned: http://www.sparetimegizmos.com/Hardware/Elf2K_Accessories.htm#STG1681 ]

While it's an interesting project from a technical point of view, what would be the intended application? The real 1802 is still available, and likely to use far less power than a gate array programmed to emulate an 1802. The gate array could be much faster; but if speed is the issue, there are better architectures. (The 1802 takes 16 to 24 clock cycles per instruction.)

Personally, I like the simplicity of the 1802. So my goal would be to find a way to use *even less hardware* to do the same job. A brilliant step in that direction was the 8-pin PIC that someone (I forget who) programmed to emulate not just the 1802 but the entire ELF! Moving in that direction, could one develop a serial bus version of the 1802 architecture, where one 8-pin chip is the 1802, another is the 1861 (for video output), another is the memory (using standard serial memory chips), and some serial-to-parallel chips provide general I/O ports?
[In common with a series of "I hate PICs" messages, John Porubek posted:]

I'm usually a lurker in this group, but I couldn't resist the urge to weigh in with my opinions. I, too, don't particularly care for PIC micros, and I'm not entirely sure why. Maybe it's because the original chips had next to no hardware support for a decent stack. I'm a big fan of programming in Forth for micros, and you can't do Forth without a real stack. Or maybe it was the weird, non-standard assembly language mnemonics, especially if you were used to the 6502/6800 style of mnemonics (or even 808x mnemonics). I know that PICs have come a long way since those early days, but I still feel a small sense of disappointment when I read an article in Circuit Cellar and discover that the project is based on a PIC. It's also a big reason why I keep telling myself I'm going to let my subscription to Nuts and Volts run out next time it's due.

Like most people in this group, I got my start in micros with the RCA 1802, building my own improved version of the ELF with muxed data AND address displays and a hex keypad. I was enamored with all things COSMAC and acquired a VIP, a Studio II, and even an RCA Micromonitor. I then progressed to 6502-based computers with an AIM-65 and a Commodore 64 before assembling my first PC. I dabbled a little with the 8085 via a Radio Shack Model 100 and a Kyocera 85. I'd like to get back to all of them someday as part of my own mini-museum - maybe if I can ever retire!

I like AVRs from what I've read about them, although I don't have any hands-on experience. A brief professional exposure to an 8051 project was painful, although this may not have been the 8051's fault - it was being over-taxed with bank-switched memory, and the code I had to interface with was badly-written C spaghetti code. I have mostly pleasant experiences with 68HC05s, 680x0s and ColdFires.

However, my current processor of choice is the TI MSP430. In some ways it could be claimed to be the cultural descendant of the 1802. Like the 1802, the MSP430 is designed for low-power applications, although modern semiconductor process technologies make the MSP430 a far lower power device than the 1802 could ever have hoped to be. Like the 1802, the MSP430 architecture is based on sixteen 16-bit registers, although the program counter and stack pointer registers are fixed for the MSP430, unlike the 1802! There are versions of Forth available for both the MSP430 and the 1802 at http://www.camelforth.com. There's even a version there that I ported for the TI eZ430-RF2500T low-cost USB stick development tool.

Mark Graybill, your website looks really interesting, especially your 8085-based micro trainer [at http://saundby.com/electronics/8085/]. I plan to spend some spare time looking through all that's available there. Thanks for putting it up on the web!

Lee, I used to dream of owning a Proteus box! Or, more likely, building my own, as I couldn't afford/justify the expense of buying it. I still have the product data sheet in my "project idea" folder. I even bought one of your BASYS boards at the Rochester (NY) Hamfest and tried to wheedle information out of you about your version of Forth for the 1802 - you called it 8th if I recall correctly. I never populated the board, but I still know where it is and have all the parts for it! - John Porubek

[Lee responds:] John, thanks for your kind comments. If you still have the board, there's still hope! :-) All the parts are still pretty easy to find.
[To use an MSP430 running Forth], that makes sense, too. Have you looked at Charlie Moore's ColorFORTH? It's a *very* interesting concept! He's only implemented it on a Pentium-class PC, but like all FORTHs, the concept could easily be brought up on any micro.

[Lee responds to Mark Graybill's 8085 trainer:]

Mark, this 8085 trainer is a really neat project. What's especially nice is how you started it "from scratch": you built it on a breadboard, CPU only at first, then added memory and I/O as you went along. No need for a PC, fancy test equipment, etc. You "lifted it up by its bootstraps". This is easy to do with an 1802, and not too bad for some of the older CPUs like the 8085, Z80, 6502, etc. But it has become very difficult with many of the newer chips.

Mark Graybill wrote:
> Thanks, Lee.
> It's not an Elf, but I'm trying to keep it as simple as possible...

Mark, I think you've done very well! If the goal is to teach beginners about computers, it has to be a very small, simple, straightforward, almost trivial system. It needs to teach the basics quickly, so students don't get bored. Yet it needs to be rich enough so they can continue to learn and grow. And it can't be too far out from mainstream computing, or what they learn won't be directly applicable to 'big' computers. In brief, they want to climb the ladder quickly, climb high, and have it take them where they want to go.

I've given this matter a lot of thought. I'm concerned that today's computers are so complex that beginners never learn the basics. And without a solid foundation, their solutions are overly complex, expensive, and unreliable. Charles Moore (inventor of FORTH) said, "I despair. Technology, and our very civilization, will get more and more complex until it collapses. There is no opposing pressure to limit this growth."

The original RCA Microtutor is what got me started. It inspired the Popular Electronics ELF, and all that followed. Like your 8085 trainer, it was so simple that beginners could build it, understand it, and expand it considerably. For example, adding video with just one chip (the 1861). Or adding a couple of memory chips, and having a high-level language (Tiny BASIC).

But due to the technological limitations at the time, it was hard to expand the ELF beyond a certain point. Serious amounts of memory, decent video, or mass storage required many expensive chips. Higher-level languages (Pascal, C) were unavailable or very expensive.

I think it would be enlightening to consider what a modern Microtutor/ELF might look like. Big memory chips are easy to get, but peripherals are still a problem. A PC keyboard is easy, but the software to make it work is hard. There aren't any single-chip video systems that I know of. There are disk controller chips, but they aren't easy to interface.

Two ideas come to mind. One is to use preprogrammed single-chip micros to simulate the desired peripherals. Yes, it's a micro; but treat it as if it's a piece of hardware. Another is to use a serial bus to tie the various chips together. This vastly reduces the number of lines a beginner has to wire. There are serial memories, serial I/O chips, etc. So I can envision a computer that consists of a few 8-pin DIPs -- one emulates the 1802, one is the nonvolatile memory, and one has the video output and PC serial keyboard input.

Software: I know C and Java are currently in fashion, but I don't see them as good beginner languages. I'd rather see something like BASIC, FORTH, or Logo.
Logo's strength is that it is specifically meant for beginners, and is a lot like modern programming languages. FORTH's strength is that it is excellent for I/O-intensive applications. Charlie Moore's ColorFORTH has a lot of interesting ideas that could be used.

[There was discussion of a PIC processor with a small pinout, and a serial/parallel chip to permit a binary display and set of switches to run it. The 1802 simulation code was based on prior work, as discussed in the thread "Microtutor II: a modern microcomputer trainer" for Oct 2009 - Herb Johnson]

Maybe; but I don't think that will fly with today's internet-jaded kids. But did you see what they did with Logo and its turtle robots? You put the tiny computer on a robot. The input switches are labelled with symbols for Forward, Back, Left, Right, Grab, Release, etc. And there are keys for Remember and End, and a numeric keypad. You start them off even before they can read, with simple sequences. To move in a square, press

    forward left forward left forward left forward left

and the robot is back where it started. Next, they can have it remember sequences of keys:

    remember 1
    forward left forward left forward left forward left
    end

Now when they press "1", the robot follows the whole sequence to move in a square. They built up from this humble beginning into a full-blown programming language. It taught math; for example, "5 forward" meant go forward 5 units instead of just one. See the classic book "Mindstorms" for more.

So, I would envision something like an ELF on wheels. The switches are the initial way to give it directions to move. As you program it more, it gets "smarter". :-)

Dave Ruske wrote:
> Those "turtle" robots pre-date Lego Mindstorms by quite a bit, though
> similar things have been built with Legos:
>

Excellent links! Thanks, Dave. Truly amazing things can still be done with incredibly simple hardware and software. Today's Logo turtle robots are things like the BeeBot http://www.terrapinlogo.com/bee-botmain.php or ProBot http://www.terrapinlogo.com/pro-bot.php

This is the sort of thing I'm imagining, but with a micro that the student can access at the hardware level (add his/her own hardware), and with software that allows machine-level access to their hardware.

[Charles Richmond wrote:]
> Is this the Lego Mindstorms robot???

No. *LEGO* Mindstorms is an expensive, highly commercialized toy. It is based on the popular Lego building blocks, but adds motors and switches so kids can build models that move. It has a lot of educational possibilities, by connecting its motors and sensors to a PC. But its scope is somewhat limited; like most modern Legos, they expect you to build the specific models that they designed, in certain predefined ways. The software they supply leads you down a very controlled path. There are a large number of small, easy-to-lose parts, which all have to be bought from them.

*LOGO* is a much older computer programming language, specifically created for educational purposes. It was popularized in the 1980's by the book "Mindstorms" by Seymour Papert. Logo is a completely modern, full-featured programming language. However, it has a tiny core of commands that are designed so even children who can't read can learn it. It's typically taught by beginning with a "turtle robot", which could be a physical robot, or a representation of one on the computer screen. As students progress, they learn ever higher concepts in logical thinking and computer programming.
I believe they have been continuously available, from even before [the LOGO robots of the early 1980's] right up to the present. There have been many of them, from several vendors, in several different formats. Milton Bradley made a toy, the "Big Trak", which was essentially a turtle robot with a space-age car body and built-in Logo (see http://en.wikipedia.org/wiki/Big_Trak#Soviet_clone). I think turtle robots and Logo are one of those classic "great ideas" that nevertheless seem to be overlooked by people.

[KB wrote:]
> I still like the idea of an ELF on wheels. Couldn't we get a low cost
> chip (aka one of those sample ones) and give it a program to emulate
> the ELF? And then put some wheels on it and write a kid friendly
> guide. I think that would be marvelous!

Yes; that's the direction I'm leaning. A real 1802 is an expensive, scarce 40-pin chip. A PIC programmed to emulate an 1802 is much cheaper, and with a serial bus is much easier to wire up. Someone (I forget who; I will have to look it up) has already done most of this.

As for a robot, I built one I called "Itsabox" in a 3" x 5" x 7" aluminum minibox, with one of our 1802 single-board computers. It drove a pair of inexpensive stepper motors as the left and right wheels, had bump switches in front and back, and two "fingers" that it could use to feel objects and push them around. It was programmed in 8TH (a "tiny" version of FORTH) to save memory. It worked out very well!

[Dave Ruske posted a notable response to Lee as follows:]

On Oct 4, 2009, at 7:59 PM, KB wrote:
> You know to get kids interested you just need to write a kid friendly
> guide and something that looks like an ELF

How about this: (The translation into HTML was, if I recall correctly, the work of Lee Hart.)

That little book of Tom Pittman's and a Netronics ELF II got me started when I was in high school, and I'm still at it some 30 years later. Twice now I've written emulators for that machine, and Tom graciously allowed me to republish his "Short Course in Programming" in the help for TinyELF (as found online at ).

I agree with Lee's previous posting that it was hard to expand the ELF beyond a certain point, but I'm not sure I ever found that to be a problem. The ELF was a first step to cement my interest in the machinery, and I graduated to tinkering with other microcomputer hardware and languages; there was no need to make the ELF be my first five or six steps.

My nephew recently started his first year in college, and they're starting him out with Java programming and a course in digital logic. For me, programming sort of emerged naturally from understanding the sequences of bytes being fed to the microprocessor, then understanding the guts of Forth, and working up to higher level things like databases and memory management. For my nephew, I guess that connection will be made later. [Lee responds later: "Or never. ;-(" ]

While that's a different way to learn, I'm not sure that it's incorrect in any sense. Times have changed, and jobs that demand assembly-language understanding of a microprocessor have become a rarity. By the time he graduates, how many jobs will even demand skills in memory management? The C++ code that was all the rage at my day job a few years ago has given way to C# and Java, and even Objective-C has gained a garbage collector. Charles Moore is correct that we are working with increasingly complex technology, but I'd hasten to point out that we're also working at higher levels of abstraction.
Just like I didn't need to think much at the transistor level with my ELF, the majority of today's students may know the electronics of the microprocessor only as something they once had to tinker with in college. We don't forge today's microprocessors transistor-by-transistor without the use of tools that hide the details, and people don't typically need to write database-backed web services in assembly language. (I acknowledge, of course, that there are exceptional jobs out there, and for those exceptions a deeper understanding is needed... but in general, the lower-level understanding of a system has become less of a job requirement than it once was, and I suspect curriculums have been adjusted in acknowledgement of that.)

In the end, I guess there are many paths that will take a student to a successful career in computing. Certainly the interns I've met in the past few years have been very competent in the jobs they were hired to do, despite the obvious handicap of never having used toggle switches to bootstrap an OS. :) - Dave Ruske

[Lee Hart responds:]

[Pittman's book] is a really excellent guide for learning to program the 1802. However, there are two improvements I would consider today:

1. Take it farther. Continue into using an assembler, structured programming, a higher-level language (I think FORTH suits its style well).

2. Add more hardware. Show how to actually control something: a night light that gradually brightens as the room darkens [a sketch of such a program appears at the end of this post], a clock/timer to control things based on time, an electric train control, etc.

[As for the "need to make the ELF be my first five or six steps":] There's no "need"; but I think you would progress farther and faster if the path is clear.

> ... in general, the lower-level
> understanding of a system has become less of a job requirement than it
> once was, and I suspect curriculums have been adjusted in
> acknowledgement of that.)

This debate has gone on in education for many years. Now that we have calculators, should we stop teaching manual addition, subtraction, multiplication, and division? Then the kid in the store can't tell that he's been overcharged. Now that we have spell-checkers, should we forget about spelling? Then we get peeple with atroshus speling. Given the prevalence of keyboards, should we stop teaching handwriting? Then they can't write without a computer! It makes people into clever monkeys.

My view is that education is not about learning things by rote. It is about learning how to THINK. If you know the basic principles, you can come up with NEW solutions that weren't in the book! I'm concerned that innovation ceases when there is no understanding of the basics. Someone who programs only in Java cannot imagine doing anything with "only" 256 megabytes of memory (let alone 256 bytes). With such people designing products, there would *be* no pocket calculators!

To Charlie Moore's point, there's a book that more fully explains what he's worried about -- "Normal Accidents" by Charles Perrow. "[Normal Accidents is] a penetrating study of catastrophes and near catastrophes in several high-risk industries. Mr. Perrow... writes lucidly and makes it clear that 'normal' accidents are the inevitable consequences of excessive organizational complexity." -- John Pfeiffer, The New York Times.

You *have* to teach the basics, to give students a solid foundation upon which to build higher knowledge. - Lee Hart
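[As an illustration of Lee's second suggestion above (the night light), here is a minimal sketch of what such a control loop might look like on an ELF-style 1802 system. The I/O is hypothetical and is not part of any standard ELF: it assumes a light sensor readable as a byte on input port 4, and a lamp driver (DAC or PWM chip) on output port 4 whose output brightness follows the byte written to it. A darker room reads as a smaller sensor value, so the byte is inverted before it is sent to the lamp:

    0000  F8 00    LDI 00     ; point scratch register R2 at RAM address 0010
    0002  B2       PHI R2
    0003  F8 10    LDI 10
    0005  A2       PLO R2
    0006  E2       SEX R2     ; make R2 the X (data) pointer
    0007  6C       INP 4      ; read the light level into D and M(R2)  (hypothetical sensor port)
    0008  FB FF    XRI FF     ; invert it: darker room -> larger brightness value
    000A  52       STR R2     ; store the brightness byte at M(R2)
    000B  64       OUT 4      ; send it to the lamp driver (hypothetical); OUT increments R2
    000C  22       DEC R2     ; put R2 back where it was
    000D  30 07    BR 07      ; loop forever, tracking the room light

The same read-compute-write loop is the skeleton of most small control programs; only the ports and the arithmetic change.]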
[A self-described long-time lurker posted:] "I would like to know the easiest way to learn microprocessors without an emulator. In other words: what kind of hardware to use. ... Any suggestions?"

[Lee Hart responds:]

That's exactly what we've been discussing! :-) The problem most of us have is that we know so much today that it's hard to design something that people *without* our years of experience can use! We have to try to remember what it was like when we were young.

That said... you could easily build one of the original Popular Electronics ELF computers, right from the original article (on the cosmacelf website), and it would work. Also, you can learn to program it by reading Tom Pittman's "A Short Course in Programming" (also on this website). There may be better ways to do it; but they aren't written yet. These are things that are available right now. We know they work, because that's how many of *us* learned! :-)

[There was concurrent discussion of the serial-based 1802 emulator mentioned in this thread. See the Web page "http://www.retrotechnology.com/memship/memship.html" for more information about that discussion. Then Lee introduced the following:]

A couple years ago, in a burst of nostalgic creativity, I designed a few ELF versions under the loose heading of "membership cards". The idea was to come up with a bare-bones, basic ELF design that was very inexpensive and easy to build. But sadly, I too suffer from "engineer disease", and kept dreaming up different features and versions, so that no purchasable product came of it.

The comments here have inspired me to dust off what I think is the best of these designs, and actually order boards for it. This is the DEV2 design in the cosmacelf files section. It's a 2-board set, with the ELF itself on one small PC board, and a separate front panel board with the switches, lights, and a parallel port interface for plugging into a PC.

ELF Membership board
--------------------
- 3.5" x 2.15" (fits in an Altoids candy tin)
- 1802 CPU (with adjustable RC clock oscillator)
- one 28-pin memory socket (for standard 2K-32K byte RAMs or EPROMs)
- 8-bit output port (OUT5 or OUT7)
- 8-bit input port (IN5 or IN7)
- expansion header with all I/O and 1802 signals
- operates on 3V-6V at under 1 mA (plus whatever the particular memory chip you use requires)

ELF Front Panel board
---------------------
- 3.5" x 2.15" (plugs onto the Membership board)
- 9 individual LEDs (D0-D7, plus Q)
- 11 subminiature toggle switches (D0-D7, Clear/Wait/Load/Run, and Memory Protect -- exactly like the original ELF)
- 25-pin DB25 connector to go to a PC parallel port (allows the PC to completely control the front panel, and thus load, examine, and run programs, and read/write the I/O ports)

If this sounds suitable for what you want, let me know. I have to order at least 5-6 boards to get a reasonable price, and would like to find some helpers to test them before committing to a larger order.

[A few members responded that they'd like a board set; some said they had parts from when Lee last talked about this project. Again, see the Web page "http://www.retrotechnology.com/memship/memship.html" for more information about the Membership Card and discussion of it. - Herb Johnson]