Saturday, December 6, 2014

Safe Family Computing

Kid-raising is tough work. Helping them grow, discover talents and abilities, explore, gather skills, and progress successfully through all the stages of childhood (and adulthood!)... to learn independence and good judgment, good work habits, a sense of who they are, and why and how to bless others.

In this new everything-is-connected-to-everything world, what can we do to keep them safe while they grow? I offer a few suggestions from experience, both mine and that of people I trust.
  1. Establish good communication channels, and keep them open. The home is an educational institution. The best teachers listen twice as much as they speak. Eat meals together without the TV on. Study from the "best books" together. Pray together. Set aside a special time to visit with each child and don't deviate. My favorites are tucking them into bed, and when they're too old for that, taking them to lunch or dinner.
  2. As physical walls protect from earthly elements, spiritual walls protect from that which can harm the spirit. Learning self-mastery is essential, and "walling off" danger zones with rules and tools is both possible and necessary.
  3. Keep computer displays in plain sight and out of bedrooms. Is this not a no-brainer?
  4. Set and enforce time limits. Tools: TimesUpKidz!
  5. Practice regular oversight and review using Web accountability/parental Internet control tools: ($) CovenantEyes, OpenDNS; (free) K9 Web Protection
Most attacks on our digital devices (and our souls) are wrapped up in alluring and shiny packages. So while these tools are useful to shield our young ones, OpenDNS also helps shield the entire home's Internet connection from attackers and rogue, infected devices.

Much more has been and could be said, but mentioning these few links is a good starting point for many family-centered homes. I welcome your experiences on the rules and tools that you (or your parents) have used.

Friday, July 13, 2012

Between a firewall and a hard place


Primer

A computer's firewall is a security program that limits communications with the world outside of the computer. Much as its namesake in a vehicle separates and protects the passenger compartment from the engine compartment's dangerous heat and moving parts, a computer's firewall aims to secure and separate it from dangerous communications that carry parasitic and destructive elements such as viruses, worms, and trojan horses.


Many computer security products include a software firewall in addition to antivirus and other tools. Simply put, it consists of an engine to analyze communications against a list of rules.
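
That "engine plus rule list" idea can be sketched in a few lines. This is a conceptual illustration only, not any vendor's implementation; the rule fields, port numbers, and the default-deny policy shown here are my own illustrative choices.

```python
# Minimal sketch of a firewall rule engine: each connection attempt is
# compared against an ordered rule list; the first match decides, and
# anything unmatched is denied by default.
from dataclasses import dataclass

@dataclass
class Rule:
    direction: str  # "in" or "out"
    port: object    # an int, or None to match any port
    action: str     # "allow" or "deny"

def evaluate(rules, direction, port):
    """Return the action of the first matching rule; deny by default."""
    for r in rules:
        if r.direction == direction and (r.port is None or r.port == port):
            return r.action
    return "deny"

rules = [
    Rule("out", 443, "allow"),  # web browsing over HTTPS
    Rule("out", 53, "allow"),   # DNS lookups
    Rule("in", None, "deny"),   # refuse all unsolicited incoming traffic
]

print(evaluate(rules, "out", 443))   # allow
print(evaluate(rules, "in", 3389))   # deny
print(evaluate(rules, "out", 6667))  # deny (no rule matches -> default deny)
```

Real products match on far more than direction and port (program, remote address, protocol, and so on), but the first-match-wins, deny-by-default shape is the common skeleton.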

The issue

A personal computer's landscape is expected to change over time. New programs and capabilities are added, updates to existing programs are installed, or perhaps new staff members are assigned. [By the way, not all capabilities, programs, or staff members are equal. There are insecure capabilities and programs, as well as demon programs and demon users. Some become so tightly integrated into the fabric of a business that they are simply endured. There's an old saying about it being better to deal with known problems than to trade them for someone else's.]

New threats also emerge, so many that signature-based antivirus is becoming overwhelmed, and when that happens a computer's performance can suffer. Cloud-based antivirus can help, but that's another issue.

Whether the change is in a computer's landscape or in the Internet itself, security requirements change as well, and this affects a firewall in very fundamental ways.

Logically, then, disabling a firewall increases the attack surface of a computer, opening it to potentially unwanted communications from outside. This can be dangerous to both a computer's health and its owner's financial health. Incoming probes against a computer typically search for unsecured services and software vulnerabilities through which they can either infect or manipulate it. Unwanted outgoing communications can be a parasite sending off confidential data, requesting instructions or targets, or pulling in additional destructive software. So limiting outgoing (egress) communications has some distinct benefits if a computer happens to become infected.


A good firewall can provide a way to safely adjust to an evolving operating environment. When a new communication source is noted, a firewall can either deny it outright, or better yet ask for additional input to make a new rule (dynamic ruleset). The goal is to determine whether a trust should be established. Since the computer isn't smart enough to determine trust, a pop-up appears on the display asking for direction from the smarter operator.
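
The "dynamic ruleset" loop described above can be sketched as follows. This is a simplification for illustration, and every name in it is invented; the `ask` callback stands in for the pop-up that consults the smarter operator.

```python
# Sketch of a dynamic ruleset: when traffic matches no existing rule, the
# engine asks the operator once and remembers the answer as a new rule,
# so the same question is never posed twice.

def make_engine(rules, ask):
    """rules: dict mapping (program, direction) -> "allow"/"deny".
    ask: callback posing the trust question to the operator (returns bool)."""
    def check(program, direction):
        key = (program, direction)
        if key not in rules:
            # No rule yet: consult the operator and record the decision.
            rules[key] = "allow" if ask(program, direction) else "deny"
        return rules[key]
    return check

# Simulate an operator who trusts the browser but not an unknown program.
decisions = {"browser.exe": True, "unknown.exe": False}
check = make_engine({}, lambda prog, direction: decisions[prog])

print(check("browser.exe", "out"))  # operator is asked once -> allow
print(check("browser.exe", "out"))  # remembered rule, no second pop-up
print(check("unknown.exe", "out"))  # deny
```

The key design point is that the operator's answer becomes a persistent rule, so interruptions taper off as the ruleset learns the computer's normal behavior.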

The challenge: Common Sense


And so you would expect that the operator would be able to answer a simple question about trust. People are familiar enough with the day-to-day operation of their computers that they could answer the question properly more than 8 out of 10 times. But the rub is that many firewalls don't allow the operator to answer the questions they pose. I'm reminded of the child who holds his hands over his ears and shouts "I can't hear you!" over and over. And my answer to that is "why on earth did you ask if you aren't going to allow an answer?" Yes, I talk to computers. Fortunately few talk back, but again I digress.


It is summarily ridiculous to require that users also run at elevated levels (administrator accounts) in order to answer the firewall software's prompts for guidance. How many times have we been told that administrator accounts are far less secure for day-to-day operations? Are you listening, ESET? Kaspersky? Productivity also suffers, and users, managers, and small business owners quickly assume that the firewall is simply too annoying for prime time, and that IT may not know its business too well. Nobody likes to take a beating, and so the firewall is turned off, or user accounts are promoted just to avoid the annoyance. Both are bad situations. I promise, it happens in IT departments all over the world.


This failure to interact with a normal user may be a technical limitation, but it would seem to my college-educated brain that it doesn't make sense for security products to effectively compromise security in the quest to create a protected environment. Perhaps it's just the natural selection process... bad products suffer when their designers make bad choices. And then again, maybe some of the blame for this should be laid at the feet of another entity that will probably be out of the consumer operating system industry sooner rather than later.

Alternatives

Because the purpose of a firewall is to impose limits on network communications, alternatives exist that can limit the attack surface of a computer or network to a list of safe Internet neighborhoods and companions. The fact that other layers of security can take over some of a firewall's duties is a boon to IT people. Proxy servers, web filters, and DNS services such as OpenDNS can serve this purpose. While I still recommend use of firewall products, this last service in particular has done much to help keep my family safe from unwanted trips to the Internet wasteland, and gets my firm nod of approval.
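
Conceptually, a filtering DNS service works by answering lookups for blocked domains with the address of a safe "blocked" page instead of the real address. Here's a minimal sketch of that idea; the domain names and addresses are invented (drawn from the reserved documentation ranges), and this is not how OpenDNS is actually configured or implemented internally.

```python
# Sketch of DNS-level filtering: lookups for listed domains resolve to a
# block-page address instead of the real one, so every device using this
# resolver is shielded, with no software installed on the device itself.

BLOCKLIST = {"malware.example", "wasteland.example"}
BLOCK_PAGE_IP = "203.0.113.1"  # stand-in address for the "blocked" page

def filtered_resolve(domain, real_resolve):
    """Return the block-page address for listed domains; else resolve normally."""
    if domain in BLOCKLIST:
        return BLOCK_PAGE_IP
    return real_resolve(domain)

# A toy stand-in for real DNS resolution.
fake_dns = {"school.example": "198.51.100.7"}

print(filtered_resolve("school.example", fake_dns.get))     # 198.51.100.7
print(filtered_resolve("wasteland.example", fake_dns.get))  # 203.0.113.1
```

Because the filtering happens at the resolver rather than on each machine, pointing the home router's DNS at such a service covers every device on the network at once, which is exactly why it also helps against rogue, infected devices.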

Thursday, June 28, 2012

IT Recommendations and Standards

As with any hired expert, advice provided by a computer professional should meet certain benchmarks, commonly referred to as "best practices" and "industry standards." These are based (to one degree or another) on various technological and economic factors, such as risk potentials, measured reliability, market health, purpose of use, business size and strength, and technology trends.

"Best practices" and industry standards both improve and change over time with the onward march of technology, and form the basis for operations planning, training, maintenance procedures, and repairs. As a basic example, connections to banks, online storefronts, and social networks require appropriate security protocols to protect identity and credentials in order to meet challenges posed by predators on the Internet. As the scope and nature of predatory practices change, security protocols and practices also change. The complexity of both passwords and proofs of identity has increased in the past few years.

Computers should also meet minimum standards in order to function reliably. Reliability is measured by the ability to readily start (boot), shut down, compute, maintain safe operating temperature, store data, display information, respond to operator input, communicate with networks and peripheral devices, and interact with the operating system (Windows, OS X, iOS, Linux, etc.) and productivity software.

Three key factors to usability and reliability are:

  • Physical health
  • Ability to run selected software
  • Security

Physical Health

The items most at risk of physical failure are fans and data storage drives, mostly because they are electromechanical devices with one or more electric motors that experience bearing wear. If you hear clattering or humming coming from a computer, chances are it's from a fan blade or motor. Storage disks also have fine tolerances between moving parts. Failures in power supply electronics are also common, and on-hand spares are recommended. Semiconductor failure in memory devices is common as well. Replacement of these parts depends on market availability, and as technology advances, parts built on older technology go out of production and become unavailable.

Ability to Run Software

A software purchase should receive careful scrutiny with regard to its capabilities and computing system requirements. Software must match the operating system versions and computer capabilities that it was designed for. In fact, software requirements typically dictate computer selection and configuration. It must peacefully coexist with other software, and not overwhelm the computer. Productivity software and operating systems are often co-dependent and age-dependent. Old software doesn't often get along well with new computers, and will both reduce productivity and increase support costs. In addition, many software systems created for business are designed to interact with other software. When one changes, the other(s) may have to change as well.

Security and Inter-connectivity

Computers requiring Internet access must meet additional criteria in order to remain functional and secure in an ever-changing Internet and market landscape. Security and online safety are, for the most part, measured by the ability to mitigate dangers posed by others on connected computer systems, who can probe for vulnerabilities and inject hostile or parasitic software that takes advantage of those weaknesses and provides unwanted access.

Summary

Computing devices, plans, and practices that are found outside of industry standards and "best practices" should be considered obsolete (or nearing obsolescence), and should be improved or replaced. For all of these reasons, today's computers have a normal useful life of between 3 and 7 years. Failure to budget for timely replacements will lead to unexpected failures and additional support expenses. Because of changing technology and markets, making replacement computers backward compatible with old software and networks will also likely cost more. In the end, replacing a computer approaching obsolescence may be less expensive, especially if major changes in technology have occurred.

An I.T. adviser can measure systems, practices, and goals against industry standards and "best practices," technology trends, and business plans, and can provide recommendations and planning that will meet goals and advance productivity and profitability.

Adherence to "best practices" and industry standards provides long term benefits for a disciplined person or company. Without a plan to move forward with technology, the usual result is moving backward financially.

Wednesday, February 29, 2012

The Road to Somewhere

My experience with computers began with a gift from my father. As he worked for Mountain Bell (yes, part of the former AT&T) in the billing department, he had the foresight to recognize the coming age of computers. The gift was a book, "The Analytical Machine" - and frankly I wasn't much interested at the time. Hardy Boys mysteries were more my style (I'd finish off a couple a day). As a matter of fact, I don't know if I ever finished that book.


Those who believe in destiny would probably place the blame there. As a sophomore in high school I found that the math department had two TTY (teletype) terminals in a partition of one of the classrooms. Both used slow acoustic modems -- the kind that Matthew Broderick used in "WarGames" with that cool but clunky telephone handset coupler. Speeds of a few characters per second were typical, and communications delays of several minutes were common on the school district's time-share computer.


One was a spiffy IBM model 1050 or 2741 with a Selectric keyboard and wide-paper printer. The other (a Teletype ASR-33, I believe) was something out of an early-1950s movie, complete with machine-gun-like sounds, yellowish paper rolls, and clunky round electro-mechanical keys. We learned to program in BASIC, and source code was saved on paper punch tape.


One thing led to another, and I ended up studying digital and solid-state electronics design while majoring in Electrical Engineering at Brigham Young University. Growing disillusioned and frustrated by a 4-year program that was quickly evolving into 6 years, and 1-credit classes that required about 12 hours of non-class work per week, in 1981 I left to experience what the world had to offer. For a few months I pushed a lawn mower for my brother-in-law, and then was hired as a clerk for a Las Vegas moving company. 


While the job market had at first seemed a more attractive option, west coast entry level opportunities didn't really pan out as hoped. I soon moved back to live with family in suburban Denver, and explored employment in many areas, including mobile DJ (my first business), 911 dispatcher, and irrigation systems repair dispatch. Then a great job appeared: engineering and post-production of educational video at a studio near Boulder, which was my last full-time employed position.


While there I shared a house with 3 other guys. Two were students at the University of Colorado, and one worked for IBM's production facility nearby (you see where this is going). While production was slow at the studio (which is another story entirely) I became well acquainted with an original dual-floppy IBM PC with a 1200bps modem and 384k of memory (a screaming machine at the time). Those were the days of DOS 2.x and long before Microsoft became a household word. After spending about 30 hours a week on the PC (I did say work was slow, right?), I was far more familiar with the "Personal Computer" than even its owner. Sorry Bob, hope you've caught up.


When the studio became insolvent I moved back to Las Vegas (family had migrated there in the meantime) and found work as a temporary clerical worker, which opened a few doors to people who needed assistance setting up their first PCs, along with small networks and custom databases. Networking led to several other clients, and seeing that computers were going to be an important part of my life I took out a credit line and purchased my first PC - an ITT Xtra 8086-based PC clone with 640K of RAM, a 20MB half-height hard drive, and a 13" amber-monochrome monitor. 


I was in business for myself, where I've been ever since.