Getting Savvy About Computer Security

Becoming computer savvy about security does not mean hanging out at MIT. All it takes is assessing a computer’s needs and matching those with a good solution. What can be surprising is that making a computer secure might not even cost anything more than time.

Knowing the computer's operating system is the very first step. Windows, for instance, is in transition: many people are still using Windows 7, others have moved to newer releases, and shortly there will be a new Microsoft platform. Apple’s Macintosh has its own operating system, and there are multiple variations of Linux being produced in the open-source world. All of these operating systems also come in 32-bit and 64-bit versions, and it is essential to note which one is installed. Otherwise, it is impossible to know which version of a security suite is needed.

Antivirus software is often bundled with a firewall and other programs to provide an all-in-one solution for the end user. This software is dedicated to monitoring a computer against the possibility of a virus being installed. Viruses take a few different forms, but they gained their name because they are designed to copy themselves and be passed along to other computers. Once inside a computer system, a virus may steal or destroy data, make a machine inoperable, or undermine system performance.

Most computers today are bundled with an antivirus suite, and the user needs only to activate it. This packaged solution is most often offered on a trial basis, requiring a fee when the trial period expires. However, home users should know that many antivirus providers offer free versions for non-commercial use. AVG, Avast, and Avira all make free solutions for Windows. iAntiVirus is a company dedicated to the needs of Mac users, and it too provides a free non-commercial version. Avast also offers packages for Linux users, but check with the Linux distributor for which package is needed before installing.

Computer virus technology has not remained stagnant. As viruses have evolved, new terms have been coined to describe their activity, and, by necessity, new software has been created to deal with this evolution.

One such category is the browser hijacker: software that takes over a browser and makes it impossible to use the internet the way the user wishes. Clicking on a link may result in landing on a web page that has nothing to do with the user’s intent.

Building a User-Friendly Website

Building a user-friendly website is relatively easy as long as you always remember that the site is there for the user, not you, and consider what they want to see rather than what you want to show off.

The first part of building your site should involve planning your content. Why does your website exist? What is its purpose? What information and services will it provide? When this is clear to you, decide who your target audience is. Is it a site for everyone or for specialists in a particular field? Is it for young or old people? Is it for people who live in one specific area? Is it for people who are computer savvy or not? All of these factors and many more will affect how you can best build your website. But remember, your target audience will not be the only people who visit your website. It must be usable by everyone else as well.

Know Your Target

You need to plan your content in a way that is suitable for both your target audience and the rest of the world. If you are presenting technical information to tech-savvy users, you will need to write technical content that probably won’t make sense to a non-tech user. To help the non-tech user, you could add explanations of the main ideas of your content as well as a glossary to help them make sense of the technical terms. If your site is for teenagers, you should present your content in a way that is familiar to them, but you could add a section to explain to parents what your site does and what you are doing to keep their children safe. If your site is only useful to people who live in a particular city, say so, so that people from other places don’t waste time there. Use your best judgment to make sure that everyone who visits your site will find it useful.

Once you have decided what content needs to be on your site, you should start to think about how it needs to be divided up, to form the structure of your site. Will there be just a few pages or a big collection? Can you split your content into logical sections? There are no rules here; you have to decide how best to display your content based on the content itself and who is going to be looking at it. Try to create a structure that will make sense to everybody if possible.

Navigation

When your content is structured and divided into sensible pages or sections, the structure of your main navigation should become evident. You need to link to your homepage, contact page, and all other significant pages or sections from every page on your website. The navigation menu must be consistent across all pages. If your site content is divided into sections, each section will probably require its own submenu. Try to design your navigation so that users can get to other pages as quickly as possible without facing too many choices in each menu. Choose your words carefully for the navigation links, making it clear to your users where each link will take them.

Your home page should clearly show the user what your site is about, how it will help them, and where they can go to find what they need. Your homepage is not a good place to explain your company policies, show off your new logo or cram as much information as possible. It should have just enough information to tell the user what’s in it for them. The same applies to the main pages of subsections of your site if you have them. Keep it simple, and show the user how to get to what they want to know about.

So far, I haven’t mentioned anything about layout, design, browsers, or coding. That is because your users (unless they are designers and coders and your site is about designing or coding) don’t think about those things at all. They think only about themselves and what they can get out of visiting your website. The information or services you provide are all they are interested in. When you are planning your site, this is all you need to concern yourself with as well.

User Interface

Once you have worked out what content needs to go on what page, how the pages will link to each other, and the main structure of your website, you can begin to think about how to lay this information out on the screen. Users want to know what the page is about, how it will benefit them, how they got there, and where they can go next. All of this information should be as close to the top of the page as possible. This will help them decide whether to read the rest of the page or go somewhere else. They don’t want to see your logo, your motto or a picture of your cat. Of course, it is probably important to you that they see your logo and your motto. If you want to show them a picture of your cat, you might want to go back to the beginning and ask yourself why you are building a website to begin with.

So, the top part of your page should contain your navigation, the page title, possibly a brief introduction to the content on the page. You should make sure that these are prominent and that your visitor doesn’t have to search the page to see them. You can use the remaining space to put your logo and whatever else you think should be there. Remember that some users will have small screens. Test your page in a small window and make sure that the information your visitors want is still visible without having to scroll.

Easy on the Eyes

Design the rest of your page around what is already in place. Make sure the colors you use are easy on the eye and have sufficient contrast for people with black-and-white monitors (yes, they do still exist, and if your user has one, they should still be able to use your website), colorblind users, and people with poor eyesight. Nobody wants to see lime green text on luminous orange, even if you think it looks cool. Don’t use flashing, blinking or scrolling content; it is annoying and can make the page hard to read. Stick to conventional layouts that users expect to see, and don’t confuse them by trying something too different. If you have to explain how your site works, then it’s too complicated and you need to start again.

When designing the appearance of your website, be aware that it will look different on different computers and screen sizes. Try to make something that looks good for everyone. It doesn’t have to look identical on every computer, but it should look nice, clear and easy to use for everyone. Steer clear of large images that take a long time to load or patterns behind the text that make it hard to read. Don’t use an image if simple text will do the job.

When you have designed your content and the look and feel of your website, you will be ready to begin writing the code. If you don’t know how to do this, it is far better to pass the project on to someone who does. There are several ways to build a website without having to write your own code, but they all have limitations and can leave you with poor code that could cause problems for your users when they attempt to visit your site. If you are going to write the code yourself, there are a lot of websites out there that can help you get it right. If you’re not sure about something, look it up.

Stylesheet

Your website code should be written well and should meet specific standards. Firstly, you should write valid markup (HTML or XHTML) to define the structure of each page. If you need to use a scripting language to generate dynamic content, you still need to make sure that the markup it outputs is correct. You should not try to use markup to control how your content looks on the page. It should merely define what parts of your content do what job within the document. Your markup should be well-formed, semantically correct and valid.

When your markup is finished, you should build a stylesheet in CSS to tell the browser how your content should be displayed to the user. This is where you begin to create the look and feel of your website. Your stylesheet should also be valid and well-formed. Again, if you don’t know what you are doing, get someone to do it for you, or get help online from the kinds of sites mentioned above.

While you are writing the code for your website, you should take into account the fact that not all of your visitors will be using the same operating system and browser to access your website. You should test your website in different browsers and make sure that it displays well. You should also try it out on different screen sizes and at different screen resolutions. Bear in mind that some people who visit your site may not be using a modern browser. They could be using an old version of a browser, a text-only browser, a screen reader, a mobile device, an internet TV, and so on. There are several things that you can and should do to make your website more usable for these visitors as well as those using modern browsers.

Accessibility

This part of creating a user-friendly website is known as accessibility. Several rules and guidelines should be followed to make your site accessible. If you don’t know what you’re doing, get help. If you take these rules into account while you are building your website, it is easy to create an accessible, usable site. If you put together poor code and then try to fix it, it can be difficult and stressful.

Once your website is built and ready to go, don’t assume that you’ve finished. Ask your visitors how easy they find it to use your site. If there’s something that they find confusing, change it for them. If there’s something they can’t find, move it or add more links to it. If something is missing that they expect to see, add it for them. If they have trouble with anything, your site needs to change to make it easier for them. It may not end up how you wanted it to be, but your site exists for your visitors, not for you, and you always need to remember that.

Bad Tech Habits to Quit in 2018

With 2018 promising to provide leaps and bounds in the technological field, many current practices should get the boot. Here is a list of the top habits that the computer-savvy individual should attempt to quit in this new year.

Social Networking

Although often touted as the easiest and most efficient way to communicate, sites like Facebook and Twitter are addicting their users to online socializing. The downside of this is a decline in the social skills that our society used to take for granted. These sites also prove to be a giant time sink. In fact, when I terminated my Facebook account, I found I could spend the spare time on other productive things. All the time modern society wastes could be put to better use, and we can all take the first step toward ending this grotesque waste by closing our social networking accounts.

Online Gaming

In the same vein as social networking, online gaming is often responsible for taking away a lot more of one’s time than would be desired. Thousands of hours of time are wasted in excess online gaming that has gone beyond entertainment and has entered the realm of obsession. Online games also tend to serve as a substitute for a normal social life, which can make it hard for individuals to meet other people without a virtual sword in their hand.

Just like smoking and fatty foods, many technological addictions should be kicked out of our system, and there is no better time than the present. We have entered a new year, and it is an ideal time to refine our lives and create a new set of online behaviors that will redefine what it means to spend time online. We should become more productive with technology and not allow ourselves to become slaves to the virtual world.

Social Media Trends to Watch Out for 2018

You know how people say it is terrible to follow fashion trends? Trend watching may not pay off in fashion, but it certainly works for online and social media marketing. The reason is that online algorithms change all the time, and what works now may not work next month. It is therefore crucial to keep up with trends so you don’t get left behind.

The year 2017 taught us a lot of lessons about social media marketing, and one of these is the importance of visuals. Before, text was sufficient to advertise your brand. These days, though, compelling videos and photos are essential to make people take notice of your company. The new generation relies on visuals more than text, and there is also an increasing trend of previous generations opting to watch rather than read.

Another lesson we’ve learned in 2017 is the importance of social media influencers. This year, marketers said their semi-goodbyes to traditional advertising, where company-driven messages are broadcast for consumers to see. Influencers were observed to play a more crucial role, since they are real people with whom social media users interact on a daily basis. What they talk about comes from their own experiences, and whether that experience is authentic now lies in the judgment of the viewer.

What social media trends should we watch out for in 2018? Here are some of the things you should prepare for:

1. The even greater importance of social media influencers

We mentioned that social media influencers became popular in 2017; this trend is expected to grow further in 2018. Because companies have only started exploring this group of marketers, there will be more room to grow in the coming year. Other platforms will also be tested, including Instagram and Snapchat stories. So, if you are looking to leverage your brand, consider partnering with a social media influencer who fits your brand’s principles.

Not surprisingly, even big companies such as Rolex and Philip Stein are now using social media influencers to market their brands further.

2. The rise of live streaming for marketing

This year, we’ve seen high-quality videos and photos published for SEO and marketing purposes. In 2018, we expect to see the rise of live streaming to market the brand further. The reason is that most people now want to look behind the curtain, at what goes on behind the scenes. It is an excellent way to engage with your audience, since they get a glimpse of how you operate as a company. It is going to be staged, for sure, but the key is to make it as close to reality as possible.

Trend watching is crucial if you want to market your brand online. You can follow other companies and try to figure out their strategies so you can see their strong and weak points. It is good to expose yourself to as many examples as possible so you can better understand the direction you are taking. And remember, what used to work for 2017 may not work for 2018. So plan your social media marketing strategies well ahead of time.

Passive Measurement and Analysis (PMA) Project Data Tools

The current PMA measurement design, which utilizes a collection of independent monitors, includes more than 21 passive monitors. These monitors take regular measurements on a variety of networks, ranging from FDDI to OC12 and from ATM to POS. Traces are collected, post-processed, encoded (anonymized), and then made publicly available on the NLANR PMA website for use by other network researchers, as well as students. The passive header trace data provides the means to study workload profiles for a number of measurement points in high speed environments, soon up to OC48, with work under way for OC192 packet tracing capability.

We have developed a number of tools to handle this data, all of which are available on the website. The tools can be classified as analysis or selection tools.

Analysis tools are usually programs which take our traces, or associated data, and either generate the data in a different format, or produce a human readable presentation of statistical data gathered from the traces.
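As a concrete illustration of the first category, the sketch below shows the general shape of such an analysis tool: it reads trace records and produces a statistical summary. It is written in Python for brevity (the actual tools are Perl scripts and C programs), and the record format used here, a (timestamp, packet length) pair, is a hypothetical simplification rather than the actual tsh trace format.

```python
# Hypothetical sketch of a trace-analysis tool: records are
# (timestamp_sec, length_bytes) pairs, NOT the real tsh format.

def summarize(records):
    """Gather simple workload statistics from an iterable of records."""
    count = 0
    total = 0
    first = last = None
    for ts, length in records:
        count += 1
        total += length
        first = ts if first is None else min(first, ts)
        last = ts if last is None else max(last, ts)
    return {
        "packets": count,
        "bytes": total,
        "mean_len": total / count if count else 0.0,
        "duration_sec": (last - first) if count else 0.0,
    }

# Human-readable presentation of the gathered statistics.
stats = summarize([(0.0, 40), (0.5, 1500), (1.0, 40)])
print(f"{stats['packets']} pkts, mean {stats['mean_len']:.1f} B "
      f"over {stats['duration_sec']:.1f} s")
```

A real tool of this kind would read binary trace records from a file rather than an in-memory list, but the pass-over-records-then-report structure is the same.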

Selection tools are predominantly Web based and are intended to assist researchers in finding suitable traces for their analysis needs.

To improve performance, all of the analysis tools are being rewritten in C (previously they were available only as Perl scripts). Three scripts have been rewritten successfully as of March 2002, with more underway. The initial performance increases have been quite promising. The tsh2ta program now executes 10 to 30 times faster than the equivalent Perl script on equivalent hardware: 30 times faster on large traces and 15 to 20 times faster on small ones, with most runs seeing at least a 10-fold increase. The reimplementation of ta2sum has resulted in a three- to four-fold increase in performance; while modest compared to the gain for tsh2ta, this is still significant.

Performance Comparison: tsh2ta.pl vs. tsh2ta (C Implementation)

Motivation: The need for improvement

Every three hours a trace is taken at each of 21+ monitors. The current methodology gives us a ninety-second trace, which is then followed by three hours during which our post-processing, analysis, and retrieval may be done.

The ratio of three hours of processing time to ninety seconds of trace time may seem like plenty. However, as link usage continues to grow and the links themselves get faster, we begin to run into time constraints which may lead to corrupted analyses, corrupted traces, or the outright loss of a trace.

On top of our current problems, we are constantly receiving requests for longer traces, on the order of hours instead of minutes. We have also toyed with the possibility of real-time analyses or continuous traces. Our current tools and methods are not even close to being able to run in real time for any substantial amount of traffic.

Strategy: Reimplementation

All analyses were previously done in Perl. While Perl is a fine language for prototyping and cross-platform compatibility, its efficiency as an interpreted language is less than optimal. Since we do not wish to change our methodology in any significant way, we must improve the performance of our existing tools. This can be accomplished either by enhancing the runtime of the analyses on current resources or by improving the resources themselves.

Our resources are many, vary in performance, and are cost-prohibitive to update. Moreover, even with resources deployed in the future, given the improvements in quality and speed of the architectures deployed, we will still face analysis bottlenecks. An improvement in the software seems justified in this situation. I chose to rewrite all of the tools used on the passive monitors for analyses, starting with the most time-consuming ones. The obvious place to start was the tsh2ta Perl script, which performs a preliminary analysis required by all other analyses and has been known to fail due to resource exhaustion.

Implementation problems

One of the luxuries afforded by Perl is its built-in hash structure. Reimplementing a program which relies on hash efficiency is very difficult in C, especially since adequate, free, standard, and efficient abstract hash tables do not seem to exist for C. I tried out several alternatives, including glib, publib, the possibility of writing my own, and a chained hash implementation by Jeff Brown.

After wading through the API for glib, I discounted it as too complex for our purposes. Publib looked more promising, but consultations with Jeff led me away from it due to implementation issues and possible performance limitations with the datasets I planned to work with. Eventually, I decided to use Jeff’s chash and asked him to tailor it in a few aspects so that it would suit my application.
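To illustrate the data structure in question, here is a minimal chained hash table, sketched in Python for brevity. The names and interface (ChainedHash, put, get) are hypothetical; the actual tool uses Jeff Brown's C implementation, whose API is not reproduced here. Each bucket holds a list (a chain) of key/value pairs, so colliding keys simply extend the chain.

```python
# Minimal chained hash table sketch. Interface names are hypothetical,
# chosen for illustration only.

class ChainedHash:
    def __init__(self, nbuckets=64):
        # One chain (list of (key, value) pairs) per bucket.
        self.buckets = [[] for _ in range(nbuckets)]

    def _index(self, key):
        return hash(key) % len(self.buckets)

    def put(self, key, val):
        chain = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(chain):
            if k == key:                  # key present: overwrite in place
                chain[i] = (key, val)
                return
        chain.append((key, val))          # collision just extends the chain

    def get(self, key, default=None):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        return default

# Typical use in trace analysis: keying per-flow state by a flow tuple.
flows = ChainedHash()
flows.put(("10.0.0.1", "10.0.0.2", 80), 1)
flows.put(("10.0.0.1", "10.0.0.2", 80), 2)      # overwrite
print(flows.get(("10.0.0.1", "10.0.0.2", 80)))  # 2
```

Chaining keeps inserts and lookups simple at the cost of a pointer chase per collision, which is one reason the choice of C hash library mattered for performance.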

My last problem with regard to hash tables was in iterating over them. I found that deleting records from a hash while iterating through its elements can cause bizarre and unexpected results. I managed a workaround for this problem using a global list.
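The workaround can be sketched as follows: during the pass over the hash, keys to be removed are recorded in a separate list (the "global list" mentioned above), and the deletions happen only after the iteration finishes. Python makes the hazard explicit, since mutating a dict during iteration raises a RuntimeError, but the same deferred-deletion pattern applies in C, where deleting a node mid-traversal destroys the pointer the iterator needs next. The flow data here is invented for illustration.

```python
# Deleting entries from a hash while iterating over it can corrupt the
# traversal. The fix: note doomed keys during the pass, delete afterwards.

flows = {"f1": 0, "f2": 5, "f3": 0}   # flow id -> packet count (invented)

doomed = []                            # stands in for the "global list"
for key, packets in flows.items():
    if packets == 0:                   # e.g. flows that saw no traffic
        doomed.append(key)

for key in doomed:                     # safe: the iteration has finished
    del flows[key]

print(sorted(flows))                   # ['f2']
```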

Initial Performance Findings

My first test was on a huge trace taken at AIX which had caused the Perl implementation of tsh2ta to lock up. The initial results were very promising: the C implementation finished in about 10 minutes, whereas the Perl script, on the same machine and under the same load, had to be killed after approximately 300 minutes.