
Monday, January 29, 2007

Is Dial-Up Still Standard?

As web developers, we like to equip ourselves with the speediest computers, the fastest Internet connections, and the most standards-compliant browsers, but how is the rest of the world keeping up?

Albert Listy writes:

With all of the things you hear and see about Ajax these days, I would think that dial-up should no longer be considered the standard. When you look at .NET, you have post-backs and you take care of most of the page control on the server side and just serve up HTML (mostly) to the client. With Ajax you send more files and control substance to the client, which takes more bandwidth.

My web development question is: should we still consider a “dial-up” connection the “standard” for our web design projects?

First off, I should point out that some of the conclusions you’re drawing are a little off. The heavy use of post-backs in ASP.NET is actually a real source of pain for dial-up users, who are forced to reload an entire page with almost every action they take. A well-designed AJAX application, meanwhile, can significantly reduce bandwidth usage by sending relatively small parcels of JavaScript code to the browser at page load, which then allow the browser to handle much of the user interaction without having to reload entire pages from the server.
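To make the bandwidth contrast concrete, here is a minimal sketch of the kind of lightweight response an AJAX front end might fetch in place of a full-page postback. Everything here is hypothetical (the endpoint name, the parameter, and the data), and it assumes a PHP build with the json_encode function available:

<?php
// availability.php -- a hypothetical endpoint, for illustration only.
// A full postback re-sends the entire page; an AJAX call to this
// script fetches only the few bytes of data that actually changed.
header('Content-Type: application/json');

$date = isset($_GET['date']) ? $_GET['date'] : '';

// A real application would query its database here.
$seatsLeft = 4; // placeholder value

echo json_encode(array('date' => $date, 'seats' => $seatsLeft));
?>

The browser-side script can then update just the part of the page that changed, which is exactly where the bandwidth savings for dial-up users come from.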

At last check, more than a quarter of active Internet-connected users in the United States were still on dial-up, and predictions state that they won’t be making the move to the fat pipe anytime soon (due in part to the fact that local calls are free in North America).

Other countries seem to be having more success in fostering broadband adoption, so the answer to this question depends in part on your target audience.

Here at the Yank family cottage, where it’s dial-up or nothing, my sympathies lie with that stubborn minority. Unless your site can better fulfill its purpose by taking advantage of broadband (e.g. a video sharing site), I’d say you should still design with an eye to limiting bandwidth usage. Even your broadband users will thank you when your site loads in the blink of an eye.

Kevin Yank
Sitepoint Times…

Is Ajax the future of Desktop Software?

Since the emergence of Dynamic HTML circa 1997, pundits have predicted the death of desktop applications. Will AJAX prove to be the magic ingredient that makes these long-standing predictions come true?

From David McLeary of Cloud Ten Limited: “The use of AJAX has made it almost possible to replicate most operations of software that is available such as Microsoft Excel. Do you think the use of this is the future, or are the limitations placed on it by, for example, browser security going to prevent it from reaching its full potential?”

Let’s stop for a moment to examine the benefits that AJAX-based web applications have to offer over their desktop counterparts:

  • There is no software installation required, removing a barrier to entry for some users.
  • Users can access the application, along with their data and preferences, from any Internet-connected location with a modern desktop browser.

The latter has had a significant impact on which AJAX applications have been successful in attracting users. AJAX has been successful in spaces like email, calendaring, mapping, and photography, where mobile or multi-location access is clearly beneficial.

More office-oriented applications like spreadsheets will struggle to find a market for AJAX implementations, at least for now. Having ad hoc mobile access to these types of applications may become important to users somewhere down the line, but it isn’t yet. Consequently, users will tend to stick with traditional desktop software solutions, where the interface can be completely tailored to the needs of the application.

That said, I don’t mean to imply that traditional desktop software and AJAX-powered web applications are the only two options. There are many hybrid solutions that are attempting to blur the lines between the Web and the desktop, harnessing the benefits of each to capture the hearts and minds of mainstream users.

For example, Adobe Flex and OpenLaszlo add the requirement of an up-to-date Flash plug-in to the browser, but in return offer greater control over the user experience, and an escape from certain browser sandbox restrictions (such as local data storage).

Closer to the desktop, there is Java Web Start, which can download, install, and launch a desktop application when the user clicks a link in a web page. Upcoming alternatives such as Adobe’s Apollo and Microsoft’s XAML will work in a similar fashion. It will be interesting to see if any of these platforms can achieve the ubiquity of the AJAX-capable web browser as a means of accessing applications away from the home/office desktop.

Simplicity Sells!

This is a good article I found on Sitepoint that was
written by Brendon Sinclair


I booked my wife some plane tickets online last week and it was a very frustrating experience. I would enter the date she wanted to travel, select a timeframe in which she wanted to fly, check the availability of a flight and, when the date I wanted was sold out, I’d have to hit the Back button and start all over again.

Each time, I had to input every single bit of information again. I was so frustrated with the process after the fourth or fifth time that I was ready to leave the site and try the competition.
It was only then that I realized that I didn’t have to go back each time: my details were being saved below the fold, at the bottom of the page. But I didn’t see that the first few times I tried to use the booking service.


I’d love to know just how many people give up on that site because they become frustrated with the process, and don’t realize their details are being saved to the form. With an average air ticket sale being around $500, a 1% shift in conversion rates would add up to hundreds of thousands of dollars per month.

Testing every aspect of your business is essential. After all, you don’t want to spend thousands of dollars on ineffective advertisements, or waste the opportunities that each visitor to your site presents.

Test and measure. Test and measure. Just because huge companies don’t bother to do it doesn’t mean you shouldn’t!

Brendon Sinclair
tribune @ sitepoint.com

Essential Programs For Your Virtual Toolbox

By Kim Roach (c) 2006

As an online marketer and webmaster, there are a number of tools that you will want in your virtual toolbox. To get you started, I have scoured the net to find some of the best programs to help webmasters improve their sites, their rankings, and their productivity. Best of all, each one of these tools is free. We’ll start with one of my favorites.

Good Keywords

Good Keywords is a program that allows you to quickly and easily create an extensive list of targeted keywords for your website using Yahoo, Ask, and Overture. This free tool comes loaded with a link popularity meter, a keyword phrase builder, a misspelled word generator, and a web page explorer tool that will allow you to quickly see what keywords other sites are targeting. Once you have found your desired keywords, you can group them into keyword sets, which can then be copied to your clipboard. To find out more about this handy tool, go to http://www.goodkeywords.com.

BackLinks Master

It is a well-known fact that search engines use link popularity as one of their top ranking factors. However, quality is much more important than quantity when obtaining inbound links. In addition, many of the search engines place importance on the use of relevant anchor text. With a tool known as BackLinks Master, you can monitor your link popularity in Google, Yahoo, and MSN. This tool finds direct links, JavaScript links, and others. In addition, you will be shown the anchor text and link type that others have used to link to you. To start taking control of your link popularity, go to CleverStat.com.

SEOpen

SEOpen is a Firefox extension that provides numerous SEO tools at the click of a mouse. All of its features can be easily accessed by right-clicking on a web page.

Using this tool, you can examine your competitors:

  • Yahoo Backlinks
  • Pages in Yahoo Index
  • Google Backlinks
  • Google Cache
  • Pages in Google Index
  • Google Related
  • PageRank Review
  • MSN Backlinks
  • Pages in MSN Index
  • Alexa Overview
  • Alexa Traffic
  • Alexa Related
  • Alexa Backlinks
  • “Mass Check” multiple sources at once
  • Confirm DMOZ Inclusion
  • Keyword Density
  • Page Size Checker
  • HTML Validator
  • Server Header Viewer
  • Wayback Machine
  • Review robots.txt
  • Whois Info

To quickly and easily perform competitor analysis, visit http://seopen.com/firefox-extension/index.php.

Active Web Reader Customizer

RSS is becoming one of the best ways to increase your online exposure. With the upcoming release of Windows Vista and Internet Explorer 7, RSS usage is expected to rise significantly. However, RSS is also new enough that you can use this technology to create an effective viral marketing campaign.

You can do this by distributing your own RSS aggregator. If your RSS aggregator becomes popular, your brand could become very well known on the Internet. With the Active Web Reader Customizer, you can recommend and distribute an RSS reader that is preloaded with your feeds and web pages. By doing this, you are giving your visitor additional value and increasing your brand exposure at the same time. Start creating your very own customized RSS reader at http://www.deskshare.com/awrc.aspx.

RSS Wizard

We’ve talked about the wonderful benefits of promoting an RSS feed, but how do you actually create one? Fortunately, you don’t have to be a techie to create your very own RSS feed. All you need is a tool like RSS Wizard. This tool will automatically convert almost any web page into an RSS feed. With the RSS Wizard, you can create, edit, and publish an unlimited number of RSS channels. To start creating your own RSS feeds, go to http://www.extralabs.net/rss-wizard.htm.
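If you’re curious what a tool like RSS Wizard is producing under the hood, a minimal RSS 2.0 feed is just a small XML file; the titles and URLs below are placeholders:

<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Site News</title>
    <link>http://www.example.com/</link>
    <description>Placeholder description of the feed.</description>
    <item>
      <title>First article</title>
      <link>http://www.example.com/first-article</link>
      <description>A short summary of the article.</description>
    </item>
  </channel>
</rss>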

FeedDigest

Not only can you create RSS feeds for your own website, but you can also incorporate other people’s RSS feeds into your site to increase your publishing power, deliver value to your readers, and ensure that your website is constantly up-to-date and changing. In fact, with a tool called FeedDigest, you can mix multiple feeds into a single feed to post on your website. By doing this, you will be able to deliver your visitors a unique mix of automatically updating content. You could also create a news dashboard to syndicate the latest news on any topic. The possibilities are unlimited. You could create a feed that combines forum posts, news headlines, eBay items, Digg posts, Flickr photos, and even podcasts, all rolled into one constantly updating stream of related content. To start creating your own customized RSS feeds, go to http://www.feeddigest.com.

MailWasher

If you’re like most online marketers, you are probably receiving a flood of spam in your inbox. Fortunately, there is a free tool that will help remedy your spam problems. This software is known as MailWasher, and it helps you eliminate your unwanted email, thus allowing you to have greater productivity. Best of all, this increased productivity ultimately leads to increased revenues. Start increasing your own productivity at http://oss.firetrust.com/home/.

Audacity

Audacity is an open source program that allows you to record live audio, edit audio files, cut, copy, slice and mix sounds together. Whether you are looking to record interviews, create your own podcasts, or edit your audio files, Audacity is a great solution. Look it over at http://audacity.sourceforge.net/.

CamStudio

CamStudio is a great piece of software for creating screen capture videos. This software records screen activity from the Windows desktop and then turns it into standard AVI movie files. In fact, CamStudio can then convert these AVI files into bandwidth-friendly Streaming Flash Videos (SWFs) with its built-in SWF Producer. These are great for creating info products, video tutorials, and software demonstrations. In terms of quality and price, there is simply no better solution for Screen Cam software. Try it out at http://www.camstudio.org.

About The Author

Kim Roach is a staff writer and editor for the SiteProNews and SEO-News newsletters. You can also find additional tips and news on webmaster and SEO topics by Kim at the SiteProNews blog. Kim’s email is: kim @ seo-news.com

Friday, November 10, 2006

Avoiding SQL Injections

Since it first saw success as a powerful web development platform, PHP has suffered from the ease of use that bred that success. Inexperienced developers can all too easily build applications that are vulnerable to attack, and one of the most common vulnerabilities is the SQL injection.

From Kees Kodde of Qrios Web Design: “In most security related articles about web development, the threat of SQL injections is mentioned, and there seem to be a lot of ways to defend against this. What is, in your opinion, the most simple and effective way to filter possible SQL injections out of user input?”

The biggest challenge of defending against SQL injection attacks is understanding them, so let’s start with a simple example in PHP. This script fragment determines the price of a product given its ID as submitted by the browser:

$id = $_POST['id'];
$sql = "SELECT price FROM products WHERE id = $id";
$result = mysqli_query($db, $sql);
$row = mysqli_fetch_row($result);
$price = $row[0];


The problem here, as in most scripts vulnerable to SQL injection attacks, is that an assumption has been made about a value that is being received from the browser. The code assumes that the ‘id’ value sent by the browser will be a number, and can be placed into a string to form an SQL query like this:

SELECT price FROM products WHERE id = 123


But what if the ‘id’ value contains a maliciously-crafted string instead? When the value is placed into the string, it could instead form a query like this:

SELECT price FROM products WHERE id = 123 OR price < 10


That’s an SQL injection. In this example, it will fool the script into fetching a price less than 10 (assuming there is another product with such a price in the database) instead of the actual price. In other cases, SQL injections can be used to bypass password checks when logging into a site, and in some rare cases even modify the data stored in the database.

In general, the solution to SQL injection attacks is to enforce every assumption you make about any value that you insert into an SQL query. You can either do this manually, or use a pre-built library to do it for you. The above example could be modified to force the ‘id’ value to be interpreted as an integer:

$id = (int) $_POST['id'];


For numbers like this, you can force the language to convert values to numbers. For strings to be included in SQL queries, you need to use tools like PHP’s mysqli_escape_string function to convert special characters like quotes into a form that will not interfere with the query’s operation.

But relying on yourself and your fellow developers to remember to enforce these rules for all browser-submitted data is problematic. Instead, you should use some library that will do it for you.
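For reference, the manual approach looks something like this. It’s a sketch only: the $db connection and the name column are hypothetical, and mysqli_escape_string is the function mentioned above:

// Escape a browser-submitted string before placing it inside
// a quoted SQL string literal ($db and 'name' are hypothetical).
$name = mysqli_escape_string($db, $_POST['name']);
$sql = "SELECT price FROM products WHERE name = '$name'";
$result = mysqli_query($db, $sql);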

PHP 5.1’s PHP Data Objects (PDO) API allows you to place values into SQL queries safely, specifying the expected data type:

$stmt = $db->prepare('SELECT price FROM products WHERE id = :id');
$stmt->bindValue(':id', $_POST['id'], PDO::PARAM_INT);
$stmt->execute();


So to answer your question, the simplest way to defend against SQL injection attacks is to avoid building your own SQL queries, and instead to use an API like PDO that will do it for you, safely. Indeed, PHP is one of the few languages where building SQL queries by combining strings is a common practice, and I’d say the prevalence of SQL injection attacks on PHP-based applications can be largely attributed to this.

Thursday, October 5, 2006

2 New Sites Launched!

Cotton Rohrscheib, Partner and Co-Founder of Pleth, LLC, announced today that his firm had recently launched two new client projects.

Holidays In the Rock
Brought to you by the Little Rock Convention and Visitors Bureau
www.holidaysintherock.com

Holidays in the Rock was developed by Pleth, LLC and strategic partner the Angela Rogers Group of Little Rock. Holidays in the Rock showcases the breathtaking foliage, harvest celebrations, haunted museums, historical home tours, cultural attractions, flavorful gourmet dining, and more going on in Little Rock this holiday season! A treasure of festivals, live entertainment, exciting events, and special promotions are just a few of the things brought to you by the Little Rock Convention and Visitors Bureau.

“Everything from where to have dinner to the best places to shop and stay can be found on this website,” said Greg Smart, Founding Partner and Project Manager for Pleth, LLC. “The site is also very dynamic in that it will be evolving the closer we get to the holiday season. The site is also open to the public to submit events for inclusion on the widely publicized holiday calendar; those interested should contact the Angela Rogers Group or send an email to brenda@theangelarogersgroup.com,” Smart added.

Re-Elect Valley for Mayor
Brought to you by The Committee to Re-Elect James Valley, Mayor
www.valleyformayor.com

Valley for Mayor is the official re-election website for James F. Valley, Helena – West Helena, Arkansas’ incumbent mayor and a local attorney. Since Mr. Valley’s victory in the previous election, he has worked very hard to establish Arkansas’ newest city as a competitor again for commerce and industry. Mayor Valley was also pivotal in the drive to have Helena and West Helena consolidated.

“In addition to James being a long-time client, he has also been a good friend for several years,” said Rohrscheib. “James is a very intelligent person and an extremely hard worker who knows how to rally people to work together for a common goal.”

While the Valley for Mayor website was launched this week, there are some components of the website that will follow in the weeks to come. “We have officially launched the website, but a lot of the functionality hasn’t been developed just yet. With Mayor Valley currently in office, the time that we get to spend with him is limited, so there are items we are waiting on that will have to come later, such as content for some sections of the website. Also, we are awaiting processor information for his campaign contributions module.”

“Currently the site isn’t barren, though; we do have a large number of photographs available online from the past term that were provided by Mayor Valley’s office, as well as a few family photos of Mayor Valley. The photos section will also see quite a few new photos added on a daily basis as we get closer to the election. A new section has also been added to the site this year that will allow voters and constituents the opportunity to create an account and interact with Mayor Valley about any issues they would like to discuss,” said Rohrscheib, who assists Mayor Valley in monitoring the site’s blog traffic.

In addition to launching these two new projects, the Pleth, LLC development team has been very busy as of late. Last month, a major development project for the Wal-Mart-sponsored Hofi, Inc. Kids All-American Fishing was launched. The team has continued to monitor the solution with the client to iron out any bugs that might exist in a new solution this detailed.

New Contracts Announced…

Stephen Smart, Founding Partner of Pleth, LLC, also announced today that three new clients had signed contracts with Pleth, LLC to begin work on web development projects. These new clients include the New York-based InternetLawFirm.com, the Oklahoma Association of Health Care Providers, and Hagan’s Auto, located in Morrilton, Arkansas, with a new location opening very soon.

Tuesday, August 8, 2006

Google XML Sitemaps - The Basics

By Scott Van Achte, Senior SEO,
StepForth Placement Inc. (c) 2006

Google XML Sitemaps have been around for a while now, and many webmasters are starting to become familiar with them. They can help you to achieve up-to-date indexing in Google and, in a roundabout way, play a small role in assisting with rankings. Sitemaps are not needed by everyone, but can be of significant use for many websites. This article will touch on the basics of what they are, who can use them, and how to implement them.

What is a Google XML Sitemap?

In short, a Google XML Sitemap allows webmasters to submit a master list of all their site’s pages to Google for indexing. This information is stored in an XML file, along with any other relevant information specified by the webmaster. It can be as simple as a list of URLs belonging to the site, or it can also include last-modified date, update frequency, and priority. The purpose of this Sitemap is to have the most recent version of your URLs indexed in Google at all times.
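To give a sense of the format, a minimal Sitemap file looks roughly like this. The URL and values are placeholders, and the namespace shown is the one Google specified for its Sitemaps program at the time:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-08-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>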

Who needs a Google XML Sitemap?

XML Sitemaps can generally help any site needing to be indexed by Google; however, small sites may not see the need for them. For example, if you have a small 10-page website that seldom sees any of its pages updated and your entire site is already in Google’s index, the XML Sitemap is not necessarily going to help much. It is best used when trying to keep the latest versions of your pages current in Google. Large sites with an extensive list of URLs will also benefit, especially if not all of their pages are appearing in the index. So, as a general rule of thumb: if you have either a dynamic or a large site, Google XML Sitemaps just may benefit you.

Will using XML Sitemaps improve my Google Ranking?

In most cases this will not improve your rankings directly, but it can help. Having the most current version of your site in Google’s index can speed up your movement in the results pages: if you make an update to a page for optimization purposes, Google’s index will reflect that page more quickly than it would without the XML Sitemap. What this essentially means is that with more frequent spidering you can help influence which version of your site is in the index and, ultimately, help with rankings by decreasing response time.

How do you create the XML Sitemap?

If you have a very small site, or a lot of time on your hands, you can create your XML Sitemap manually, but for the vast majority of webmasters, automated tools are an absolute must. There are a number of available solutions for this. One of the simplest methods of creating XML Sitemaps is through the use of VIGOS GSitemap. This is a free, easy-to-use tool that will help you create your XML Sitemaps with ease.

There are also a number of downloadable and online tools listed on Google’s site which cater to both beginners and seasoned professionals alike.

Submitting your XML Sitemap to Google is relatively straightforward. After the file has been created, the first thing you want to do is upload the file to your server, preferably at the root level. Log into the Sitemap console using your Google account login. From here you can add a site to your account. Simply enter your top-level domain where it says “Add Site” (see fig 1.0).

This will add the domain to your account and allow you to then submit the XML Sitemap.

(Figure 1.0)

After this is done, it will take you to a screen with the summary for this site. You will see a text link that says “Submit a Sitemap”. Clicking here will take you to a screen where you can enter the online location of the XML Sitemap (see fig 1.1 below). Click “Add Web Sitemap” and you are on your way.

Once this is complete you have the option of verifying your Sitemap. This can be done by placing a specific meta tag on your home page, or by uploading a blank HTML file with a file name provided by Google. Verification will allow you to access crawl stats and other valuable information regarding your Google listing.
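For the meta tag route, the verification tag took roughly this form at the time of writing. The content value below is only a placeholder; Google generates the real one in your Sitemaps account:

<meta name="verify-v1" content="your-unique-verification-string" />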

Implementing an XML Sitemap is generally straightforward and well worth the effort, as there is no downside to this tool provided by Google. Every little thing adds up in terms of obtaining site rankings, and frequent spidering by Google is certainly one of those things.

About The Author

Scott Van Achte is the Senior SEO at StepForth Search Engine Placement. Since graduating from Camosun College several years ago, Scott has been working with StepForth Placement and has thoroughly enjoyed his position in the search engine industry. When he’s not busy working he can be found spending quality time with his wife Lyndsay, or out on the golf course. Scott would be happy to answer any questions, and can be reached at scott @ stepforth.com.

Wednesday, July 26, 2006

Search Engine Friendly Flash Web Site

By Scott Goodyear

Most SEOs and many web designers know that with Flash-based web sites it is a challenge not only to get the site indexed, but even to get the site or its pages to rank well in the search engines. This article will explore a few of the challenges and provide a few tips that may come in handy if you are asked to work on a site or pages that include Flash-based content.

First, consider how Flash is being used. Some web sites are nearly 100% Flash driven. These sites often have traditional non-SEO factors that help create popularity for them and in turn drive inbound links to the site, through sources such as a national television or press campaign, a movie or DVD tie-in, tremendous industry buzz including reviews, high-profile newspaper or magazine articles, or other factors that are not easily reproducible.

If your client has a site like this, consider the use of an HTML-based landing page under the main URL and/or an HTML container for the Flash content at a bare minimum. The object in this case is to at least have a title tag and meta content that can be indexed by a search engine. An example of the HTML container concept is Warner video’s Gone with The Wind Flash page. Because it uses the term “official” in the title and meta tags, as well as in a brief description of the Flash page in a ‘noscript’ tag, this site actually ranks better for a search on official gone with the wind than for simply gone with the wind.

As noted above, this site uses the ‘noscript’ tag to describe the Flash-based content. Others have pointed out that you can also use the ‘noembed’ tag, as well as a ‘div id’ tag, to add content about your Flash content. The goal behind these methods is to provide a bit of content for users who do not have Flash installed or have it blocked or turned off, as well as for search engines to index. This content should serve to represent exactly what is in the Flash file, not as a method to add extra content, keyword stuffing, etc. Think of it much like an alt tag on a normal web site. If I am displaying a gray car that has features x, y, z in my Flash file, I should describe that… not go into a semi-unrelated dialog.

A tool called ‘SWF2HTML’ is available through Adobe as part of the ‘Macromedia Flash Search Engine SDK’. This tool can be used to extract the text and links from a Flash file and output them into very basic HTML, which can then be used to describe the Flash file. As this is much the same content that some search engines can extract from your Flash file, it can help you to tweak the content within the Flash file too, just in case it is being extracted.
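As a rough sketch of the idea (the file names and descriptive copy here are placeholders), the fallback content sits alongside the Flash embed so that search engines and Flash-less visitors see a plain-HTML description of the same material:

<object type="application/x-shockwave-flash" data="site.swf" width="550" height="400">
  <param name="movie" value="site.swf" />
  <!-- Indexable fallback, shown when Flash is unavailable -->
  <div id="flash-alternative">
    <p>The official Example Movie site: trailers, cast information, and release dates.</p>
  </div>
</object>
<noscript>
  <p>The official Example Movie site. The full experience requires Flash and JavaScript.</p>
</noscript>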

If you have a specific font/formatting style that you wish to use with certain portions of your site, consider using sIFR. Normally web designers will use a graphic to replace headline text, or cascading style sheets (CSS) to set the font/style that should be used on a given web page or site. The problem with a graphic is that it cannot be seen by a search engine, so you lose out on text that might otherwise help describe the page’s content. A problem with both graphics and CSS is that across different monitor/operating system/browser combinations, the pages will not look the same at all. With sIFR, the same exact content is presented to both engines and site visitors, with an improvement for site visitors: the sIFR content scales to the font that the Flash designer wishes to use and to the available space specified on the web page, instead of relying on the browser’s installed fonts to be interpreted through CSS. See this sIFR page for an example of the code in action.

In the most ideal of SEO circumstances, the entire site should not consist of Flash in and of itself. As has been discussed many times in previous articles, search engines rely on the text-based content of a web site, including on-page content, title tags, image alt information, and text-based links, as well as meta information to a much lesser degree, among other factors, in order to index and sort sites for relevance. The ideal web site uses Flash to add to a visitor’s experience. There are certainly all-Flash web sites that do quite well, but these are the exception and not the rule for the average web site. If you have a Flash web site, try turning cookies and JavaScript off, as well as using a program like Flash Block, to view your site. Make sure that your site can work without Flash from a visitor’s point of view, and that any information that would have been in the Flash content is available on the page.

Saturday, July 1, 2006

The Fundamentals of Search Engine Optimization

by Richard Drawhorn

The fundamental concepts behind Search Engine Optimization (SEO) are understood by most search engine marketers, but those new to the subject should find this article to be very useful. Informative articles on various aspects of SEO have been published here on MarketPosition.com over the years, and in this post I will summarize these concepts and provide links to relevant articles.

Keyword Research

The first step in SEO is to identify the search terms for which you would like your web site to rank well on search engines. We might believe that we know these terms already, but our intuition is often incorrect about how popular or competitive search keywords actually are. People use all kinds of variations of phrases as they search for information on the internet. It’s important to identify these terms and use them in your site content exactly as people type them into search engines. To discover what these search terms are, a keyword research tool should be used. There are several free tools available, such as the Keyword Selector provided by Overture, but most of the robust keyword research tools are subscription-based.

Web Site Optimization

You now have your well-researched keyword list in hand, and are ready to use the keywords in your web site content. How should these search terms be integrated into your web pages? How often should the phrases be used, and in which sections of the pages? Those are excellent questions, and the answers are not known exactly because they depend on the algorithms used by search engines. However, it is generally agreed that search engines look at several different sections of a page when evaluating its content (a rough sketch of these sections follows the list below):

Title tag
Heading tag
Meta Keyword and Meta Description tags
Text within the Body area
Link text and Link URL
ALT attribute for Image tags (the ALT tag may be less significant than other areas of the page)
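As an illustration only (the keyword “blue widgets” and all names here are placeholders, not recommendations), those sections map onto a page roughly like this:

<html>
<head>
  <title>Blue Widgets from Example Co.</title>
  <meta name="keywords" content="blue widgets, buy blue widgets" />
  <meta name="description" content="Example Co. sells quality blue widgets." />
</head>
<body>
  <h1>Blue Widgets</h1>
  <p>Body text that mentions blue widgets in a natural way.</p>
  <a href="http://www.example.com/blue-widgets.html">More about blue widgets</a>
  <img src="widget.jpg" alt="A blue widget" />
</body>
</html>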

Search engines look at the various sections of the document for repeating patterns of keywords or phrases. For this reason, it’s important to have a keyword density within a specific range. What should that keyword density be? Software tools like WebPosition’s Page Critic can help to answer that question. The Page Critic works by looking at the keyword density (and other statistics) of pages that are already ranking well on search engines. Since the search engines keep the details of how their algorithms work a secret, a reasonable strategy is to emulate pages that are well positioned.

Web Site Design

Aesthetics and user friendliness are important elements of web site design, but there are a number of other things to consider to ensure a web site will be as friendly to search engines as possible.

HTML Validation

It’s important to ensure that the HTML code that makes up a web page is correctly formatted. If there are errors in the code, then search engine spiders may have difficulty indexing the page’s content. Use an HTML validator to check the formatting of your HTML code, and read this article for more information on why this is important.

Site Map

It’s always a good idea to create a site map to make it easy for search engine spiders to index the site’s content. Link to every page on the web site that has relevant content, and place a link to the site map on the site’s home page. It may also help to sign up for the Google Sitemaps program to help ensure your content is indexed by Google.

Develop a Site Theme

One aspect of web site design that is often overlooked is theme development. If possible, organize your content so that particular themes are reinforced. Read Reinforcing Ideas and Improving Relevance to Gain Better Rankings for ideas to consider when organizing your site content.

Avoid Duplicate Content

It’s important to avoid duplicating content on the web site. Read Duplicate Content: How Does it Affect Your Rankings? for more information on how to avoid penalties associated with duplicate content.

Comply with Search Engines’ Terms of Service

There are several practices to avoid to stay on good terms with search engines. Techniques like cloaking, hidden text, or spamming, for example, violate search engines’ terms of service. If a site is found to be using these types of blackhat techniques, it will typically be removed from the search engine’s index. Read this article for more information about practices to avoid.

Build Link Popularity

The link popularity of a page refers to the number of other web sites that link to that page. Search engines typically consider how many other sites link to a page as a factor in determining that page’s ranking. The idea behind this is that if others link to a page, then they must consider that page’s content to be valuable in some way. However, not all links are weighted equally, and it’s therefore important to encourage high-quality web sites within your own theme area to link to your site. For a good overview of these concepts, read Link Popularity Considerations, and for some ideas about how to start building links read A Review of Link Building Strategies.

Monitor Performance

Once your optimized web site is online, you’ll want to monitor its performance on the search engines. If the site is brand new, it should of course be submitted to the search engines, or perhaps resubmitted if necessary. Monitor the site’s positions on search engines for keywords of interest to identify areas where the site is performing well and areas that can be improved. An excellent tool designed for this is WebPosition’s Reporter feature. It creates formatted reports featuring graphs of positions over time, as well as useful parameters like the Keyword Visibility Index.

A web site that is positioned well in search engines should start receiving a significant number of visitors. To monitor traffic and other useful web site statistics, a web analytics solution such as WebTrends is recommended. Read Measuring Web Site Statistics as Part of Your SEO Strategy for more information about the benefits of web analytics.

Conclusion

The art of SEO is a set of skills that can be learned and implemented by anybody who manages a web site. However, proper optimization and maintenance of the site requires time and effort, and the fundamental elements of SEO discussed above must be put into practice. If you find you do not have the time or desire to implement your own SEO program, read Outsourcing an SEO Program for some advice on how best to seek out professionals who can help.

Tuesday, June 6, 2006

Pleth, LLC Joins SWDN

PLETH, LLC is pleased to announce that they have joined the SWsoft Developer Network (SWDN). The SWDN is a resource and community center behind SWsoft’s OPEN FUSION Initiative. SWsoft manufactures the Plesk control panel software that is currently used by PLETH, LLC hosting clients. The SWDN was created specifically for hosting service providers, and provides the API documentation and SDKs needed for in-house customizations and integration with SWsoft software.


Some of the many benefits of SWDN:


  • Development Licenses: for all SWsoft products
  • Software Developer Kits (SDKs): including API specs and tech manuals
  • Product Betas: early access to products before the general public
  • Forums and Support: to interact and share information with SWsoft engineers

The PLETH, LLC Management Team met with Representatives from SWSOFT, Inc. this year during HostingCon 2006 in Las Vegas.

Monday, June 5, 2006

Launch: Team Caliber Project

Pleth, LLC, an Arkansas-based technology firm, announced last week that they were preparing to launch a new e-commerce enabled project for Team Caliber, Inc. of Charlotte, North Carolina, a division of Roush Racing.

“The new project is just getting underway,” said Cotton Rohrscheib, Chief Developer and Partner. “They have put a ton of thought into providing their wholesale customers with this tool and we want to work with them in any way to bring it to fruition.”


The new project consists of various levels of authentication and security built on top of Pleth’s already award-winning e-commerce solution. Only registered and approved wholesale customers are allowed to login and view the 2006 product line. Orders are placed online and transmitted to Team Caliber by email and their staff takes it from there.

“We have taken a lot of things into consideration in putting this product together, basically rebuilding our existing e-commerce solution and making it do things that it normally wouldn’t; changing outputs and the way it displays products and handles orders, right down to the way it generates pick tickets and packing slips,” said Greg Smart, Project Manager for Pleth, LLC.

Team Caliber is one of the top manufacturers of die-cast cars, fan memorabilia, and clothing for NASCAR.

Thursday, June 1, 2006

PCI Compliance & Credit Card Acceptance

I guess that credit card fraud, chargebacks, etc. are finally getting the best of the credit card companies… We learned while we were out in Vegas that PCI compliance would soon become a requirement from both MasterCard and Visa for any vendor accepting credit cards online. I have to say that I am not surprised, and I’m sort of glad to see it. It will help to weed out the fly-by-nights that are out there…

What does PCI compliance do? Well, basically it gives shoppers peace of mind that their information is not going to be compromised. Keep in mind that within the last six months, the federal government and a branch of the armed forces have both had their information compromised, and sensitive information about members of the military and the government was leaked to the general public. In a nutshell, if an online business is a fly-by-night operation and does not have its website installed properly on a secure server with SSL, PCI compliance, etc., it is putting its customers’ information at risk of being hacked.

At PLETH we have always strived to have the best intrusion detection methods in place at all times for our clients. In fact, we have some big-time clients that will do some heavy sales numbers this year. In our efforts we have learned that you just can’t be on top of everything at all times, but you really need to be! With this in mind, we have contracted with the top security auditing firm in the United States, ScanAlert.

ScanAlert’s signature Hacker-Safe Certification can be found on over 55% of the top 400 e-Commerce websites in the world! We are bringing them to our clients and sparing no expense in doing so. In fact, with our new partnership arrangement that was formed at our meeting in Las Vegas, it is expected that our clients will receive a larger discount than firms that work directly with ScanAlert!

Pleth will soon be launching a co-branded website with ScanAlert that promotes the Hacker Safe Certification program, which will ensure that websites meet the criteria for the following:
  • HIPAA Compliance
  • PCI Compliance
Both of these certifications will be in the news a lot in the very near future…

ScanAlert’s service monitors a website’s vulnerabilities 24 hours a day, 7 days a week, watching for vulnerabilities and port scanning with network fingerprinting services that are patent-pending.

HackerSafe certification is currently being sought out by the United States Government, FBI, CIA, Homeland Security, and the Marine Corps. We feel that providing our clients with peace of mind is our number one priority. With that in mind, no other security auditing firm in the United States could be brought in as a partner other than ScanAlert; they are simply the best!


If you have questions regarding PCI compliance, please contact a member of our team to discuss your immediate needs. Also, stay tuned to our blog for important announcements regarding the co-branded PLETH / ScanAlert website that will be available for one-stop certification shopping.

Thanks,
Cotton Rohrscheib, Partner
PLETH Networks, LLC