Search Results

Search found 24291 results on 972 pages for 'site ripper'.


  • Advice on software infrastructure for a FLOSS bounty site

    - by michaeljt
    I am planning to set up a simple web site where people can offer bounties for work on FLOSS projects. Unfortunately I have no experience with web development (I am a C/C++ developer), so I was hoping someone might be able to suggest out-of-the-box packages (preferably Debian ones) I could use to build the site from. My idea of how the site would work is to keep things as simple as possible. The person proposing a bounty would enter a description, with relevant links (particularly to a bugtracker entry on the project the work is to be done for, where the real discussion and work would take place) and other information, and place an initial contribution. Other people would be able to add (donate, not pledge) contributions, but any discussion would take place on the project's bugtracker. I am also planning to run a mailing list rather than a forum (at least initially), so that is not a requirement. PayPal seems to me to be the handiest payment mechanism. So overall what I need is probably a simple interface with PayPal integration and a simple database backend. I hope this is the right place for my question; if not, I would be grateful for pointers to somewhere better. And of course, this is purely about the technical side, though I am more than happy to discuss other aspects of the project elsewhere.
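    For a sense of how little page-side code the PayPal part needs, here is a hedged sketch using PayPal's legacy Payments Standard donation button; all field values are hypothetical and PayPal's current documentation should be checked before relying on this:

        <!-- legacy PayPal Payments Standard donate button; values are hypothetical -->
        <form action="https://www.paypal.com/cgi-bin/webscr" method="post">
          <input type="hidden" name="cmd" value="_donations" />
          <input type="hidden" name="business" value="bounties@example.org" />
          <input type="hidden" name="item_name" value="Bounty: fix issue #1234 in project Foo" />
          <input type="hidden" name="currency_code" value="EUR" />
          <input type="submit" value="Contribute to this bounty" />
        </form>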

  • New site not appearing in index after change of address, no feedback from google webmaster tools

    - by Duffy
    Our change of address seems not to be taking effect. Here's the story so far: we're a web company and our product is called The New Hive. Our site used to be at thenewhive.com, but we decided to switch to newhive.com (drop the "the", it's cleaner). The timeline of what I've tried, starting on July 29th:

    - Used 301 redirects for all pages (e.g. thenewhive.com/tag/art -> newhive.com/tag/art). At this point we noticed that we had disappeared from search results when searching "The New Hive"; the front page used to be all links to our site plus a couple of news articles about the company.

    So on August 5th I:

    - Verified the new domain in Webmaster Tools (the old domain was already verified).
    - Submitted a change of address request via Webmaster Tools / Configuration / Change of Address.

    Then after another week, on August 13th, I did this:

    - Went to Webmaster Tools / Health / Fetch as Google and fetched our homepage and a couple of sub-pages, all successfully.
    - Clicked "Submit to Index" for the homepage.

    As of today (August 23rd) we're still not showing up in the index. We're getting no warnings or feedback of any kind from the dashboard, so I'm inclined to think something's broken with the dashboard rather than that something's wrong with our site from an SEO perspective. From the dashboard: no new messages or recent critical issues; Crawl Errors: no data available. From Health - Index Status:

    Total indexed: 0
    Ever crawled: 42,490
    Not selected: 12
    Blocked by robots: 0

    I'm really at a loss here, any help would be appreciated.
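    For reference, the domain-wide redirect in the first step is typically a single rule; a minimal sketch, assuming Apache with mod_rewrite enabled (the site's real server config may differ):

        # send every request on the old host to the same path on the new one,
        # with a permanent (301) redirect
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?thenewhive\.com$ [NC]
        RewriteRule ^(.*)$ http://newhive.com/$1 [R=301,L]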

  • Moving from a static site to a CMS with new URLs and meta-data for pages

    - by Chris J
    Hi, I am in the process of rebuilding a site from static pages to a CMS, which will be using mod_rewrite to generate new page URLs. As part of this process, our marketing people and I have decided to tidy up the descriptions, keywords and titles. E.g. a page whose URL is currently "website-name/about_us.html" and has a title of "website-name - something not quite page specific" will change to "website-name/about-us/" with the title "about us - website-name", and may have a few keywords and the description changed. Our goal with updating the metadata is to improve our page rankings and try to keep in line with some best practices for SEO. Though our current page rankings are quite good in many respects, there is room for improvement. All of the pages will also have content changes (like rearranging heading tags, a new menu on all pages, new content in the footer, extra pieces of dynamic content relating to other pages). In this new site process I plan to use 301 redirects for all the old URLs pointing to the new URLs. My question is: what can I expect to happen to the page rankings in Google, in the short term and long term? Will this be like kicking off a new site which will have to build up trust over time, or will the original page rankings have an effect?
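    For the redirect step, per-page 301s from the old static URLs can sit alongside the CMS rewrite rules; a minimal sketch, assuming Apache with mod_alias, using the URL from the example above (the second line is a hypothetical additional page):

        # permanent redirects from the old static pages to the new CMS URLs
        Redirect 301 /about_us.html /about-us/
        Redirect 301 /contact_us.html /contact-us/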

  • SEO: Getting site to show in location-specific searches

    - by willvv
    I'm really new to this SEO world and I've been reading a lot to try and figure it out. We have a site, moodbond.com, that allows users to browse/create events anywhere, and we fill it with content from the main cities in the US. We would like it to show up for searches like "events in san francisco" or "what to do in new york"; however, since the site is not really location-specific, I'm not really sure where to begin. I've been thinking about a couple of things; maybe you can help me decide if these would be a good way to start or if I should try something different:

    1. Allow location-specific URLs (e.g. moodbond.com/browse/san-francisco) that just show the main page centered on San Francisco (see the sketch below).
    2. Change the headers/title of the page so they adapt automatically to the city being browsed (and change this dynamically as the user changes the location of the map).
    3. Add internal links to different locations, e.g. a link in the footer of the page that says "Events in Seattle" and makes the site load events in that city (this would probably depend on implementing #1).

    What do you guys think? Will any of these really help, or should I look for a different approach? Any advice is welcome. Thanks
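    Option 1 is often implemented by mapping the city slug to a query parameter with a rewrite rule; a minimal sketch, assuming Apache with mod_rewrite and a hypothetical index.php front controller:

        # map /browse/san-francisco to the main page, centered on that city
        RewriteEngine On
        RewriteRule ^browse/([a-z0-9-]+)/?$ index.php?city=$1 [L,QSA]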

  • Drupal site Instant Messaging [migrated]

    - by pthurmond
    I am trying to find a module, or a standalone solution that I can turn into a module, that will give me an instant messaging system like Facebook's on a Drupal site that I am working on. I have never set up a chat system before, and my particular requirements are rather stringent:

    - It needs to be a solution where we host the chat server (if one is needed separate from the website itself).
    - It must use the site's login state (it can't use an external system at all; that means no GTalk, Yahoo IM, or AIM).
    - It must be able to handle up to 1,000 users at any given time.

    I have looked through the Drupal community and I tried the DXMPP module, but it requires jQuery UI 1.8 and that doesn't work with all of the other things my site uses (such as Homebox). We do have a Jabber server already set up and ready to go. Does anyone have any thoughts or options here? Thanks! EDIT: We are using Drupal 6.

  • How To Discover RSS Feeds for a given site

    - by ktolis
    The quest is, given a site URL (say http://stackoverflow.com/), to return the list of all the feeds available on the site. Acceptable methods:

    a) use a 3rd-party service (Google?, Yahoo?, ...) programmatically
    b) use a crawler/spider (with some tips on how to configure the spider to return the RSS/XML feeds only)
    c) do it programmatically using C/C++/PHP (any language/library)

    The task here is not to get the feeds linked from the page returned by the URL, but ALL the feeds that are available on the server at any depth. In any case, please provide a simple usage example.
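    For the single-page building block (which a spider for option b could apply recursively at any depth), feeds are usually advertised with <link rel="alternate"> tags in the page head; a minimal sketch in Python, using only the standard library:

        # find the feeds advertised by one page; a crawler can call this per page
        from html.parser import HTMLParser
        from urllib.parse import urljoin
        from urllib.request import urlopen

        FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

        class FeedLinkParser(HTMLParser):
            def __init__(self, base_url):
                super().__init__()
                self.base_url = base_url
                self.feeds = []

            def handle_starttag(self, tag, attrs):
                a = dict(attrs)
                if (tag == "link"
                        and (a.get("rel") or "").lower() == "alternate"
                        and (a.get("type") or "").lower() in FEED_TYPES
                        and a.get("href")):
                    # resolve relative hrefs against the page URL
                    self.feeds.append(urljoin(self.base_url, a["href"]))

        def discover_feeds(url):
            html = urlopen(url).read().decode("utf-8", errors="replace")
            parser = FeedLinkParser(url)
            parser.feed(html)
            return parser.feeds

        # usage example
        if __name__ == "__main__":
            print(discover_feeds("http://stackoverflow.com/"))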

  • SharePoint Content and Site Editing Tips

    - by Bil Simser
    A few content management and site editing tips for power users on this bacon-flavoured unicorn morning. The theme here is: keep it clean!

    Write "friendly" email addresses
    Remember it's human beings reading your content. Seeing something like "If you have questions please send an email to [email protected]" breaks up the readability. Instead, just do the simple steps of writing the content in plain English and going back, highlighting the name and inserting a link (note: you might have to prefix the link with mailto:[email protected]). It makes for a friendlier looking page and hides the ugliness that is sometimes in email addresses.

    Use friendly column and list names
    This is a big pet peeve of mine. When you first create a column or list with spaces, the internal name is changed. The display name might be "My Amazing List of Animals with Large Testicles" but the internal (and link) name becomes "My_x00x20_Amazing_x00x20_List_x00x20_of_x00x20_Animals_x00x20_with_x00x20_Large_x00x20_Testicles". What's worse is if you create a publishing page named "This Website is Fueled By a Dolphin's Spleen". Not only is it incorrect grammar, but the apostrophe wreaks havoc on both the internal name for the list (with lots of crazy hex codes) and the hyperlink (where everything is uuencoded). Instead, create the list with a distinct and compact name, then go back and change it to whatever you want. The end result is a better-formed name that you can both script and access in code more easily.

    Keep your Views Clean
    When you add a column to a list or create a new list, the default is to add it to the default view. Do everyone a favour and don't check this box! The default view of a list should be something similar to the Title field and nothing else. Keep it clean. If you want to set a default view that's different, go back and create one with all the fields and filtering and sorting columns you want, and set it as default. It's a good idea to keep the original AllItems.aspx (note the lack of a space in the filename!) easy and unfiltered. It's also a good idea to keep your column count down in views. Don't let every column be added by default, and don't add every column just because you can. Create separate views for distinct responsibilities, and try to keep the number of columns down to a single screen to prevent horizontal scrolling.

    Simple Navigation
    The Quick Launch is a great tool for navigating around your site, but don't use the default of adding all lists to it. Uncheck that box and keep navigation simple. Create custom groupings that make sense: if "Documents and Lists" doesn't fit your site but "Reports and Notices" makes more sense, then do it. Also hide internal lists from the Quick Launch. For example, if most users don't need to see all the lookup tables you might have on a site, don't show them. You can use audience filtering on the Quick Launch if you want to hide admin items from non-admin users, so consider that as an option.

    Enjoy!

  • Apache virtual host for drupal test site

    - by bsreekanth
    Hello, I am a programmer trying to launch my first website. Through different helpful posts on SF and elsewhere, I set up an account with Linode and set up a slice (Debian, Apache, etc.). I have a Drupal site under development, and I'd like to have a test site on the Linode server as well. What is the best way to set up and protect the test site alongside the actual (production) site? Is a virtual host the answer? To protect the test site, is .htaccess authentication sufficient to prevent access from the public and from robots? I am also modifying the theme, database contents, etc., so having two sites under one Drupal installation may not be a good idea. What do you suggest? Thanks in advance, bsreekanth.
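    A name-based virtual host with HTTP Basic authentication is the usual pattern here; a minimal sketch, assuming Apache 2 on Debian, with hypothetical host names and paths:

        # /etc/apache2/sites-available/test.example.com (names/paths hypothetical)
        <VirtualHost *:80>
            ServerName test.example.com
            DocumentRoot /var/www/test-site

            <Directory /var/www/test-site>
                # keep the public and robots out of the test site
                AuthType Basic
                AuthName "Test site"
                AuthUserFile /etc/apache2/test-site.htpasswd
                Require valid-user
            </Directory>
        </VirtualHost>

    The password file would come from something like "htpasswd -c /etc/apache2/test-site.htpasswd username", and since every request gets a 401 without credentials, well-behaved crawlers never see the test content.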

  • "Server not found" for live version of site

    - by user1491819
    I can access my local dev site on my local PC, e.g. http://mysite, but I cannot access the live site, even though it works fine on other PCs: http://www.mysite.com. The live site gives the error "Server Not Found" in Firefox, and pinging www.mysite.com gives the error "Ping request could not find host www.mysite.com". My hosts file contained:

    127.0.0.1 mysite

    I changed the hosts file to the following and rebooted:

    127.0.0.1 mysitedev

    I'm running on XP, and have cleared the DNS cache using ipconfig /flushdns. I have verified the live site is up using http://www.isup.me/ and the site loads fine on my phone. What could be preventing my local PC from accessing the live site?

  • Configure IIS to rewrite IP Address to Site Name

    - by Bath Man
    So I've started my first website from home, and I'm trying to get it up and running and Google-crawlable and the like, but I can't seem to figure out how to have my site name returned in the address bar instead of my IP address. I've purchased a domain name for my site on GoDaddy and then set it to redirect to my site. When you type in the domain name, you get redirected to http://0.0.0.0/default.aspx (not my real IP, obviously), and that stays in the user's address bar. In order to fix that temporarily, I've set up masking on GoDaddy, which keeps the URL in the address bar but just shows my website in a frame. This is fine for users visiting the site, but any kind of automated robot such as Googlebot cannot discover my content because of the frame. I've looked into ISAPI filters and server-side rewriting and the like, but I just can't quite figure out how to do what I need it to do. Any simple suggestions or links would be appreciated.

  • WSS 3.0 to SharePoint 2010: Tips for delaying the Visual Upgrade

    - by Kelly Jones
    My most recent project has been to migrate a bunch of sites from WSS 3.0 (SharePoint 2007) to SharePoint Server 2010. The users are currently working with WSS 3.0 and Office 2003, so the new ribbon-based UI in 2010 will be completely new. My client wants to avoid the new SharePoint 2010 look and feel until they've had time to train their users, so we've been testing the upgrades by keeping them on the 2007 user interface.

    Permission to perform the Visual Upgrade

    One of the first things we noticed was the default permissions for who is allowed to switch the UI from 2007 to 2010. By default, site collection administrators and site owners can do this. Since we wanted to more tightly control the timing of the new UI, I added a few lines to the PowerShell script that we are using to perform the migration. This script creates the web application, sets the User Policy, and then does a Mount-SPDatabase to attach the old 2007 content database to the 2010 farm. I added the following steps after the Mount-SPDatabase step:

        # Remove the visual upgrade option for site owners;
        # it remains for Site Collection administrators
        foreach ($sc in $WebApp.Sites){
            foreach ($web in $sc.AllWebs){
                # Visual Upgrade permissions for the site/subsite (web)
                $web.UIVersionConfigurationEnabled = $false;
                $web.Update();
            }
        }

    These script steps loop through each Site Collection in a particular web application ($WebApp), then loop through each subsite ($web) in the Site Collection ($sc) and disable the Site Owner's permission to perform the Visual Upgrade. This is equivalent to going to the Site Collection administrator settings page -> Visual Upgrade and selecting "Hide Visual Upgrade". Since only IT people have Site Collection administrator privileges, this allows IT to control the timing of the new 2010 UI rollout.

    Newly created subsites

    Our next issue was brought to our attention by SharePoint Joel's blog post last week (http://www.sharepointjoel.com/Lists/Posts/Post.aspx?ID=524). In it, he lists some updates about the 2010 upgrade, and his fourth point was one that I hadn't seen yet:

    "4. If a 2007 upgraded site has not been visually upgraded, the sites created underneath it will look like 2010 sites. While this is something I've been aware of, I think many don't realize how this impacts common look and feel for master pages, and how it impacts good navigation and UI. As well, depending on your patch level you may see hanging behavior in the list picker. The site and list creation Silverlight control in Internet Explorer is looking for resources that don't exist in the galleries in the 2007 site, and hence it continues to spin and spin and eventually times out. The workaround is to upgrade to SP1, or use Chrome or Firefox, which won't attempt to render the Silverlight control. When the root site collection is a 2007 site and has its set of galleries, and the children are 2010 sites, there is some strange behavior linked to the way that the galleries work and pull from the parent."

    Our production SharePoint 2010 farm has SP1 installed, as well as the December 2011 Cumulative Update, so I think the "hanging behavior" he mentions won't affect us. However, since we want to control the rollout of the UI, we are concerned that new subsites will have the 2010 look and feel, no matter what the parent site has. OK, time to dust off my developer skills. I first looked into using feature stapling, but I couldn't get that to work (although I'm pretty sure I had everything wired up correctly).
    Then I stumbled upon SharePoint 2010's web events, a great way to handle this. Using Visual Studio 2010, I created a new SharePoint project and added a Web Event Receiver. In the event receiver class, I used the WebProvisioned method to check whether the parent site is a 2007 site (UIVersion = 3), and if so, set the newly created site to 2007:

        /// <summary>
        /// A site was provisioned.
        /// </summary>
        public override void WebProvisioned(SPWebEventProperties properties)
        {
            base.WebProvisioned(properties);

            try
            {
                SPWeb curweb = properties.Web;

                if (curweb.ParentWeb != null)
                {
                    // check if the parent website has the 2007 look and feel
                    if (curweb.ParentWeb.UIVersion == 3)
                    {
                        // since the parent site has the 2007 look and feel,
                        // we'll apply that look and feel to the current web
                        curweb.UIVersion = 3;
                        curweb.Update();
                    }
                }
            }
            catch (Exception)
            {
                // TODO: Add logging for errors
            }
        }

    This event is part of a Feature that is scoped to the Site level (Site Collection). I added a couple of lines to my migration PowerShell script to activate the Feature for any site collections that we migrate (a sketch of those lines appears at the end of this post).

    Plan Going Forward

    The plan going forward is to perform the Visual Upgrade after the users for a particular site collection have gone through 2010 training. If we need to do several site collections at once, we'll use a PowerShell script to loop through each site collection and update the sites to 2010. If it's just one or two, we'll be using the "Update All Sites" button on the Visual Upgrade page for Site Collection Administrators. The custom code for newly created sites won't need to be changed, since it relies on the UI version of the parent site. If the parent is 2010, then the new site will look 2010.
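    The activation lines mentioned above might look something like this; a hedged sketch, assuming the SharePoint 2010 management shell is loaded and using a hypothetical feature name:

        # activate the web-event feature on each migrated site collection
        # ("MyCompany.SubsiteUIVersion" is a hypothetical feature name)
        foreach ($sc in $WebApp.Sites){
            Enable-SPFeature -Identity "MyCompany.SubsiteUIVersion" -Url $sc.Url
        }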

  • Can google “see” this custom javascript code which displays links from an external site to mine

    - by webmasters
    I have JavaScript code on my site which displays links from another site. This is what I have in my source before the page loads:

    <script language="JavaScript" type="text/javascript">showLink(1);</script>

    And this is what I copied from my source after the page has loaded:

    <script language="JavaScript" type="text/javascript">showLink(1);</script><a rel="nofollow" target="_blank" class="anc" href="http://x5.external_site.net/sc/out.php?s=5483&amp;o=http%3A%2F%2Fwww.bluetooth.com">Bluetooth Devices</a>

    Can Google see this link?

  • .htaccess: Redirect Hotlink Flash --> Site with embed Flash

    - by user5571
    I have some PHP pages that embed .swf files. These .swf files are now being linked to directly by some other people, and I don't want visitors to simply open the SWF; I want to force a redirect to the page where the Flash is embedded.

    Data:

    Site: www.example.com/1 (www.example.com/2, www.example.com/3 and so on)
    Flash: www.example.com/flash/flash_NUMBER.swf

    So for www.example.com/1:

    Site: www.example.com/1
    Flash: www.example.com/flash/flash_1.swf

    I now want a user who types "www.example.com/flash/flash_1.swf" into his URL bar to be redirected to www.example.com/1. The problem is that the Flash still needs to be accessible via www.example.com/1 (the Flash is embedded in that page), and I can't get that working. The tool I would like to use for this is .htaccess and RewriteRule. I hope someone can help me out.
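    One common approach is to redirect only requests that do not carry the site's own referer, so the embedding pages keep working; a minimal sketch, assuming Apache with mod_rewrite (direct type-ins send no referer and so get redirected as desired, though browsers configured to strip referers would also be redirected):

        RewriteEngine On
        # requests not referred by our own pages...
        RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
        # ...get bounced from the raw .swf to the page that embeds it
        RewriteRule ^flash/flash_([0-9]+)\.swf$ /$1 [R=302,L]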

  • How to create Office365 SharePoint site using SharePoint2010 template

    - by ybbest
    Recently, I worked with a client that has Office 365 upgraded to SharePoint 2013, but they would still like to create SharePoint sites using the old SharePoint 2010 template. If you'd like to know how, here are the steps:

    1. Go to your Office 365 portal https://portal.microsoftonline.com/admin/default.aspx and then go to the SharePoint admin page.
    2. Next, click the settings page.
    3. Change the Global Experience Version Settings.
    4. Finally, you will be able to create a SharePoint site using the 2010 template.

  • Google I/O 2010 - Optimize your site with Page Speed

    Google I/O 2010 - Optimize every bit of your site serving and web pages with Page Speed. Tech Talks by Richard Rabbat and Bryan McQuade. Page Speed is an open-source Firefox/Firebug add-on. Webmasters and web developers can use Page Speed to evaluate the performance of their web pages and to get suggestions on how to improve them. Learn about the latest rules of web development we've added, updated optimizations, a refreshed UI, how to collect data through beacons to track progress over time, cut-and-paste fixes, and how to work with 3rd-party libraries more effectively, including Google Analytics. For all I/O 2010 sessions, please go to code.google.com/events/io/2010/sessions.html (Time: 47:15)

  • Site Review: MortgageCalculator.org - Forms Evaluation

    This site allows users to enter basic loan information into a form; when the user clicks the submit button, the information is used to calculate a loan summary, which includes the monthly payment, total interest paid, and the last payment date. The site uses server-side validation and replaces any value not within a normal range with the calculator's default for that form field. In addition, it uses server-side code to calculate the items on the loan summary, which is then displayed to the user. I personally think that adding client-side validation would improve the user's experience, because it would ensure that the data being submitted is within an acceptable range, and if the data entered was not, it would allow the user to adjust it before submitting.
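    As an illustration of that suggestion, a minimal client-side check might look like the sketch below; the field names are hypothetical, since the actual form's markup isn't shown here:

        // hedged sketch: validate the loan form before it is submitted
        // (the field names are hypothetical)
        function validateLoanForm(form) {
          var amount = parseFloat(form.elements['loan_amount'].value);
          var rate   = parseFloat(form.elements['interest_rate'].value);
          var years  = parseInt(form.elements['term_years'].value, 10);
          var errors = [];
          if (isNaN(amount) || amount <= 0) errors.push('Enter a loan amount greater than zero.');
          if (isNaN(rate) || rate <= 0 || rate > 30) errors.push('Enter an interest rate between 0 and 30.');
          if (isNaN(years) || years < 1 || years > 50) errors.push('Enter a term between 1 and 50 years.');
          if (errors.length > 0) { alert(errors.join('\n')); return false; } // block the submit
          return true; // within an acceptable range, let the server do the math
        }
        // usage: <form onsubmit="return validateLoanForm(this);"> ... </form>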

  • Google I/O 2010 - SEO site advice from the experts

    Google I/O 2010 - SEO site advice from the experts. Tech Talks by Matt Cutts, Greg Grothaus and Evan Roseman. A perfect opportunity to get your website reviewed by the experts in the Google Search Quality team. Attendees can get concrete search engine optimization (SEO) feedback on their own sites. We'll also answer real-life questions that affect developers when it comes to optimizing their websites for search. For all I/O 2010 sessions, please go to code.google.com (Time: 01:00:38)

  • Understanding Ajax crawling of search site

    - by vacuum
    I have a couple of questions about Ajax crawling of a site which is itself a kind of search engine. The base article explains the mechanism for making an AJAX application crawlable. All this stuff with HTML snapshots is clear and easy to implement, but I can't understand where the Google bot will get the "pretty AJAX URL" (i.e. www.example.com/ajax.html#!key=value) to work with. The first thing that came to mind is breadcrumbs: in the sitemap we can specify pages with breadcrumbs on them, so the bot will go to these pages and get HTML snapshots from there. But I'm sure there are other ways to give the bot these pretty AJAX URLs. In our case, we have a simple search site where the user enters a keyword, presses "Find", JS executes an Ajax request, receives a JSON response and fills the page with results (without any refresh, of course). In this case, how do I make the Google bot crawl all the results in addition to the sitemap? Is there an example of the solution described in the article above?
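    For background on the server side of the scheme: when Googlebot sees a #! URL, it re-requests the same path with the fragment moved into an _escaped_fragment_ query parameter, and the server answers with an HTML snapshot. A minimal sketch in Python; the two renderer functions are hypothetical stand-ins for the real page logic:

        from urllib.parse import parse_qs

        def render_snapshot(state):
            # hypothetical: return plain HTML for the given search state
            return "<html><body>Results for: %s</body></html>" % state

        def render_ajax_shell():
            # hypothetical: return the normal JS-driven page
            return "<html><body><script src='app.js'></script></body></html>"

        def handle_request(query_string):
            params = parse_qs(query_string)
            if "_escaped_fragment_" in params:
                # Googlebot asked for /ajax.html?_escaped_fragment_=key=value
                # in place of /ajax.html#!key=value
                return render_snapshot(params["_escaped_fragment_"][0])
            return render_ajax_shell()

        # usage: a request for /ajax.html#!key=value arrives from Googlebot as
        print(handle_request("_escaped_fragment_=key=value"))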

  • Recommended requirements when outsourcing xhtml/css site building?

    - by András Szepesházi
    I'm considering outsourcing part of our web application development project to freelancers, namely the site building part. What I mean by site building is the process of creating the XHTML/CSS template files, with dummy content, from a PSD file (or any other graphical layout file). The resulting XHTML/CSS files will be used by our developers as templates for CMS-based page rendering. The CMS in this case is Drupal, but that might not be of much relevance. I'm looking for a good set of requirements that will result in good-quality XHTML/CSS code complying with today's standards, and that leaves little to the freelance developer's imagination in terms of what I need. I'm thinking about requirements like the ones below (a sketch of the hand-off format they describe follows this list):

    - Valid XHTML 1.0 Transitional document type, validated by validator.w3.org
    - Identical rendering in all modern browsers (FF, Chrome, Safari, Opera, IE7-8) and also in IE6
    - All opening and closing block-level elements should be properly commented, referencing the functional part of the user interface they belong to (menu, toolbar, content, etc.)
    - No inline CSS definitions

    And so on. How would you organize a list like that? What requirements would you add?
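    A minimal sketch of a template skeleton meeting those requirements; the names of the UI blocks and the stylesheet are just examples:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
            "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
          <head>
            <title>Dummy page title</title>
            <!-- all styling lives in the stylesheet; no inline CSS -->
            <link rel="stylesheet" type="text/css" href="styles.css" />
          </head>
          <body>
            <!-- begin: menu -->
            <div id="menu">Dummy menu</div>
            <!-- end: menu -->
            <!-- begin: content -->
            <div id="content">Dummy content</div>
            <!-- end: content -->
          </body>
        </html>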

  • Why Does the Same Site in Different SERPs Contain and Not Contain Google Sitelinks

    - by frank13
    Is there a way to get sitelinks on a Google SERP when searching for a site's name rather than the site's web address? For example, if you search "twin city kings", the first result is the website for Twin City Kings without sitelinks; but if you search "twincitykings.com", the first result is the website for Twin City Kings with sitelinks. Is it possible to get sitelinks on both SERPs? Thanks for helping or clarifying. Note: this question does not pertain to "how to get a sitelink"; it pertains to why the same site comes up in different SERPs with and without sitelinks.

  • SEO optimization for AJAX site and dynamic HTML canvas

    - by Christian Benincasa
    I have a site that uses AJAX to query the Last.fm database and then dynamically draws a graph of the results on an HTML canvas. In the search function, I have a command that sets window.location.hash to the search parameters. I also have a function that checks whether a hash was provided in the URL and, if so, generates the page. For example, http://www.thenlistento.com/#!/led+zeppelin will automatically navigate to a search page for Led Zeppelin. My question is: how do I optimize this setup for SEO? Can it be done at all? I've taken a look at the Google Webmaster docs and read over the hashbang protocol, but I'm not totally sure how to apply it to my situation, or even if I can at all. Any help/suggestions would be greatly appreciated. Link to the site: http://www.thenlistento.com
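    Under the hashbang protocol, the fragment after #! is what Googlebot will re-request as ?_escaped_fragment_=..., so the client side mainly needs to read and write that URL form consistently. A small sketch of the client half, assuming the hash-based search described above (the search routine's name is hypothetical):

        // stand-in for the site's existing search routine (name is hypothetical)
        function runSearch(artist) { console.log('searching for', artist); }

        // write the search state as a crawlable #! URL
        function setSearchHash(artist) {
          window.location.hash = '#!/' + encodeURIComponent(artist);
        }

        // on page load, restore state from the #! form of the URL
        function restoreFromUrl() {
          var m = window.location.hash.match(/^#!\/(.+)$/);
          if (m) {
            runSearch(decodeURIComponent(m[1]));
          }
        }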
