Search Results

Search found 8284 results on 332 pages for 'trusted sites'.


  • Week in Geek: Malware-Infected Web Sites Doubled Since Last Year

    - by Asian Angel
    This week we learned how to get spelling autocorrect across all applications on a Windows system, diagnose DSL hang-ups, extract media files from PowerPoint presentations, restrict IE to a single website, customize the Ubuntu bootloader screen, get smartphone-style word suggestions on Windows systems, understand what character encodings are and how they differ, and more. Photo by Profound Whatever.

    Read the article

  • Why do people crawl sites without downloading pictures?

    - by Michael
    Let me show you what I mean:

        IP               Pages   Hits   Bandwidth
        85.xx.xx.xxx     236     236    735.00 KB
        195.xx.xxx.xx    164     164    533.74 KB
        95.xxx.xxx.xxx   90      90     293.47 KB

    It's very clear that these people are crawling my site with bots. There's no way you could visit my site and use less than 1 MB of bandwidth. You might say there's the possibility that they could be browsing the site using some browser or plug-in that does not download images, js/css files, etc., but the simple fact of the matter is that there are not 90-236 pages linked from the home page (outside of WP files), even if you visited every page twice. I could understand if these people were crawling the site for pictures, but once again, the bandwidth indicates that this isn't what is happening. Why, then, would they crawl the site simply to view the HTML/txt/js/etc. files? The only thing I can come up with is that they are scanning for outdated versions of WordPress, SQL injection vulnerabilities, etc., which makes me inclined to outright ban the IPs, but I'm curious: is it possible that this person is a legitimate user, or at the very least, not intending to be harmful?
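    For anyone who wants to screen their own logs for this pattern, here is a minimal sketch in Python, assuming an Apache combined-format access log; the file name and the 50-hit / 4 KB-per-hit thresholds are hypothetical and should be tuned to your site:

        import re
        from collections import defaultdict

        # Combined Log Format: IP ident user [time] "request" status bytes ...
        LINE = re.compile(r'^(\S+) \S+ \S+ \[.*?\] "(.*?)" (\d{3}) (\d+|-)')

        hits = defaultdict(int)
        transferred = defaultdict(int)

        # Hypothetical log path; point this at your server's access log.
        with open("access.log") as f:
            for line in f:
                m = LINE.match(line)
                if not m:
                    continue
                ip, _request, _status, size = m.groups()
                hits[ip] += 1
                transferred[ip] += 0 if size == "-" else int(size)

        # Flag IPs with many requests but under ~4 KB per hit on average;
        # on an image-heavy site that usually means HTML-only crawling.
        for ip in sorted(hits, key=hits.get, reverse=True):
            if hits[ip] >= 50 and transferred[ip] / hits[ip] < 4096:
                print(f"{ip}: {hits[ip]} hits, {transferred[ip] / 1024:.0f} KB total")

    An IP that surfaces here matches the asker's pattern (hundreds of hits, well under 1 MB transferred) and is worth checking against known bot user agents before banning.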

    Read the article

  • WouldISurviveANuke Assesses Your Distance From Nuclear War Strike Sites

    - by Jason Fitzpatrick
    WouldISurviveANuke is a morbid Google Maps mashup that plots out the effective radius of nuclear weapons on major metropolitan areas, your distance from them, and your chances of survival. Visit the site, plug in your ZIP code, and set the parameters (how big a nuclear weapon, and how large the nearest target city needs to be) to find out if you're in the blast radius. We plugged in a downtown address in Detroit, MI. The verdict? Neither we nor the cockroaches will be coming out alive. If you plug in a location far enough away from the direct blast radius, you'll also get a quality-of-life report that spells out the effects of a local nuclear strike. As far as startling anti-nuclear-proliferation arguments go, WouldISurviveANuke is an effective and interactive demonstration. Hit up the link below to try it out. WouldISurviveANuke [via Y! Tech]

    Read the article

  • Azure Web Sites FTP credentials

    - by Bertrand Le Roy
    A quick tip for all you new enthusiastic users of the amazing new Azure. I struggled for a few minutes finding this, so I thought I'd share. The Azure dashboard doesn't seem to give easy access to your FTP credentials, and they are not the login and password you use everywhere else. What Azure does give you, though, is a publish profile that you can download. This is a plain XML file that should look something like this:

        <publishData>
          <publishProfile profileName="nameofyoursite - Web Deploy" publishMethod="MSDeploy"
              publishUrl="waws-prod-blu-001.publish.azurewebsites.windows.net:443"
              msdeploySite="nameofyoursite" userName="$NameOfYourSite"
              userPWD="sOmeCrYPTicL00kIngStr1nG"
              destinationAppUrl="http://nameofyoursite.azurewebsites.net"
              SQLServerDBConnectionString="" mySQLDBConnectionString=""
              hostingProviderForumLink="" controlPanelLink="http://windows.azure.com">
            <databases/>
          </publishProfile>
          <publishProfile profileName="nameofyoursite - FTP" publishMethod="FTP"
              publishUrl="ftp://waws-prod-blu-001.ftp.azurewebsites.windows.net/site/wwwroot"
              ftpPassiveMode="True" userName="nameofyoursite\$nameofyoursite"
              userPWD="sOmeCrYPTicL00kIngStr1nG"
              destinationAppUrl="http://nameofyoursite.azurewebsites.net"
              SQLServerDBConnectionString="" mySQLDBConnectionString=""
              hostingProviderForumLink="" controlPanelLink="http://windows.azure.com">
            <databases/>
          </publishProfile>
        </publishData>

    The FTP profile (the second publishProfile element) contains the FTP server name, user name, and password you need. This is what to use in FileZilla or whatever you use to access your site remotely. Notice how the password looks encrypted. It isn't, in fact: this is your password in clear text. It's just crypto-random gibberish, which is the best kind of password.

    UPDATE: About 2 minutes after I posted this, David Ebbo mentioned to me on Twitter that if you've configured publishing credentials (for Git, typically), those will work too. Don't forget to include the full user name though, which should be of the form nameofthesite\username. The password is the one you defined.

    That's it. Enjoy.
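    If you'd rather not fish the values out by eye, here is a minimal sketch in Python that pulls the FTP endpoint and credentials out of the downloaded profile; the file name is hypothetical, and the attribute names are taken from the XML shown above:

        import xml.etree.ElementTree as ET

        # Hypothetical file name; use the publish profile you downloaded.
        tree = ET.parse("nameofyoursite.PublishSettings")

        # Pick out the FTP profile and print the three values you need.
        for profile in tree.getroot().iter("publishProfile"):
            if profile.get("publishMethod") == "FTP":
                print("server:  ", profile.get("publishUrl"))
                print("user:    ", profile.get("userName"))
                print("password:", profile.get("userPWD"))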

    Read the article

  • Best way to track multiple sites with Google Analytics

    - by stevether
    I currently have 63 websites (and counting) that I'm tracking on one Google Analytics account, and I'm starting to realize... this is becoming a bit cumbersome. What's the best way to collect traffic data in bulk? Are there other resources out there that are better suited for this task? Does Google offer a bulk option for this kind of thing? Would it be better to make separate Analytics accounts? I'm just wondering if anyone else has found a better solution than manually setting up all these accounts and tracking codes when it comes to large-scale management.
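    One low-tech pattern for the tracking-code half of the problem (not the reporting half) is to template the snippet once and render it per site. A minimal sketch in Python; the host-to-property mapping is hypothetical, and the snippet is the standard ga.js async tracking code with only the property ID substituted:

        # Hypothetical mapping of hostnames to Analytics web property IDs.
        SITES = {
            "example-one.com": "UA-1234561-1",
            "example-two.com": "UA-1234561-2",
        }

        # Standard ga.js async snippet with the property ID templated in.
        SNIPPET = """<script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', '%(property_id)s']);
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>"""

        # Write one include file per site, e.g. for a server-side include.
        for host, property_id in sorted(SITES.items()):
            with open("ga-%s.html" % host, "w") as f:
                f.write(SNIPPET % {"property_id": property_id})

    This keeps one canonical copy of the tracking code across all sites, though it does nothing about viewing 63 sites' reports in one place.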

    Read the article

  • Digital "Post It" notes for organizing content of sites/pages

    - by Alex
    We're restructuring our old intranet into a new one and are going through each site to find content and apply our new standard structure and look-and-feel. Can you recommend a tool for "digital Post-It" notes? It would provide a way to type some items on a "card" and then move it around and organize things quickly. Also, if you know of tools in general for this kind of task, please advise. Thank you.

    Read the article

  • Midori displays mobile sites

    - by Tigull
    I've installed Midori to work in an Elementary mockup. Midori works just fine, but on a few domains it opens mobile websites instead of the regular pages. As of now, this happens with gmail.com, facebook.com, paginegialle.it, and others; yet other websites that do have a mobile version, such as gazzetta.it, show the regular version. I couldn't find a solution through the settings, and I didn't find any mention of this problem on forums or elsewhere, so any help would be greatly appreciated.

    Read the article

  • Rating Sites Development with ASPDOTNET

    Rating people, their skills, their abilities, their looks, and so on is a very old activity in human history; such rating systems date back at least to the 19th century. The best use of rat... [Author: Jessica Woodson - Computers and Internet - May 10, 2010]

    Read the article

  • Java plugin is not working on some sites

    - by Jay
    I have icedtea6-plugin installed on my Ubuntu box. The weird thing is, when I am browsing the internet, certain websites are able to use Java and some are not. For instance, I am able to log in to my bank at nordea.dk, which uses a Java applet to authenticate the user. But when I try to use keepvid.com, it says "Loading Java Applet", and then after a minute or two it says "Error: Please click here to download Java. If you already have Java, please restart your browser and try again." The thing is, I've restarted my computer and closed and reopened my browser (Chromium), and none of it seems to help. Could someone please point me in the right direction to solve this problem? Thanks.

    Read the article

  • SEO For Ecommerce Sites

    Ecommerce is a fancy name for just about any business that can be conducted via the internet or other computer systems. The global internet explosion has changed the face of the ecommerce industry for good, increasing both potential profits and the ferocity of competition across virtually every consumer market.

    Read the article

  • How to Change How Long Internet Explorer Keeps a List of Sites You Have Visited

    - by Taylor Gibb
    There is a handy feature in most modern browsers that allows you to go back and see what pages you have visited on a particular day. But what if you don’t want your browser to keep track of your browsing history? Here’s how to disable it.

    Read the article

  • Recommended online training sites on software development

    - by liortal
    I am looking for an online training site that provides courses on software development topics. Subjects needed for my work include .NET, general object-oriented principles, design patterns, unit testing, and continuous integration, but are not limited to these. I have tried Pluralsight, which was nice; however, I am not sure that a videos-only format is sufficient (for me at least). Are there any other training companies that provide online courses in other formats that you found useful (regarding .NET, but not limited to it)? Thanks

    Read the article

  • Social network: Display your Developpez badge, linked to your profile, on your sites, blogs, and networks

    Dear club members, today we are offering you the ability to create a badge tied to your forum user profile. You can easily display your profile on your website, your blog, or any other site. Here's a way to create your own Developpez identity and make it known. This badge comes in two versions: a "web" version, used by copy-pasting a small piece of HTML code where you want it, and an "image" version, usable anywhere on the internet. Here is the rendering of the image version for my own profile: [IMG]http://www.developpez.com/ws/badgeimg?user=29897[/IMG] ...

    Read the article

  • SEO and external sites that serve responsive images (like Re-SRC)

    - by Baumr
    Re-SRC is a tool that allows you to automatically serve responsive images for your website from their cloud servers. It delivers a new image file each time the browser window (viewport) is resized. To use it in your HTML when linking to an image, you would do the following:

        <img src="http://app.resrc.it//www.your-domain.com/img/img001.jpg"/>

    Some more background for SEO considerations: as an example, looking at their demo page's code, the src of the Arc de Triomphe photo, when the browser window is resized to tablet width, shows this particular file at its widest. It is found under the following URL:

        http://app4-uk.resrc.it/s=w560,pd1/ro=h//www.resrc.it/img/demo/demo-image-1.jpg

    If the viewport is increased to desktop width, then a smaller image is served in line with the design; see this URL:

        http://app4-uk.resrc.it/s=w320,pd1/ro=h//www.resrc.it/img/demo/demo-image-1.jpg

    If I change the viewport to be about halfway between those two, then the image's URL is:

        http://app4-uk.resrc.it/s=w240,pd1/ro=h//www.resrc.it/img/demo/demo-image-1.jpg

    In other words, I found that there is a separate file for every 10-pixel increment of the image width. Very cool for saving bandwidth on mobile devices and serving responsive/retina images on others, but here are two problems I see for SEO. First, the img on your site, part of your semantic markup, will not be hosted on your site at all, or even on a server you control, so any links to these images will pass on "link juice" to Re-SRC's site instead. Second, you are serving a vast array of different image files to different people; some may link to one size, others to another. Then there's the question of what different search engine crawlers will see. Also: there seems to be no fallback option if their servers are down. Do you see any other concerns? Or, perhaps, do you not see those as concerns?
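    To make the crawler's-eye view concrete, here is a minimal sketch in Python that enumerates the URL variants a crawler could encounter for that one demo image; the host, path template, and 10-pixel step are inferred from the three example URLs above and may not reflect Re-SRC's full addressing scheme:

        # Width-parameterized URL pattern observed on the Re-SRC demo page.
        TEMPLATE = ("http://app4-uk.resrc.it/s=w{width},pd1/ro=h"
                    "//www.resrc.it/img/demo/demo-image-1.jpg")

        # One file per 10-pixel increment, as observed above.
        for width in range(240, 570, 10):
            print(TEMPLATE.format(width=width))

    Thirty-three distinct URLs for a single photograph illustrates the duplication concern: inbound links and crawler indexes can scatter across all of them.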

    Read the article

  • What books/references are recommended on the subject of planning and developing efficient web sites [closed]

    - by Shakil
    Once I visited a site containing videos of a well-known web developer creating a site from scratch, covering planning (on paper and in software), management, design, and then development. I bookmarked the site but am unable to find it now. My question is: how do you do web development effectively? What books or videos are recommended? (I tried Google but was unable to find useful books or videos.) I want to learn how people do it. Can you share resources (books, videos, links) about this? Thanks in advance. Note: I created a job site for my university project, and it gave me huge pain. That's why I want to learn an efficient way. I know HTML, CSS, JavaScript, jQuery, PHP (still learning MVC and frameworks), and phpMyAdmin.

    Read the article

  • Restricting access to sites

    - by Paul
    I'm having some problems configuring my local proxy server so that it restricts access to certain websites. The proxy server I'm using is Squid; I edited its configuration file, found at /etc/squid/squid.conf, to include the following:

        acl wikipedia dstdomain .wikipedia.org
        http_access deny wikipedia

    I tried to redirect ELinks to use Squid. According to Squid's config file, it listens on port 3128, so in /etc/elinks/elinks.conf I added the following:

        set protocol.http.proxy.host = "localhost:3128"

    I also restarted Squid with sudo /etc/init.d/squid restart, but I can still access the banned websites using ELinks. What did I do wrong?

    Read the article

  • Multilingual sites and Google search results, using sub-folders for language

    - by AWinter
    About three months ago we added an English version of our previously Japanese-only site under the subfolder /en/. We've tried to follow the sometimes incomplete best practices laid out by Google by adding alternate tags to all pages that are currently translated. The top page, for instance, has the following meta tags for language:

        <link rel="canonical" href="/">
        <link rel="alternate" hreflang="ja" href="/">
        <link rel="alternate" hreflang="en" href="/en/">

    while the English main page under /en/ has:

        <link rel="canonical" href="/en/">
        <link rel="alternate" hreflang="ja" href="/">
        <link rel="alternate" hreflang="en" href="/en/">

    Alternate languages are set up in the sitemap (as per Google's recommendations). It seems, however, that Google absolutely refuses to show the English top page in results when the user searches in English at google.com: as of this post, you'll get the Japanese description and a title that Google has apparently invented, instead of the title and description in the meta tags for the /en/ index page. Does anyone have any experience with subfolders actually working to affect search results? What are the best practices for ensuring that the correct language version of my website is displayed by Google and other search engines? And how long will it take before the new language version becomes prominent in search engine results?

    Read the article

  • Increasing the Ranking of Sites

    In today's markets the most efficient way to advertise a business is through the internet, which continues to attract an increasing number of users; therefore more and more businesses have websites that show everything the organization has to offer its customers. However, some businesses still don't understand the importance of promoting their business using SEO. There are numerous companies that offer businesses a free site analysis, whereby the business owner is given a step-by-step guide on what would work best with the site the business has.

    Read the article

  • 45 Different Services, Sites, and Apps to Help You Read Your Favorite Sites (Like How-To Geek)

    - by Eric Z Goodnight
    Ever wonder how geeks stay connected with their favorite blogs and writers? Read on to learn about RSS feeds and how easy they are to use with these 45 apps, services, and websites that can help you stay current. Note: of course, our more geeky readers are going to understand a lot of this already, which is why we included 45 great services that you might not have heard about before. Keep reading for more, or give your advice to the newbies in the comments.

    Read the article
