Search Results

Search found 27592 results on 1104 pages for 'google sites'.

  • Installing WordPress with WebMatrix 2

    - by The Official Microsoft IIS Site
    If you’re getting started with Windows web development, or you just need a lightweight web development tool, then check out Microsoft’s WebMatrix 2. Creating, deploying, and maintaining web sites has never been easier, and considering it’s free, you can’t beat it. What I like about WebMatrix is that it allows you to install third-party products such as blogs or forums from the App Gallery. I needed to create a new WordPress blog so that I could test a few things without impacting my production...(read more)

    Read the article

  • ASP.NET MVC Cookbook - public review

    - by asiemer
    I have recently started writing another book. The topic of this book is ASP.NET MVC. This book differs from my previous book in that, rather than working towards building one project from end to end, it demonstrates specific topics from end to end. It is a recipe book (hence the "cookbook" name) and will be part of the Packt Publishing cookbook series. An example recipe in this book might be how to consume JSON, create a master/details page, build jQuery modal popups, or write custom ActionResults. Basically, anything recipe-oriented around the topic of ASP.NET MVC might be acceptable. If you are interested in helping out with the review process, you can join the "ASP.NET MVC 2 Cookbook-review" group on Google here: http://groups.google.com/group/aspnet-mvc-2-cookbook-review Currently the suggested TOC for the project is listed. Also, chapters 1, 2, and most of 8 are posted; chapter 5 should be available tonight or tomorrow. In addition to reporting any errors that you might find (much appreciated), I am very interested in hearing about recipes that you want included, expanded, or removed (as being redundant or overly simple). Any input is appreciated! Hearing user feedback after the book is complete is a little late, in my opinion (unless it is positive feedback, of course). Thank you!

    Read the article

  • Find The Best Keywords For Your Site

    Keyword optimisation is probably the most important thing to concentrate on with regard to search engine optimisation (SEO). Unfortunately, not many people know this, or do enough to optimise their sites' keywords.

    Read the article

  • Uniform url in different device

    - by yanglifu90
    I noticed that almost all of Stack Exchange's sites use the same URL in a mobile browser. I think this is cool, because when I share something from my phone, people viewing the link will not see a mobile webpage on their desktop. What is this specification called by the W3C? How do I find other websites that use this technique? I noticed that Ars Technica and the Telegraph also use the same URL as their desktop versions.

    Read the article

  • Track ping, download and upload daily

    - by euDennis
    I'm having some problems with my internet connection: it oscillates, and sometimes that causes sites to come back with a "Not Found" page. This doesn't happen all the time, just at random times during the day. My question is: is there any tool to monitor this basic information (ping, upload and download) daily, so I can build a report and check the oscillations? Because if someone from the internet provider comes to my house, they probably won't see the oscillations. Thanks, bye
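
    As an illustration of the kind of daily logging being asked about (no specific tool is named in the question), here is a minimal Python sketch that appends TCP connect latency and a rough download speed to a CSV file. The host list, test URL, and log file name are placeholders; a scheduler such as cron would run it on whatever interval suits the report, and measuring upload would need an endpoint you control, so it is left out here.

        #!/usr/bin/env python3
        """Rough connection logger: append latency and download speed to a CSV file."""
        import csv
        import socket
        import time
        import urllib.request
        from datetime import datetime

        HOSTS = ["google.com", "example.com"]      # hosts to measure (placeholders)
        TEST_FILE = "http://example.com/1MB.bin"   # placeholder download URL
        LOG_FILE = "connection_log.csv"

        def connect_time(host, port=80, timeout=5):
            """Seconds to open a TCP connection (a rough stand-in for ping), or None."""
            start = time.monotonic()
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return round(time.monotonic() - start, 3)
            except OSError:
                return None

        def download_speed(url, timeout=60):
            """Approximate download speed in KiB/s, or None on failure."""
            start = time.monotonic()
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    size = len(resp.read())
                return round(size / 1024 / (time.monotonic() - start), 1)
            except OSError:
                return None

        if __name__ == "__main__":
            row = [datetime.now().isoformat(timespec="seconds")]
            row += [connect_time(h) for h in HOSTS]
            row.append(download_speed(TEST_FILE))
            with open(LOG_FILE, "a", newline="") as f:
                csv.writer(f).writerow(row)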

    Read the article

  • meeting availability using iPhone/iOS calendars

    - by Jaymie
    Our management team all use iPhones with the built-in iOS calendar app. We're looking into getting Microsoft Exchange for everyone here, but that will take us some weeks to plan and roll out across the enterprise. In the meantime, I need to provide something so that group meeting availability can be determined from those calendars. Google Calendar would be ideal ("find a time") if I could find a way to link to or export the iCloud calendars, but Apple doesn't seem to want to allow that. I could ask the management team to recreate their events for the next few weeks in individual Google Calendars, but they won't want to do that, and it's a wasteful rekeying of data. Any suggestions you can provide will be gratefully received.

    Read the article

  • How to check if your email server looks like a spam source

    Zona-M: "When you start doing it, though, you soon find out that the hardest, or at least the least documented, task is not how to send email or how to block spam. It is how to make sure that the email you send is always accepted by other sites; that is, how to find out whether your email server looks like a spam source."

    Read the article

  • How do I prevent ISPs from killing downloads of files in mid-transfer?

    - by Gorchestopher H
    I run a small website with a few users and low traffic, mostly to share personal mp3 files with a small community. Depending on their ISP, my users can't always download or stream larger files; by larger I mean larger than 1 MB. Essentially either the host stops sending or the client stops receiving: one of the links along the connection chain simply ends its connection before the transfer completes. Trace-route shows no connection issues, and there are no problems with short transfers that take no more than a few seconds. It's the transfers lasting around 10 seconds that just end. Even a straight download from a direct link can fail this way if you have the wrong ISP. Strangely enough, this is most common with users whose ISPs are essentially independent providers that buy service via a fiber link. Unfortunately these providers aren't very knowledgeable, are unable to do any testing, and insist it's a problem with the host. I have gotten my host to move my site to different servers of theirs, to the same effect. Nearly identical sites (affiliate sites, actually) experience no such issue. What can I do to troubleshoot this matter further? How can I prove that someone is dropping the ball, and identify who that party is? Can I do a 5 MB traceroute?

    EDIT: Maybe I can clear up some misconceptions in my question. The files are not very large; they are simply over 2 MB. The users do not have "slow" connections; they are at least 5 Mbps. This "time out" happens very quickly, in the realm of 5 seconds, so I don't know if it's a timeout or not; the user often gets 1 or 2 MB in that window. I have tried streaming with a Flash player, saving the target, forcing the download, and letting the browser stream the file. I have tried different browsers (Firefox, IE, Chrome). Users are able to download identical files when they are on different hosts.
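
    One way to gather evidence about where these transfers die is to download a failing file in fixed-size chunks and record exactly how many bytes arrive before the connection drops. The following is a minimal Python sketch of that idea, not something from the article; the URL is a placeholder for one of the failing mp3 files, and running it from both an affected and an unaffected ISP makes the comparison.

        #!/usr/bin/env python3
        """Chunked-download probe: report how many bytes arrive before a transfer dies."""
        import http.client
        import time
        import urllib.request

        URL = "http://example.com/files/sample.mp3"  # placeholder: a file that fails
        CHUNK = 64 * 1024                            # read 64 KiB at a time

        received, expected = 0, "unknown"
        start = time.monotonic()
        try:
            with urllib.request.urlopen(URL, timeout=30) as resp:
                expected = resp.headers.get("Content-Length", "unknown")
                while True:
                    chunk = resp.read(CHUNK)
                    if not chunk:
                        break
                    received += len(chunk)
        except (OSError, http.client.HTTPException) as exc:
            print(f"connection dropped: {exc!r}")

        elapsed = time.monotonic() - start
        print(f"received {received} of {expected} bytes in {elapsed:.1f} s")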

    Read the article

  • Wireless slow, but very odd

    - by Logman
    I have an HP Pavilion dv6 laptop, and internet access over the wireless connection times out even though it connects fine to the AP/router. The computer had a virus (searchnu?), so I backed up the usual files and restored the laptop to the factory image. The problem is that after the restore, the internet over the wireless connection is still the same, while over a wired connection everything worked fine: I was able to update the whole system, the internet worked perfectly, and the speed was great. But the wireless is still pitiful. Ping tests over wireless are interesting:

        Google    = 18 ms
        Gmail     = 18 ms
        Yahoo     = 1023 ms
        8.8.8.8   = 30 ms
        Microsoft = Request Timed Out
        Bing      = Request Timed Out
        MSN       = Request Timed Out

    Even though I get 18 ms with Google, the pages take a long time to load. Is this a rootkit? Is it the wireless card in the laptop?
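
    Since pings to Google are fast but pages still load slowly, one useful check is to separate name-resolution time from connection time; slow or failing DNS often produces exactly this pattern. The following minimal Python sketch times the two steps independently (the host list is illustrative, not taken from the question).

        #!/usr/bin/env python3
        """Split "slow pages" into DNS lookup time versus TCP connect time."""
        import socket
        import time

        HOSTS = ["www.google.com", "www.yahoo.com", "www.microsoft.com", "www.bing.com"]

        for host in HOSTS:
            t0 = time.monotonic()
            try:
                addr = socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)[0][4][0]
            except OSError as exc:
                print(f"{host:22} DNS failed: {exc}")
                continue
            dns_ms = (time.monotonic() - t0) * 1000

            t1 = time.monotonic()
            try:
                with socket.create_connection((addr, 80), timeout=5):
                    tcp_ms = (time.monotonic() - t1) * 1000
                print(f"{host:22} dns={dns_ms:6.1f} ms  connect={tcp_ms:6.1f} ms  ({addr})")
            except OSError as exc:
                print(f"{host:22} dns={dns_ms:6.1f} ms  connect failed: {exc}")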

    Read the article

  • Where would you implement the code to make a full screen webpage [on hold]

    - by Derek Drummond
    This will be my first time creating a website from the ground up, and I would like to get some insight on how to implement a full-screen site, as well as some problems that may arise from it. I really like the design and layout of sites like uCast and spree. Since I am using ASP.NET, would this be implemented in the master page, or in the .aspx file for each specific page on the site?

    Read the article

  • Revolutionize Your Business Using SEO

    A lot of new sites and businesses emerge on the internet every single day. Making your website easily visible, preferably on the first page of search engine results, is a must if you want your business not just to survive but to succeed. Search engine optimization is one sure way to keep your site and business in good standing in spite of the competition.

    Read the article

  • Get Your Web Site to the Top

    Search engine optimization began to be used widely in the 1990s. Businesses, whether their annual turnover was in the millions bracket or the thousands bracket, began to use SEO to attract more organic visitors to their sites.

    Read the article

  • How to give specific url using htaccess? [on hold]

    - by Dash
    I am a web developer using CodeIgniter. I want to give a specific URL to certain pages on my website. Is it possible using .htaccess? I visited the following sites but couldn't find anything like that there: Bluehost, Tutplus, and some others too. What I really want to do is this: when the admin is logged in, the link should be http://localhost/admin-ci/index.php/admin/index.php/dashboard, and if a user logs in, the link should be http://localhost/admin-ci/user/index.php/dashboard. Will .htaccess be able to do this?

    Read the article

  • Make a Website Today and Expand Your Company

    Making a website with today's web tools is much easier than before. There are currently web hosting sites and website building services available online, all of which are dedicated to helping you make a website of your own. In just a few minutes, you can make a website of your own without too many hassles and complications.

    Read the article

  • Benefits of Website Localization

    A website is the key to acquiring wide exposure, or a mass reach, in the web market. Businesses that aim to expand and scale new heights require a strong web presence and a reach to their target audience. Website localization is the process most sought after by businesses to attain a firm international presence, as it helps in promoting sites in various languages.

    Read the article

  • When load balancing, must all copies of static web page be exactly the same?

    - by Gilles Blanchette
    I am used to getting answers for everything on the web, but not this time... Yesterday I enabled Amazon's weighted DNS functionality to load-balance 7 websites between two different IP addresses (split 50%-50%). Both servers run IIS 8.5, and the sites run well on both sides. Today I found out that Google Webmaster Tools is reporting fetch errors for the robots.txt file, on close to 50% of access attempts. The robots.txt file is fine and accessible on both servers (even via Google's URL testing page). Let's say the current version of the static web pages is on the first computer and an updated version of the same web pages is on the second computer. Can that be the problem? When load balancing, can static web pages be slightly different from one host server to the other? Thank you for your help.
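
    A quick way to check whether the two backends really serve the same robots.txt is to request it from each server's IP directly, bypassing the weighted DNS record. Below is a minimal Python sketch of that check; the host name and the two IP addresses are placeholders for the questioner's own site and servers, and it assumes the file is reachable over plain HTTP.

        #!/usr/bin/env python3
        """Fetch robots.txt from each backend directly and compare the responses."""
        import hashlib
        import urllib.request

        SITE = "www.example.com"                     # placeholder host name
        BACKENDS = ["203.0.113.10", "203.0.113.20"]  # placeholder server IPs

        for ip in BACKENDS:
            req = urllib.request.Request(f"http://{ip}/robots.txt",
                                         headers={"Host": SITE})
            try:
                with urllib.request.urlopen(req, timeout=10) as resp:
                    body = resp.read()
                digest = hashlib.sha256(body).hexdigest()[:12]
                print(f"{ip}: HTTP {resp.status}, {len(body)} bytes, sha256 {digest}")
            except OSError as exc:
                print(f"{ip}: request failed: {exc}")

    If the two digests differ, or if one backend fails intermittently, that would line up with the roughly 50% error rate Webmaster Tools reports.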

    Read the article

  • Browser keeps being really rude to me today

    - by j-t-s
    Hi all. I've had this problem only once before, years ago. I bought a new computer the other day, and last night I visited a website which Google Chrome flagged as an insecure site. I proceeded to view the page anyway (stupid, I know... but I was curious), and all of a sudden the window closed. Ever since, every few minutes either Google Chrome or Internet Explorer pops up with random websites, most of which are porn-related. I have downloaded ZoneAlarm, IObit 360, and ESET Smart Security, and none of them reported any problems, yet I still have the rude browser problem. Can somebody please suggest any software or other ways to fix this? (Other than reformatting, please :) ) Thank you :)

    Read the article

  • Is it possible to use software raid in Windows 7 on the boot partition?

    - by DoctaJonez
    I want to use RAID 1 on my workstation configuration at work, and I've been looking at using the built-in mirror functionality in Windows 7. When you click on the "add mirror" option, it presents you with a warning. I've done some Google searching, and the consensus seems to be that you cannot boot from a dynamic volume, but some forum posts seem to indicate that people have tried this with success (e.g. here). With Google searches producing contradictory information, I thought I'd ask you guys for an authoritative answer. Can I use the built-in Windows 7 mirroring for my boot partition? Or, as I suspect, will it make the partition unbootable due to the disk being converted to a dynamic disk?

    Read the article

  • What do I do for dependencies installing wine1.7 on 14.04

    - by user285207
    I'm trying to install wine1.7 on Ubuntu 14.04 64-bit and I'm not sure what this means; help is greatly appreciated.

        user@chrubuntu:~$ sudo apt-get install wine1.7
        [sudo] password for user:
        Sorry, try again.
        [sudo] password for user:
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Some packages could not be installed. This may mean that you have
        requested an impossible situation or if you are using the unstable
        distribution that some required packages have not yet been created
        or been moved out of Incoming. The following information may help
        to resolve the situation:

        The following packages have unmet dependencies:
         wine1.7 : Depends: wine1.7-i386 (= 1:1.7.19-0ubuntu2~trusty2) but it is not installable
                   Recommends: gnome-exe-thumbnailer but it is not going to be installed or
                               kde-runtime but it is not going to be installed
                   Recommends: ttf-mscorefonts-installer but it is not going to be installed
                   Recommends: fonts-horai-umefont but it is not going to be installed
                   Recommends: fonts-unfonts-core but it is not going to be installed
                   Recommends: ttf-wqy-microhei
                   Recommends: winetricks but it is not going to be installed
        E: Unable to correct problems, you have held broken packages.
        user@chrubuntu:~$

    I already ran sudo apt-get update and got this:

        Reading package lists... Done
        W: Duplicate sources.list entry http://dl.google.com/linux/chrome/deb/ stable/main amd64 Packages (/var/lib/apt/lists/dl.google.com_linux_chrome_deb_dists_stable_main_binary-amd64_Packages)
        W: You may want to run apt-get update to correct these problems

    So I run apt-get update and get:

        E: Could not open lock file /var/lib/apt/lists/lock - open (13: Permission denied)
        E: Unable to lock directory /var/lib/apt/lists/
        E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied)
        E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?

    This is all very stressful because I have been trying to get Wine for the past week, had to reinstall, and IT STILL WON'T WORK.

    Read the article

  • Tagging does not work with the Subversion plugin.

    - by mark
    I have exactly the same problem as the fellow in this post - http://jenkins.361315.n4.nabble.com/Tag-this-build-not-working-subversion-td384218.html - except that I use build 1.413. Unfortunately, the post does not provide any workarounds except downgrading to 1.310 (from 1.315). I would gladly provide the logs if I knew the logger names. Please help. P.S. I have posted this issue both on the Jenkins issue tracker - https://issues.jenkins-ci.org/browse/JENKINS-9961 - and in the respective Google group - https://groups.google.com/d/topic/jenkinsci-users/4UVKFxXA9Jo/overview - to no avail, so this site is my last hope; thanks to all in advance. EDIT: Upgraded to 1.417; tagging still does nothing.

    Read the article

  • Squid throws error, The requested URL could not be retrieved

    - by Supratik
    Hi, sometimes I am getting the following error:

        The requested URL could not be retrieved

        While trying to retrieve the URL: http://groups.google.com/

        The following error was encountered:

            Unable to determine IP address from host name for groups.google.com

        The dnsserver returned:

            Refused: The name server refuses to perform the specified operation.

        This means that the cache was not able to resolve the hostname presented
        in the URL. Check if the address is correct.

        Your cache administrator is root.

    What could be the reason for the above error? Regards, Supratik
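
    To confirm whether the "Refused" answer comes from the DNS server itself rather than from Squid, the resolver can be queried directly from the Squid host. The sketch below uses the third-party dnspython package (2.x API); the nameserver address is a placeholder for whatever is configured in /etc/resolv.conf on that machine.

        #!/usr/bin/env python3
        """Ask the resolver Squid uses directly, to see who is refusing the lookup."""
        import dns.exception
        import dns.resolver  # pip install dnspython

        NAMESERVER = "192.0.2.53"   # placeholder: the resolver the Squid host uses
        NAME = "groups.google.com"

        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [NAMESERVER]
        resolver.lifetime = 5       # overall timeout in seconds

        try:
            answer = resolver.resolve(NAME, "A")
            print(f"{NAME} resolves to:", ", ".join(rr.address for rr in answer))
        except dns.resolver.NoNameservers as exc:
            # Raised when every configured nameserver failed, e.g. answered REFUSED.
            print(f"all nameservers failed for {NAME}: {exc}")
        except dns.exception.DNSException as exc:
            print(f"lookup failed for {NAME}: {exc}")

    If this script is also refused, the DNS server's access policy (for example, an allow list that does not include the Squid box) is the place to look, rather than the Squid configuration.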

    Read the article

  • How to Build Quality Back Links For Your Site

    Search engine optimization, known as SEO, relies on high-quality back links. In this regard, one should always know that their sites will rank higher in the search engines if they have quality back links, which is to say that one will receive organic traffic to their site from those who may be searching for what you're offering. To help you get quality back links, the following tips can help you.

    Read the article
