Search Results

Search found 25091 results on 1004 pages for 'mobile friendly website'.

Page 160/1004

  • Online PIM Needed: EverNote + Cozi = ???

    - by Shoeless
    I am looking for an online solution with free-form note-taking ability like OneNote or EverNote, and also a robust calendaring system (tasks, repeating appointments, notifications). Cozi has a great calendar but not much else... EverNote lacks the calendar side of things. Scrybe looks very promising on both fronts but is by invitation only. Oh yeah... it should be free, too :D Am I SOL?

    Read the article

  • Block a URL at browser level

    - by Farseeker
    Does anyone have a solution (that doesn't involve editing the hosts file) to block a particular URL from Firefox? The basic back story is that I'm trying to discipline myself. I'm spending FAR too much time over at Server Fault, so I want to genuinely block the site from my work PC so that every time I find myself flicking to it during work time, I can't see it. However, I'd like to be able to disable the block during my lunch break (so I only spend 40 minutes a day there, rather than 4 hours). That said, I don't want to block it at the router, nor for anyone else.

    Read the article

  • What WYSIWYG software can I use to create a web page?

    - by Roman
    I have always made web pages by writing HTML code, but now I would like to try a WYSIWYG approach. Can anybody recommend a program I can use for that? I mean a program in which you can move buttons, tables, and pictures with the mouse, change their size and shape with the mouse, and use nice templates for blocks of text, buttons, backgrounds, and so on. I am using Windows 7. Maybe I already have something pre-installed?

    Read the article

  • Where can I collaborate with my friend on source code in real time?

    - by Carson Myers
    I mean, other than a conference room :) Using Google Docs, I can upload any kind of file, view it with other people, and watch them edit it in real time, with a live chat happening in the same window. This is awesome. How can I do the same thing with source code? I'm looking for a web application where I can upload source files that will be displayed in some kind of editor, with syntax highlighting, and let others view and edit them in real time. Preferably with a live chat as well, but that's not necessary. Does anybody know where I can find this?

    Read the article

  • Can't access some websites with any browser

    - by Charles Kingsmill
    I'm running Windows 7 64-bit on a new Samsung laptop and accessing the internet fine via an ethernet cable to my university's ISP. Some sites work fine (e.g. google.com) but I can't access others at all (microsoft.com, topshop.com). I can't connect to those sites in safe mode with networking, and ping and tracert both fail. There's no proxy. Other users can connect successfully to these sites using my cable and socket. I've tried all of the following with no success: using various browsers (IE9, FF, Chrome); creating a new user; updating drivers; clearing the DNS cache; using OpenDNS and Google's DNS; turning off Avast; tweaking the MTU; running the MS malicious software removal tool; running Spybot S&D; reviewing the hosts file; disabling the IPv6 options; repairing/resetting winsock settings; disabling advanced javascript options. I have run out of ideas... can anyone see anything I've missed?

    Read the article

  • XPath automation software

    - by holms
    Sadly that topic was closed, but I have more or less the same question. I want to construct XPaths for a common HTML block that appears across pages. For example: you give the software two URLs that contain the SAME HTML blocks (divs) but with different content in them; given two stackoverflow.com URLs, the software could detect that the same div#id is used on both pages and simply output the XPaths of those HTML blocks. Of course I can find the XPaths myself (as far as I remember, Firebug makes it easy and shows the XPath of every HTML block), but that is a tedious procedure if you want XPaths for LOTS of HTML elements, which is why I want software to help with this routine.
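
    In the absence of a ready-made tool, here is a minimal sketch of one way to do this yourself: fetch the two pages and report the XPaths of elements whose structural position exists in both documents and that carry an id or class. This is an assumption-laden illustration, not an existing product; it assumes Python with the requests and lxml packages installed, and the two URLs are placeholders.

    ```python
    # Sketch: print XPaths of blocks that occur at the same structural position
    # in two pages, even though their text content differs.
    # Assumes `pip install requests lxml`; the URLs below are placeholders.
    import requests
    from lxml import html

    def shared_xpaths(url_a, url_b):
        tree_a = html.fromstring(requests.get(url_a).content)
        tree_b = html.fromstring(requests.get(url_b).content)
        paths_b = {tree_b.getroottree().getpath(el) for el in tree_b.iter()}
        common = []
        for el in tree_a.iter():
            path = tree_a.getroottree().getpath(el)
            # Keep the block if the same path exists in the second page and the
            # element is identifiable by id or class.
            if path in paths_b and (el.get("id") or el.get("class")):
                common.append(path)
        return common

    if __name__ == "__main__":
        for xpath in shared_xpaths("https://stackoverflow.com/questions/1",
                                   "https://stackoverflow.com/questions/2"):
            print(xpath)
    ```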

    Read the article

  • block certain websites from browser

    - by phunehehe
    Hello there, a friend of mine (who is not a geek) asked me how to stop her little brother from playing web games on her computer. She is currently using Chrome and IE, and I have never done this before, even on Firefox. I would prefer a solution that is simple and does not require additional applications. Although it seems unlikely, is there a solution that works for all browsers (i.e. do it once and never have to fix it again for a new browser)? Thanks.

    Read the article

  • My DNS cannot resolve a website address

    - by ipkiss
    Hello all, recently I have not been able to access the webpage bbc.co.uk anymore, while I can access other websites smoothly. At first, I thought there might be some problem with my laptop. However, if I use my laptop through my company network, I can load the page bbc.co.uk normally. Then I thought maybe my home ADSL blocks that web address, but I tried another laptop on my home ADSL and it loads the page bbc.co.uk very fast. Now I do not know what the problem could be. Can anyone tell me, please? Thank you.

    Read the article

  • Command-line HTTP crawler for Windows?

    - by Pekka
    Would somebody have a recommendation for a web site crawler that can be invoked and equipped with settings from the command line? This would need to run in a Windows environment. Saving the data, following stylesheet links etc. is not an issue. I only need the crawler to start with a page, parse it, and follow all the links on the same domain so that in the end, all pages on the site have been requested once. Background: I'm setting up a web site that gets frequently uploaded from an office location. Combining data from various sources, it has several levels of caching. I don't want the first user to visit the site after a fresh upload to have to wait until the page has been generated and saved in the cache.
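
    If no ready-made command-line crawler fits, a minimal sketch of the idea follows: request the start page, extract same-domain links, and visit each page exactly once, which is all a cache-warming pass needs. This assumes Python 3 with the requests package is available on the Windows box; the start URL is a placeholder.

    ```python
    # Minimal cache-warming crawler sketch: request every same-domain page once.
    # Assumes Python 3 with `pip install requests`; the start URL is a placeholder.
    import re
    from urllib.parse import urljoin, urldefrag, urlparse
    import requests

    def warm_cache(start_url):
        domain = urlparse(start_url).netloc
        seen, queue = set(), [start_url]
        while queue:
            url = queue.pop()
            if url in seen:
                continue
            seen.add(url)
            try:
                resp = requests.get(url, timeout=30)
            except requests.RequestException as exc:
                print(f"FAILED {url}: {exc}")
                continue
            print(resp.status_code, url)
            if "text/html" not in resp.headers.get("Content-Type", ""):
                continue
            # Crude href extraction is enough here; each page only needs one hit.
            for href in re.findall(r'href=["\'](.*?)["\']', resp.text, re.I):
                absolute = urldefrag(urljoin(url, href))[0]
                if urlparse(absolute).netloc == domain and absolute not in seen:
                    queue.append(absolute)

    if __name__ == "__main__":
        warm_cache("http://www.example.org/")
    ```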

    Read the article

  • public family tree

    - by Remus Rigo
    Hi all, does anyone know an ancestry site that allows you to create a public profile or tree, so that other visitors can see your family tree? On all the sites I have found (dynastree.com, familylink.com, ancestry.com, genebase.com), anyone who wants to see your family tree must be a member or register. Thanks

    Read the article

  • Client certificate based encryption

    - by Timo Willemsen
    I have a question about the security of a file on a webserver. I have a file on my webserver which is used by my web application. It's a bitcoin wallet: essentially a file with a private key in it, used to decrypt messages. My web application uses the file because it is needed to receive transactions made through the bitcoin network. I was looking into ways to secure it. Obviously, if someone has root access to the server, he can do the same as my application. However, I need to find a way to encrypt it. I was thinking of something like this, but I have no clue if it is actually going to work: the client logs in with some sort of client certificate; the web application creates a wallet file; the web application encrypts the file with the client certificate; whenever the application wants to access the file, it has to use the client certificate. So basically, if someone gets root access to the site, they cannot access the wallet. Is this possible, and does anyone know of an implementation of it? Are there any problems with this, and how safe would it be?
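
    The encryption half of that scheme can be done with nothing but the client's certificate, as in the hedged sketch below. It assumes Python with the cryptography package and an RSA client certificate; the file names are placeholders. The wallet is encrypted with a random symmetric key, and that key is wrapped with the certificate's public key, so only the holder of the matching private key can unwrap it. Note that this mainly protects the file at rest: the application still needs the plaintext whenever it actually spends from the wallet.

    ```python
    # Sketch: encrypt a wallet file so only the client certificate's private key
    # can recover it (hybrid encryption). Assumes `pip install cryptography`,
    # an RSA client certificate, and placeholder file names.
    from cryptography import x509
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding

    def encrypt_for_client(cert_path, wallet_path):
        cert = x509.load_pem_x509_certificate(open(cert_path, "rb").read())
        public_key = cert.public_key()      # RSA public key taken from the cert

        sym_key = Fernet.generate_key()     # random symmetric key for the file
        ciphertext = Fernet(sym_key).encrypt(open(wallet_path, "rb").read())

        # Wrap the symmetric key with the certificate's public key (RSA-OAEP).
        wrapped_key = public_key.encrypt(
            sym_key,
            padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                         algorithm=hashes.SHA256(), label=None),
        )
        with open(wallet_path + ".enc", "wb") as out:
            out.write(len(wrapped_key).to_bytes(2, "big") + wrapped_key + ciphertext)

    if __name__ == "__main__":
        encrypt_for_client("client_cert.pem", "wallet.dat")
    ```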

    Read the article

  • wget not converting links

    - by acrosman
    I am trying to mirror a fairly large site (20,000+ pages) prior to a major overhaul. Basically, I need a backup before cutting over to the new one in case we forgot something we need (we'll have about 1,000 pages at launch). The site runs on a CMS that I cannot easily extract usable data from, so I'm trying to make the copy with wget. My problem is that wget does not appear to be actually converting links, despite the presence of --convert-links or -k in the command. I've tried a couple of different combinations of flags, but I haven't been able to get the output I need. The most recent failed attempt was: nohup wget --mirror -k -l10 -PafscSnapshot --html-extension -R *calendar* -o wget.log http://www.example.org & I've also included --backup-converted, and --convert-links instead of -k (not that it should have mattered). I've done it with and without -P and -l; again, not that they should matter. The result is files that still have links like: http://www.example.org/ht/d/sp/i/17770

    Read the article

  • Multiple personalities for a blog

    - by Ralph Rickenbach
    I am using Blogger.com and maintain multiple websites. I would like to display the content of one single blog on all the sites, using URLs like blog.sitename.xxx and each site's corporate identity. The sites are rather different, but as an absolute minimum, a solution that allows site-specific CSS would suffice. Better still would be different templates per site. Any solution?

    Read the article

  • Any way to stop people from img "framing" your site?

    - by Yegor
    Someone was trying to get cute with me by "iframing" my search result page via an IMG tag with 0 width and 0 height, in hopes of killing my server resources. My searches are cached, so it doesn't do much damage, since it's just a static file being served, but I was wondering if there is anything I can do to "fight back"? I know you can use a frame breaker had it been an iframe. Is there anything to do in the case of an image?
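
    One hedged idea, not a standard recipe: modern browsers mark a request made through an <img> tag with a Sec-Fetch-Dest: image header, so the server can refuse such requests cheaply before doing any real work. The sketch below is a small Python WSGI middleware illustrating that check; older browsers do not send the header, so treat it as best-effort, and the app and port here are placeholders.

    ```python
    # Sketch: reject requests where the browser declares it is fetching an image,
    # so an <img src="search page"> embed is refused before any real work happens.
    # Best-effort only: browsers that do not send Sec-Fetch-Dest pass through.
    def block_image_embeds(app):
        def middleware(environ, start_response):
            if environ.get("HTTP_SEC_FETCH_DEST") == "image":
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"This page cannot be embedded as an image."]
            return app(environ, start_response)
        return middleware

    # Usage sketch with a trivial WSGI app standing in for the search page:
    if __name__ == "__main__":
        from wsgiref.simple_server import make_server

        def search_page(environ, start_response):
            start_response("200 OK", [("Content-Type", "text/html")])
            return [b"<html><body>search results</body></html>"]

        make_server("", 8000, block_image_embeds(search_page)).serve_forever()
    ```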

    Read the article

  • Caching Reverse-Proxy ISP Host for a Low-Bandwidth Server

    - by Casey
    I am building a webcam with an HTTP server that will be running on a low-bandwidth connection. The content on the site will change every 5 to 10 minutes. Instead of serving files directly from this connection, are there hosting companies that can act as a reverse proxy for my site? That way, if nobody is using the site, the local internet connection remains idle, and if I receive 1000 hits all at the same time, only one HTTP GET is required and the hosting company (on a fat pipe) serves the other 999 requests. This doesn't sound like a very common usage model, but I feel like it would be the optimal solution for my situation.
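
    A caching reverse proxy (for example Varnish, or nginx with proxy_cache) on a well-connected host is the usual way to get this behaviour. Purely as an illustration of the request-collapsing idea described above, here is a toy Python sketch: many concurrent clients are served from one cached copy, so the low-bandwidth origin sees at most one GET per path per refresh interval. The origin URL, port, and TTL are placeholders, and a single global lock keeps the sketch simple at the cost of serializing requests.

    ```python
    # Toy caching reverse proxy sketch: one origin fetch serves many clients.
    # The origin URL, listening port, and TTL below are placeholders.
    import threading
    import time
    import urllib.request
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    ORIGIN = "http://webcam.example.org"   # the low-bandwidth webcam server
    TTL = 300                              # content changes every ~5 minutes

    _cache = {}                            # path -> (fetched_at, content_type, body)
    _lock = threading.Lock()               # one fetch at a time; others reuse it

    class CachingProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            with _lock:
                entry = _cache.get(self.path)
                if entry is None or time.time() - entry[0] > TTL:
                    with urllib.request.urlopen(ORIGIN + self.path) as upstream:
                        entry = (time.time(),
                                 upstream.headers.get("Content-Type",
                                                      "application/octet-stream"),
                                 upstream.read())
                    _cache[self.path] = entry
            _, content_type, body = entry
            self.send_response(200)
            self.send_header("Content-Type", content_type)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        ThreadingHTTPServer(("", 8080), CachingProxy).serve_forever()
    ```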

    Read the article

  • What is an application for "web site recognition"?

    - by OSX Jedi
    This explanation isn't clear to me. Let me describe an application of web site recognition. Suppose we want to know what everyone is doing on the web at a Starbucks. We can use Wireshark or other programs to sniff all the packets. By grouping all the secondary connections with the primary one, we would get a much clearer picture of each user's primary activities. Is this talking about being able to recognize which websites each of the laptops is connecting to?
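
    Roughly, yes. As a hedged sketch of the idea (assuming Python with the scapy package and a capture file already saved with Wireshark or tcpdump; the file name is a placeholder), grouping DNS queries by the client that made them gives a per-laptop list of the sites being visited, without wading through every secondary connection the pages trigger.

    ```python
    # Sketch: list which hostnames each client on the network looked up, by
    # grouping DNS queries in a saved capture. Assumes `pip install scapy`
    # and a placeholder capture file recorded with Wireshark/tcpdump.
    from collections import defaultdict
    from scapy.all import rdpcap, DNS, DNSQR, IP

    def sites_per_client(pcap_path):
        lookups = defaultdict(set)
        for pkt in rdpcap(pcap_path):
            # qr == 0 means the packet is a DNS query rather than a response.
            if pkt.haslayer(IP) and pkt.haslayer(DNSQR) and pkt[DNS].qr == 0:
                name = pkt[DNSQR].qname.decode(errors="replace").rstrip(".")
                lookups[pkt[IP].src].add(name)
        return lookups

    if __name__ == "__main__":
        for client, names in sites_per_client("starbucks.pcap").items():
            print(client, "->", ", ".join(sorted(names)))
    ```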

    Read the article

  • Suggestions for a web hosting site (both PHP and ASP.NET)

    - by eibhrum
    Hi, I'm just wondering if there's a web hosting site that offers hosting for PHP and ASP.NET at the same time. It would be great if you could point me to a site that offers a free service; I would like to use it for testing purposes only, but I could also consider an affordable one. Comments/suggestions are welcome. Thanks.

    Read the article
