Search Results

Search found 24978 results on 1000 pages for 'publishing site'.

  • Securing internal data accessed by a website on the big, bad internet

    - by aehiilrs
    A close relative of this question on Stack Overflow: When you have a web site in your DMZ that needs to access production data stored on an internal DB, what strategies do you recommend to lower the risks that come with accessing live data? Is it even considered acceptable to have a connection initiated from the DMZ come into your network? An extra detail about the nature of the site that throws a monkey wrench into the machinery is that people using the web site will be competing for "spots" on a first-come, first-served basis with others using the internal software. Because of this, as close to zero lag between the two applications as possible is ideal.
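
    One commonly recommended pattern here is to let the DMZ host initiate the connection, but pin it down to a single internal host and port at the firewall, combined with a dedicated low-privilege DB account. A minimal sketch, assuming a Linux firewall, a DMZ web host at 10.0.1.10 and an internal DB at 10.0.2.20 on MySQL's port (all addresses hypothetical):

        # allow only the web host, only to the DB host, only on the DB port
        iptables -A FORWARD -s 10.0.1.10 -d 10.0.2.20 -p tcp --dport 3306 -j ACCEPT
        # everything else from the DMZ stays out of the internal network
        iptables -A FORWARD -s 10.0.1.10 -j DROP

    A direct, tightly scoped connection like this also avoids the latency that a store-and-forward replica would add, which matters for the first-come, first-served requirement.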

    Read the article

  • News applications: internal workings [on hold]

    - by Vijay
    How do news applications work, other than RSS-feed-based applications? I know some of them take the RSS content from the source site. But sometimes I see those applications show a title, description, date, image, video, etc., even though when I look at the original site's RSS, the image and video are not there. So how do they get that to show in their applications? Some applications even show feeds from magazine sites and newspaper sites. How do these applications work? I am creating an application which will link to different news sites' feeds, categorized (like top news, technology, games, articles, etc.). On the front page it will show the website names; on selection of a news site it will get the feed from that website and show it to the user. So I would like to know: should all the fetching of data be done on user selection, or should data be prefetched? I also want to fetch more detailed information from the original site than the RSS data provides. How should I go about it?
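
    On the metadata question: a common technique is to follow each feed item's link and scrape the article page's Open Graph tags (og:image, og:description, and so on), which most news and magazine sites publish even when their RSS is bare. A rough sketch with a hypothetical article URL:

        # pull the Open Graph image and description straight off the article page
        curl -s 'http://news.example.com/some-story' \
          | grep -oE '<meta property="og:(image|description)" content="[^"]*"'

    As for prefetching: a middle ground is to prefetch just the feed lists for the front page and fetch per-article details on selection.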

    Read the article

  • Slow domain lookup on .dev domain inside VMware

    - by skelle
    I'm developing on my MacBook, and normally I have a local webserver running, which works fine. Now I have to use a VMware image where the webserver is running. I set everything up, and my dev site is running under site.dev inside VMware. I can connect to the webserver, but EVERY request takes a very long time. I have already read that this is related to IPv6 and the way OS X handles /etc/hosts. There I added 192.168.155.42 site.dev, and I already tried this (Resolving to virtual host very slow on Mac OS X Lion), but my lookup still takes ~30 seconds on every request. What can I do to fix this issue?
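
    A commonly suggested fix is to give the name an IPv6-form entry as well, since OS X fires the IPv6 (AAAA) lookup first and waits for it to fail before falling back to the IPv4 hosts entry. A sketch:

        # /etc/hosts - answer both the IPv4 and the IPv6 lookup immediately
        192.168.155.42         site.dev
        ::ffff:192.168.155.42  site.dev

        # then flush the resolver cache so the change takes effect
        dscacheutil -flushcache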

    Read the article

  • Beginner's SEO Primer

    These SEO do's and don'ts were created for beginners. But, even if you have been publishing websites for a while, it never hurts to refresh your memory. First, you will see a little basic information, followed by my suggestions.

    Read the article

  • How to auto-update a website mirror with exceptions to certain pages?

    - by tomatosalad
    I'm currently mirroring a website on my server. The site itself is rarely updated, but it is updated often enough that info can become outdated quickly. I mirrored it first with wget, and this worked fine, but I made some changes: the original index.html used frames, but the site also provides a main.html, which is essentially index.html without frames, so I deleted index.html and renamed main.html. I did not want to mirror the webchat, blog or forum, so I deleted those files and directories, made directories "blogs", "forum" and "chat", and placed a PHP redirect in each of them, redirecting visitors to the original site. I'd like to auto-update the mirror (maybe once every 24-72 hours) but preserve the changes I made. Is this possible? How would I go about doing it? I am completely clueless as to how. Thanks for any and all help! :)
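
    This is scriptable: re-mirror into a staging directory with the unwanted sections excluded, sync it over the live copy, then re-apply the local changes from an overrides directory. A sketch with invented paths (source-site.example, /var/www/mirror, and an overrides folder holding the modified index.html and the three redirect stubs):

        #!/bin/bash
        # re-mirror, skipping the chat/blog/forum directories
        wget --mirror --no-parent -X /chat,/blog,/forum \
             -P /tmp/mirror-stage http://source-site.example/
        # replace the live copy with the fresh mirror
        rsync -a --delete /tmp/mirror-stage/source-site.example/ /var/www/mirror/
        # re-apply the local changes: frameless index plus the redirect stubs
        cp /var/www/mirror-overrides/index.html /var/www/mirror/index.html
        cp -r /var/www/mirror-overrides/{blogs,forum,chat} /var/www/mirror/

    Run it from cron every day or three (e.g. 0 3 * * * for nightly) and the mirror stays fresh while the edits survive each update.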

    Read the article

  • How do spambots work?

    - by rlb.usa
    I have a forum that's getting hit a lot by forum spambots, and of course the best way to defeat something is to know thy enemy. I'll worry about defeating those spambots later; right now I'd like to know more about them. Reading around, I was surprised by the lack of thorough information on the subject (or perhaps by my inability to input the correct search terms for better Google results). I'm interested in learning all about spambots. I've asked on other forums and gotten brush-off answers like "Spambots are always users registering on your site." How do forum spambots work? How do they find the 'new user registration' page? (I'm especially surprised because some forums don't have a dedicated URL for this, e.g. www.forum.com/register.html, but instead use query strings or other methods invisible to the URL bar.) How do they know what to enter into each 'new user registration' field? How do they determine which pages they can spam or enter data into and which they can't? Do they even 'view' the page at all? If not, then I'd assume they're communicating with the server directly; how is this possible, and how do they do it? Can forum spambots break CAPTCHAs? Can they solve logic questions (how?)? Math questions? Do they reverse-engineer client-side anti-bot validation scripts? Server-side scripts? What techniques are still valid to prevent them? Where do spambots come from? Is someone sitting behind the computer snickering as they watch their bot destroy site after site? Or are they snickering as they simply 'release' it onto the internet somehow? Are spambots run by an infected computer somewhere? Do they replicate themselves? Etc.
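
    On the "do they even view the page" question: many don't. A bot only needs the form's field names once (scraped by its author, or matched against known forum software), after which it can replay the registration POST directly, no browser involved. A hypothetical illustration of how little that takes:

        # replaying a registration form submission directly against the server
        curl -d 'username=cheapmeds&email=spam@example.com&password=x' \
             http://forum.example.com/register.php

    This is also why defenses that target the replayed request, such as honeypot fields, per-IP rate limits, and tokens tied to an actual page view, tend to hold up better than hiding the registration URL.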

    Read the article

  • Jail user to home directory while still allowing permission to create and delete files/folders

    - by Sevenupcan
    I'm trying to give a client SFTP access to the root directory of their site on my server (Ubuntu 10.10) so they can manage their website themselves. While I have been successful in jailing a user to a directory and giving them SFTP access, they are only allowed to create and delete files in subdirectories (the directories they own). This means that I must jail them to the parent of the directory that holds the root of their site. How can I limit them to the root of their site (for example public_html) while still allowing them the ability to create and delete files? All the tutorials I have read suggest that root must be the owner of the user's home directory, which prevents them from writing inside that directory. I'm relatively new to managing my own server, so I'd be very grateful for any advice. Many thanks.
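
    The standard OpenSSH answer: the chroot point itself must be root-owned and non-writable, but a directory one level inside it can belong to the user, so the usual layout jails them to their home and gives them a fully writable public_html within it. A sketch for a hypothetical user "client":

        # in /etc/ssh/sshd_config (requires Subsystem sftp internal-sftp):
        #   Match User client
        #       ChrootDirectory /home/client
        #       ForceCommand internal-sftp

        # the jail root stays root-owned and read-only to the user...
        chown root:root /home/client
        chmod 755 /home/client
        # ...while the site directory one level down is fully theirs
        mkdir -p /home/client/public_html
        chown client:client /home/client/public_html

    The client still sees the read-only top level, but inside public_html they can create and delete freely. If even that one read-only level is unacceptable, a bind mount of the site into a root-owned jail directory is the other common workaround.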

    Read the article

  • Stop Apache from asking for the SSL password on each restart

    - by acidzombie24
    Using instructions from this site, but varying them just a little, I created a CA using -newca and copied cacert.pem to my computer and imported it as a trusted issuer in IE. I then did -newreq and -sign (note: I do /full/path/CA.sh -cmd and not sh CA.sh -cmd) and moved the cert and key to Apache. I visited the site in IE and via .NET code, and it appears trusted, great (unless I write www. in front, which is expected). But every time I restart Apache I need to type in my password for the site(s?). How can I make it so I DO NOT need to type in the password?
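
    The prompt comes from the private key being encrypted, not from the cert, so the usual fix is to strip the passphrase from the copy Apache reads and protect it with file permissions instead. A sketch, assuming the key lives at /etc/apache2/ssl/server.key:

        # write a decrypted copy of the key (asks for the passphrase one last time)
        openssl rsa -in /etc/apache2/ssl/server.key -out /etc/apache2/ssl/server.key.insecure
        mv /etc/apache2/ssl/server.key.insecure /etc/apache2/ssl/server.key
        # the key is now plaintext, so make it readable by root only
        chmod 600 /etc/apache2/ssl/server.key

    The alternative, if a plaintext key is unacceptable, is mod_ssl's SSLPassPhraseDialog directive, which can fetch the passphrase from a program instead of a terminal prompt.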

    Read the article

  • Default document not working after installing SP1 on Windows 2008 R2 x64

    - by boredgeek
    We have a web site that should only be available to authorized users, so we deny anonymous access for the site. However, we do allow anonymous access to the default page and the login page. When we installed SP1, the behavior of the server changed: now, if the user tries to access the root of the site, say http://mysite.com, she is redirected to the login page rather than the default page. Is there a hotfix to bring back the previous behavior?

    Read the article

  • Slow self-hosted WordPress website

    - by Integrati Marketing
    Hi all, we have a great site which had been humming along nicely for about 5 months, and then in May it went from a page load time of 3-5 seconds to an agonising 15+ seconds! The host has been really helpful and has even shifted the site to a new, faster server. I guess, seeing as we do not have the insight or your expertise, we would ask the Serverfault community and see what this crowd of experts could recommend. Appreciate any insight, thank you. The site is here: integrati.com.au. Cheers. :)
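
    A useful first split is server time versus page weight, which curl can measure from outside. A quick sketch:

        # a large time_starttransfer points at PHP/MySQL on the server;
        # a large gap between it and time_total points at heavy assets
        curl -o /dev/null -s -w 'ttfb: %{time_starttransfer}s  total: %{time_total}s\n' \
             http://integrati.com.au/

    On WordPress, a slowdown that appears without a deploy is very often a plugin or theme update, or an external call (fonts, trackers, update checks) that has started timing out; the TTFB number helps tell those apart from hosting problems.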

    Read the article

  • PHP hosting: some info required [closed]

    - by mtk
    I have recently been given control of newly bought hosting space and a domain account. There is a technical team from the hosting site to help out with problems, but that is a long process: log a ticket, wait a long time, and I don't get the correct answer on the first shot. I was wondering if anyone has a helpful guide on how one should go about hosting a site. Any info that must be known w.r.t. cPanel? Any other useful stuff, or anything anyone could point me to? Just to give a few difficulties: the same PHP code that works well on my local machine gives a "File not found" error on the remote one, even though the file is indeed present, as I have FTP'ed all the files correctly; session_start errors are output to the HTML page with the warning "Headers already sent"; and many more technical things that work well locally but not on the actual hosting server. So, if anyone has any helpful material on this, as to what changes are required or what a programmer must be aware of from a hosting perspective, please let me know. Note: I am hosting a PHP site with a MySQL DB in a shared environment.
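
    Two of those symptoms have classic local-versus-shared-host causes worth ruling out first: "File not found" for a file that exists is frequently a filename-case mismatch (the Linux host is case-sensitive where a Windows/Mac dev box is not), and "headers already sent" before session_start() usually means stray output, often an invisible UTF-8 BOM at the top of the script. Two quick checks, with hypothetical file names:

        # list case variants of an include that "isn't found" on the server
        find . -iname 'header.php'
        # a UTF-8 BOM shows as ef bb bf before the opening <?php
        head -c 3 index.php | xxd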

    Read the article

  • Serve up syntactic XHTML5 using the text/html MIME type?

    - by cboettig
    I have a site currently written with HTML5 tags. I'd like to be able to parse the site as XML, with support for namespaces, etc., to facilitate programmatic extraction of data. Currently I have <!DOCTYPE html> and <meta charset="utf-8">, which I gather is equivalent in HTML5 to explicitly setting the content type as <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> for my current setup. In order to serve XML, it sounds like the right thing to do is:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
          "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">

    Should I also change my content type to <meta http-equiv="content-type" content="application/xhtml+xml; charset=iso-8859-1" />, or is that not necessary? What is the advantage of having the content type be "application/xhtml+xml"? What is the disadvantage? (It sounds like it may break Internet Explorer's rendering of the site, but maybe that information is out of date now?) Many thanks!
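
    One wrinkle first: a meta tag cannot actually change the served type, since the real HTTP Content-Type header wins, so the switch has to happen server-side. A common compromise is content negotiation, sending application/xhtml+xml only to clients that advertise support for it. Either way, curl shows what a given client would receive (URL hypothetical):

        # what the server sends by default
        curl -sI http://example.com/ | grep -i '^content-type'
        # what an XHTML-capable client would get under content negotiation
        curl -sI -H 'Accept: application/xhtml+xml' http://example.com/ | grep -i '^content-type'

    The trade-off in brief: application/xhtml+xml gets true XML parsing (namespaces included) but draconian error handling, and older Internet Explorer versions do not render it at all, offering a download instead, which is the breakage the asker remembers.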

    Read the article

  • Issues With IIS Hosting Two Domains From Same Folder

    - by Bob Mc
    I have two different domain names that resolve to the same ASP.Net site. Both domains are hosted on the same server, which runs Windows Server 2003 and IIS6. The sites are differentiated in IIS Manager using host headers. However, both of the sites point to the same folder on the local drive for the site's page files. I am occasionally experiencing an ASP.Net error that says "The state information is invalid for this page and might be corrupted." I'm the site developer so I've addressed all the relevant code-related causes for this issue. However, I was wondering whether having two domains/sites sharing the same folder for an ASP.Net application might be causing this intermittent error. Also, is this generally a bad practice? Should I make separate, duplicate folders for each of the domains? Seems like that can become a maintenance headache.

    Read the article

  • How do I reduce RAM usage on my server?

    - by Abs
    I have recently launched a site that is very popular, but I am having trouble with scalability. My site makes heavy use of FFmpeg, and at peak times RAM usage quickly hits the 2 GB mark and the swap file starts getting used. CPU usage starts rising too. Users complain that the site is slow, because all the FFmpeg instances run very slowly due to the number running at the same time. Users make use of FFmpeg on my server in real time. Is there anything I can consider or do to ease the load on the server and stop RAM usage shooting up? Maybe there is something better than FFmpeg (!). Is the only solution "throwing some cash" at a more powerful server? I have given little information; please ask for more, so this problem can be solved.
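
    Short of bigger hardware, the usual lever is capping how many encodes run at once, so each finishes quickly instead of all of them thrashing swap together. A sketch using sem from GNU parallel (the job count is invented; tune it to the box):

        # allow at most 4 concurrent encodes; further requests queue up
        sem --id ffmpeg -j 4 ffmpeg -i "$INPUT" "$OUTPUT"
        # and keep encodes from starving the web server in the meantime
        nice -n 10 ionice -c 2 -n 7 ffmpeg -i "$INPUT" "$OUTPUT"

    Queueing users for a few seconds is almost always a better experience than every job crawling once the box hits swap.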

    Read the article

  • Certain websites do not open; instead a 1x1 image gets displayed

    - by Ranjith - SR2GF
    When I try to visit certain websites like www.bidvertiser.com or www.buysellads.com, a white page shows up and the title bar displays the site name followed by (1x1) in brackets. When I right-click, the 'View Source' option appears disabled, and the Save As... option shows the file type to be GIF. However, when I preview the site in Google search results (by moving the mouse over it), the screenshot of the site displays fine. This happens in all three browsers on my computer: Chrome, Firefox and IE. What is the problem, and how can it be resolved? EDIT: At some point in time, these sites probably worked on my computer! I think it is a more general problem; the same happens when I click on certain links in Google search results.
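
    Both of the named sites are ad networks, and a bare 1x1 GIF in place of a page is the signature of ad blocking at the hosts-file or DNS level, where blocklists map ad domains to a host that serves a tracking pixel; that would also fit it affecting all three browsers at once. A quick check, sketched:

        # an unexpected answer (127.0.0.1 or an odd host) means something local
        # or on the router is rewriting the name
        nslookup www.bidvertiser.com
        # on Windows the hosts file lives under System32\drivers\etc
        grep -i bidvertiser /etc/hosts

    If the hosts file is clean, the router's DNS or an installed "web shield" style tool is the next suspect.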

    Read the article

  • Setting up MySQL Replication for High Availability

    PACKT Publishing: "MySQL Replication has been supported in MySQL for a very long time and is an extremely flexible and powerful technology. Depending on the configuration, you can replicate all databases, selected databases, or even selected tables within a database."
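
    The moving parts are small: the master needs binary logging and a server ID, and the slave needs its own ID plus a CHANGE MASTER TO pointing at the master. A minimal MySQL 5.x-era sketch (host and credentials invented; the log coordinates come from SHOW MASTER STATUS on the master):

        # master my.cnf: [mysqld] server-id=1, log-bin=mysql-bin
        # slave  my.cnf: [mysqld] server-id=2
        # on the master, create the replication account:
        mysql -e "GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' IDENTIFIED BY 'secret';"
        # on the slave, point it at the master and start replicating:
        mysql -e "CHANGE MASTER TO MASTER_HOST='master.example.com',
                  MASTER_USER='repl', MASTER_PASSWORD='secret',
                  MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=4;
                  START SLAVE;"

    Replicating only selected databases or tables is then a matter of replicate-do-db / replicate-do-table options on the slave.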

    Read the article

  • Problem with accessing a PHP page

    - by EquinoX
    So I have an info.php page located in the folder /var/www/nginx-default. However, when I go to my IP address/info.php, it always redirects me to this site: http://www.iana.org/domains/example/. Is this because I have a virtual host that I called example? Here is my config for the example website:

        server {
            listen 80;
            server_name www.example.com;
            rewrite ^/(.*) http://example.com/$1 permanent;
        }
        server {
            listen 80;
            server_name example.com;
            access_log /var/www/example.com/logs/access.log;
            error_log /var/www/example.com/logs/error.log;
            location / {
                root /var/www/example.com/public/;
                index index.html;
            }
        }

    The way I access this site is by changing /etc/hosts on my MacBook so that example.com is mapped to my server's IP address. However, now when I go to xxx.xxx.xxx.xxx/info.php, it redirects me to the site I posted above.
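
    Likely what's happening: nginx hands any request whose Host matches no server_name to the first server block, so a bare-IP request lands in the www.example.com block and gets rewritten to http://example.com/, which on a machine without the hosts override resolves to the real (IANA-owned) example.com. A sketch of the usual fix, adding an explicit catch-all (file path assumed; adjust for the local layout):

        cat > /etc/nginx/conf.d/default.conf <<'EOF'
        server {
            listen 80 default_server;   # takes requests that match no server_name
            server_name _;
            root /var/www/nginx-default;
            index index.html;
        }
        EOF
        nginx -t && nginx -s reload

    Note that actually executing info.php still needs a PHP (fastcgi) location in that block; the catch-all only stops the stray redirect.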

    Read the article

  • Setting a mapped drive in virtual hosts causes Apache not to start

    - by darksoulsong
    I'm trying to set up a virtual host on my Windows 7 machine. The folder I want to point to is located on a CentOS machine, and the folder path is Z:\Websites\Online\MyClient\Site. But something strange happens when I set the document root like this:

        DocumentRoot "Z:\Websites\Online\MyClient\Site"

    Apache does not restart after that. When I take a look at the log, there is an error pointing to the line where I added the path to the folder: Syntax error on line 48 of C:/Program Files/Zend/Apache2/conf/extra/httpd-vhosts.conf: DocumentRoot must be a directory. There must be a way to make it work like this, by setting up an Apache installation on one machine and pointing it to a folder located on another computer, right? My hosts file is set like this: 172.17.10.1\Data\Websites\Online\MyClient\Site MyClient.local. ANY HELP would be VERY appreciated.
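
    Offered as an assumption about the cause: when Apache runs as a Windows service it runs under an account that does not see per-user mapped drive letters, so Z: simply does not exist from Apache's point of view. Referencing the share by UNC path (Apache accepts forward slashes) is the usual workaround, provided the service account has rights on the share. A one-line sketch using the share implied by the hosts-file entry above:

        DocumentRoot "//172.17.10.1/Data/Websites/Online/MyClient/Site"

    If the service runs as LocalSystem it has no network identity at all, so it may also need to run under a named account that the CentOS share accepts.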

    Read the article

  • What constitutes "commercial purposes"?

    - by RoboShop
    I'm looking at this license. It says that I can use it for "non-commercial purposes". What does that mean? On Stack Exchange, under the network profile, there is a graph that tracks your points across your Stack Exchange accounts. It uses a control called HighCharts, which has a paid version and a Creative Commons licensed version. So would Stack Overflow constitute a commercial site? We don't pay to use the site, but obviously it makes money from ads, etc. Then again, there are a lot of sites with ads that won't necessarily make a profit; the ads may only be subsidising their costs. But even then, you could argue that even if the ads are only subsidising costs, a lot of IT companies run at a loss in order to build a big enough customer base. So where is the line here? Is it any website on the internet? Any website that has ads? Any website that turns a profit?

    Read the article

  • Remote additional domain controllers

    - by user125248
    Is it possible to set up several additional domain controllers (ADCs) at remote locations, connected via medium-bandwidth DSL (2-10 Mbit) WAN connections, for a single domain (intranet.example.com)? And would it be a good idea? We have five sites and would like extremely high availability if any of the sites were to lose its Internet connection. However, each site is very small, and all are within a fairly small geographical area in the same region, so it would seem strange to have a PDC for each of the sites. If it were possible to have an ADC for each site, would the clients use the ADC, or just use the PDC whenever it's available to them?

    Read the article

  • sites now not responding on port 80 [closed]

    - by JohnMerlino
    Possible Duplicate: unable to connect site to different port. I was trying to resolve an issue with getting a site running on a different port (unable to connect site to different port), but somehow it took out all my other sites. Now even the ones that were responding on port 80 are no longer responding, even though I did not touch their virtual hosts. I get this message now: "Oops! Google Chrome could not connect to mysite.com". However, ping responds:

        ping mysite.com
        PING mysite.com (64.135.12.134): 56 data bytes
        64 bytes from 64.135.12.134: icmp_seq=0 ttl=49 time=20.839 ms
        64 bytes from 64.135.12.134: icmp_seq=1 ttl=49 time=20.489 ms

    The result of telnet:

        $ telnet guarddoggps.com 80
        Trying 64.135.12.134...
        telnet: connect to address 64.135.12.134: Connection refused
        telnet: Unable to connect to remote host
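
    "Connection refused" with ping still answering means the machine is up but nothing is listening on port 80 anymore, which usually means the web server failed to start after the port experiment. First checks, assuming Apache on Linux:

        # is any process bound to port 80 at all?
        sudo netstat -tlnp | grep ':80'
        # does the configuration still parse after the edits?
        apachectl configtest
        # the error log normally states exactly why startup failed
        sudo tail -n 50 /var/log/apache2/error.log

    A dangling Listen directive or a vhost bound to the experimental port is the usual culprit in this situation.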

    Read the article

  • Windows Server: Do I really need servers in remote locations?

    - by IMAbev
    I have one main site with several servers in a 2008/2012 environment. I have 4 remote sites that are physically close (a few miles apart), all connected to the main site by 20 Mbit fiber on a private network. At each of the remote locations I have a Windows server that users log in to and where their files and apps are located. There are many considerations in answering this question, but the first thing I am wondering is: do I really need a server at each location? Users are just logging in to this server for permissions, and the vast majority of my users are only using Word, Excel and email. I am really interested in figuring out whether I need servers at these locations: $3,000 to $4,000 per server every 3-5 years, licensing, administration... I know there are other considerations, like speed and redundancy, and that if my link to the main site goes down the users have nothing. But I am just not convinced I need servers at these locations.

    Read the article

  • Public domain usage of imagery from films? [on hold]

    - by AdamJones
    I'm thinking of starting a small film site, which would begin as a simple blog. Imagery from films I discuss on the site would be vital to its look and feel. Instantly, though, this makes me wonder about copyright/public domain rights for such imagery. I just wondered if anyone had general or specific advice about using imagery from this industry, or another similar situation? On the one hand, I know the film industry aggressively tries to protect its IP (fair enough), but on the other hand, surely film companies do release some imagery of their films in stills format into the public domain simply to help their distribution and advertising efforts? I have tried looking in stock photo galleries for film stills but only found moviestillsdb.com, which seemed very limited in its results. I've also researched a bit about fair use (http://fairuse.stanford.edu/overview/fair-use/), which I know applies to the USA specifically; this seems to suggest that a still from a film is within those bounds. Still, any constructive advice others may have from experience dealing with imagery, from film or another domain, would be greatly appreciated, assuming it isn't "get a lawyer".

    Read the article
