Search Results


  • An international mobile app - Should I set up EC2 instances in multiple regions?

    - by ashiina
    I am currently trying to launch a mobile app for users around the world. It is not a spectacular launch that will get millions of users in weeks - just another individual developer releasing an app. I know enough about the techniques for managing timezones, internationalizing strings, and whatnot (the application layer). But I cannot find any information on how I should manage my EC2 instances... Should I be setting up EC2 instances in different regions around the world? Is that a must-do, or is it overkill? I'm aware that it's the ideal solution in terms of performance, but it becomes very tough to manage servers in multiple regions. DB issues, AMI management, etc... I'd much rather NOT do so. So I would like to know the general best practice when launching an international app/website. Note: for static content, I know it's better to use a CDN, so I'm planning on doing so.
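    A hedged aside on the CDN plan: CloudFront can front a single-region origin and serve cached content from edge locations worldwide, which often postpones the multi-region question entirely. A minimal sketch, assuming the AWS CLI is configured and that app-origin.example.com is a placeholder for the origin host:

        # Put CloudFront in front of one EC2-hosted origin; static assets are
        # then served from the edge location nearest each user.
        aws cloudfront create-distribution \
            --origin-domain-name app-origin.example.com

    For the dynamic tier, one well-placed region plus this kind of edge caching is a common starting point; latency-based DNS across regions can come later if measurements demand it.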


  • How to publish a C# project online

    - by hero
    I have made a simple C# website that has a lot of pics (3000+). I want to publish the full website online. All I know is that I have to first buy a domain name, and then buy some space on a web hosting server. Could anybody guide me? How much does a domain name usually cost? And how should I upload the website to a hosting server? I mean, should I upload the full code?


  • Best program to conduct a presentation for investors

    - by the_drow
    Hello, I need a program that meets these specific requirements:
    - Can load Word/PowerPoint files and let the presenter control them
    - Allows voice/video to be transmitted to all participants
    - Is free (recommended but not required) or very cheap
    - Is easy to install
    Anyone got recommendations for me?


  • Problem with the hosts file under Windows 7

    - by martani_net
    I updated some entries in the hosts file (in "C:\WINDOWS\System32\drivers\etc") to make google.com, for example, point to 127.0.0.1:

        # Additionally, comments (such as these) may be inserted on individual
        # lines or following the machine name denoted by a '#' symbol.
        #
        # For example:
        #
        #      102.54.94.97     rhino.acme.com          # source server
        #       38.25.63.10     x.acme.com              # x client host

        127.0.0.1       localhost
        ::1             localhost
        127.0.0.1       google.com

    This works fine under Windows Vista, but not under Windows 7. When I type google.com, it goes directly to Google's website. For info, I am not using a proxy server. I think there are some temporary DNS settings that must be flushed, but I don't know how. Does anyone know how to fix this? Thank you.
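    A hedged note on the flushing part: on Windows the resolver cache can be cleared from an elevated command prompt, and a common Windows 7 gotcha is the hosts file getting saved as hosts.txt, or to another location, when the editor was not started with administrator rights. A minimal sketch:

        :: Run in a command prompt opened with "Run as administrator".
        :: Flush the DNS resolver cache so new hosts entries take effect.
        ipconfig /flushdns

        :: Check which address google.com resolves to now.
        ping google.com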


  • Better URLs for this internal web server?

    - by sprugman
    I've got a server that I have admin access to, but don't fully manage. (I think it's a virtual machine, but I'm not 100% sure. It's running Apache on Windows Server 2003.) I share the IP with another user, so my sites all have to use the :8080 port. This is kind of ugly. Also, AFAIK, the only access I have is through an IP address. (I'm inside a corporate firewall and don't think I have access to a DNS server or anything.) I've adjusted my hosts file so I don't have to use the IP address on my local machine, but that's not a very generic solution. Are there any options to 1) get rid of the port requirement and 2) be able to use a name (maybe a machine name) instead of the IP address in a generic way? (I'm not really a network admin - I'm a developer managing this machine. The IT folks who really manage it are a few people away from me and tough to get to do anything, so I'm looking for a lightweight solution if possible.)
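    With no DNS control and the other user holding port 80 on the shared IP, the options are limited; one hedged, lightweight sketch is to standardise a hosts entry and hand it to the few people who need the site (the name and address below are placeholders):

        # Append to C:\Windows\System32\drivers\etc\hosts on each client:
        # maps a friendly name to the shared IP; the :8080 port still applies.
        10.0.5.42    mydevserver

    Dropping the port itself needs the site to answer on 80, which on a shared IP means asking IT for a second IP address or a reverse proxy in front of both sites.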


  • How do I restrict access to certain web files/folders on an IIS 7.5 based web server?

    - by cpuguru
    We're moving a website that was previously hosted on Win2k3 & IIS 6 to a Win2k8 R2 & IIS 7.5 platform. The website is public, but we want to restrict anonymous access to certain files and folders such that the user would be prompted for a password to access them. If this were Apache, a simple .htaccess file would serve the purpose. However, since it's IIS 7.5 and we're serving up mainly static HTML files and a few classic ASP pages, I'm in a bit of a quandary as to how to restrict access to individual files and folders for various committees, such that attempts to reach committee_1's files and/or folders would prompt the user for a password and, if entered correctly, would serve up their files. Same thing for committee_2, and so on. Under IIS 6, we would take away the read privileges for IIS_IUSRS and create a user called "committee_1" with a password known by the group, and give that user read privileges to the files/folders. There's got to be a better (and more secure) way. Reminder: these are not *.aspx pages that are being served up. Any suggestions on how to password-protect key files and/or folders under IIS 7.5 are much appreciated.
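    One hedged sketch, assuming the IIS URL Authorization and Basic Authentication features are installed and a Windows group named committee_1 exists: disable anonymous authentication on the committee folder, enable Basic authentication there, and drop a web.config like this into it (URL Authorization applies to static files too, unlike the ASP.NET-only authorization element):

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <security>
              <authorization>
                <!-- Remove the inherited allow-everyone rule, deny anonymous
                     users, then allow only the committee's group. -->
                <remove users="*" roles="" verbs="" />
                <add accessType="Deny" users="?" />
                <add accessType="Allow" roles="committee_1" />
              </authorization>
            </security>
          </system.webServer>
        </configuration>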


  • Does searching a keyword on Google make the crawlers look harder in the future?

    - by Foo Bar
    Do the search requests made by users influence how strongly the Google crawlers are attracted to a keyword? Let's say Google has some hits for a specific keyword in the search index, and now I search for exactly this keyword. Will the Google crawlers react to the search and look more intensely for pages that could match this keyword? A reason why this could be important: privacy when searching for yourself. Assume you just want to know how much Google (and thus other people) can find out about you. If any additional (statistical) search for your name nudged the crawlers to dig even deeper, it would have the negative effect that you would actually become easier to find in the future, even though your intention was only to find out how little Google knows about you. It's a bit like the dilemma in quantum mechanics: does observing the system automatically change the system?


  • Reading log files from a web application

    - by Egorinsk
    Hi! I want to write a small PHP application for monitoring logs on a Debian server, including syslog logs and Apache/PHP messages. The problem here is that the Apache user (www-data) has no access to the /var/log directory. What would be the best way to grant the PHP application access to the logs? Let's assume that the log files can be really large, like hundreds of megabytes. I have some ideas:
    - Write a shell script that is run via sudo and tails the last 512 KB of the log into a separate file that the application can read - ineffective, because it forks a new process and the data gets read twice
    - Add www-data to the adm group (which can read logs) - insecure
    - Start a PHP process via cron every minute to read the logs - not very good, because it doesn't allow real-time monitoring; also, the script would run even when I'm not reading logs and consume CPU time (the server is in the cloud, and I'll have to pay for it)
    - Create a hardlink to all log files with lowered permissions - I guess that won't work, because logrotate could recreate the log files, and their inode numbers would change
    - Start a separate nginx/Apache server under a privileged user that may read logs
    Maybe anyone got a better solution?
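    A hedged refinement of the first idea that avoids the intermediate file: whitelist one exact read-only command in sudoers, so the PHP process can stream a log tail on demand without general root access (the paths and line count are illustrative assumptions):

        # /etc/sudoers.d/log-viewer (edit via: visudo -f /etc/sudoers.d/log-viewer)
        # www-data may run exactly this command, and nothing else, as root.
        www-data ALL=(root) NOPASSWD: /usr/bin/tail -n 500 /var/log/syslog

        # What the PHP side would invoke (e.g. via shell_exec):
        sudo /usr/bin/tail -n 500 /var/log/syslog

    The argument list in the sudoers entry is matched literally, which is what keeps this from becoming a generic read-any-file hole.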


  • EC2 spot instance for a daily processing task

    - by chaft
    I don't have much experience as a sysadmin or with Amazon AWS, so I hope someone can explain in simple terms or refer me to a good guide on how to achieve the below. I have a system running on EC2 and Amazon RDS, getting data in and saving it to the DB. I need to run a script once a day (at the end of the day) to process all that data and prepare a daily report. This process will take approximately an hour to run, and it needs to run on a high-memory instance. From what I've read so far, I guess the best way to do it is to have a high-memory spot instance run every day, set up to execute the script on startup and shut down when done. Is that the right way to do it? If so, how? How do I tell the spot instance to run every day - through a cron job on the other server, or is there a better way? How do I set it up to run the script on startup - through cloud-init? Any help would be appreciated. One last thing: the job is not very time-sensitive, as long as it runs every day. Thanks.
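    A hedged sketch of the cron-driven variant, assuming the AWS CLI is configured on the always-on server and a launch specification file already names an AMI whose user data runs the report and then powers the machine off (the price, path, and schedule below are placeholders):

        # crontab on the small always-on server: bid for the big box at 23:30 daily.
        30 23 * * * aws ec2 request-spot-instances --spot-price "0.10" --launch-specification file:///home/admin/report-spec.json

    If the instance's shutdown behaviour is set to terminate, a plain "shutdown -h now" at the end of the report script is enough to stop the billing.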


  • Show symbolic links AND their targets in web directory listing (Apache)

    - by Erwan Queffélec
    Listing a directory's content with ls -l shows this output:

        total 12
        drwxr-xr-x 3 root root 4096 Dec 11 16:38 2.3
        drwxr-xr-x 5 root root 4096 Dec 11 16:38 2.4
        drwxr-xr-x 2 root root 4096 Dec 11 16:38 archive
        lrwxrwxrwx 1 root root   10 Dec 11 16:38 current -> 2.4/2.4.1/
        lrwxrwxrwx 1 root root   10 Dec 11 16:38 next -> 2.4/2.4.2/
        lrwxrwxrwx 1 root root   10 Dec 11 16:38 previous -> 2.4/2.4.0/

    Notice how it shows the symbolic links and their respective targets. I need to know if there is a way to get the same behaviour in Apache directory browsing. If Apache is not capable of it, as I suspect, is there an application (FLOSS) that provides that kind of behaviour?
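    As far as I can tell, mod_autoindex has no option to display link targets, so one hedged workaround (an assumption, not a stock Apache feature) is to regenerate the listing header from ls itself; mod_autoindex prepends a file named HEADER.html to its listing by default:

        # Cron sketch: every 5 minutes, dump the ls -l view (symlink targets
        # included) into the HEADER.html shown above Apache's own listing.
        */5 * * * * (echo '<pre>'; ls -l /var/www/releases; echo '</pre>') > /var/www/releases/HEADER.html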


  • Debian Squeeze - Monitor outgoing traffic

    - by Sam W.
    I have a small web server running Lighttpd 1.4 that has used a steady 250GB or less of bandwidth for the past couple of months. But since May the traffic has spiked to more than triple what it was, and nothing special on my site would make it spike like that. When I checked with vnstat, I found that 70% of the bandwidth is tx. I suspect I've been hacked and my web server is becoming some sort of bot. ClamAV comes up with nothing, and I already replaced the Joomla installation with a fresh one early in June, but the traffic has stayed the same. My question: how can I monitor my server and see what is transmitting all that data out? Something needs to be done to pinpoint the culprit. Can someone please point me the right way to solve this? Thank you.
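    A hedged starting point for the "what is transmitting" question, assuming stock Debian packages (iftop and nethogs are both in the Squeeze repositories):

        # Live per-connection view of traffic on the public interface.
        iftop -i eth0

        # Per-process view - ties the outbound traffic to an actual PID.
        nethogs eth0

        # Snapshot of established connections and the programs owning them.
        netstat -tunp | grep ESTABLISHED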


  • How should I configure my Apache hosts file to serve a different site for localhost than for my domain/public IP?

    - by rofls
    I'm trying to test out a LAMP (with PHP5 specifically) setup with Django already serving a website. I want to do the PHP stuff on localhost for now, so that when I do something like curl http://localhost/database/script.php?var=1, I get a response from the PHP server. Right now I'm getting a Django error. I tried something like this in the default file in sites-available:

        Listen 80
        <VirtualHost aaa.bbb.ccc.ddd>
            ServerName localhost
            DocumentRoot /home/phpsite
        </VirtualHost>

    where aaa.bbb.ccc.ddd is the local IP address, and changing my actual site's settings to specify the public IP, like this:

        Listen 80
        <VirtualHost www.xxx.yyy.zzz>
            ServerName mysite.com
            DocumentRoot /srv/www/mysite
            WSGIScriptAlias / /srv/www/mysite.wsgi
        </VirtualHost>

    but then I start getting all kinds of errors when I start Apache, such as port ::[80] is already in use. I noticed that the hosts file in /etc/apache2/ is apparently pointing everything to mysite.com, including my local IP as well as 127.0.0.1 and 127.0.1.1. Do I need to change the configuration there too?
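    A hedged reading of the error: Listen 80 appears in both site files, and Apache will not bind the same port twice. A sketch of the usual name-based layout (Apache 2.2 era, with the port declared once in ports.conf; file names here are placeholders):

        # ports.conf - declared once, not per site file
        NameVirtualHost *:80
        Listen 80

        # sites-available/php-localhost: matched when the Host header is localhost
        <VirtualHost *:80>
            ServerName localhost
            DocumentRoot /home/phpsite
        </VirtualHost>

        # sites-available/mysite: requests for mysite.com go to Django
        <VirtualHost *:80>
            ServerName mysite.com
            DocumentRoot /srv/www/mysite
            WSGIScriptAlias / /srv/www/mysite.wsgi
        </VirtualHost>

    With name-based matching, the first vhost loaded also serves bare-IP requests, so the order in which the sites are enabled matters.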


  • How much does it cost to run your own hosting server?

    - by Mirage
    I currently have a VPS at my company, and there I host about 20 websites. My company wants to set up a server locally where they can host all the websites rather than using a third-party VPS. What would it cost, e.g. for:
    - upload/download bandwidth from the data centre
    - a cPanel licence
    - IP registration
    - hardware
    - backups, electricity backup
    - any other costs, etc.
    I would prefer CentOS.


  • Logs show lots of user attempts from unknown IP

    - by rodling
    I lost access to my instance, which I host on AWS - key-pair authentication stopped working. I detached the volume, attached it to a new instance, and what I found in the logs was a long list of:

        Nov  6 20:15:32 domU-12-31-39-01-7E-8A sshd[4925]: Invalid user cyrus from 210.193.52.113
        Nov  6 20:15:32 domU-12-31-39-01-7E-8A sshd[4925]: input_userauth_request: invalid user cyrus [preauth]
        Nov  6 20:15:33 domU-12-31-39-01-7E-8A sshd[4925]: Received disconnect from 210.193.52.113: 11: Bye Bye [preauth]

    where "cyrus" is replaced by hundreds, if not thousands, of common names and words. What could this be? A brute-force attack or something else malicious? I traced the IP to Singapore, and I have no connection to Singapore. My thought is that this was a DoS attack, since I lost access and the server seemed to stop working. I'm not too versed in this, but ideas and solutions for this issue are welcome.
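    For what it's worth, that log pattern is a classic SSH dictionary attack. A hedged hardening sketch - standard OpenSSH options plus the fail2ban package, nothing instance-specific:

        # /etc/ssh/sshd_config: keys only, and no root guessing surface
        PasswordAuthentication no
        PermitRootLogin no

        # then ban repeat offenders automatically, and reload sshd
        sudo apt-get install fail2ban    # or yum install fail2ban, per distro
        sudo service sshd restart        # "ssh" on Debian/Ubuntu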


  • Uptime concerns in case of AWS outage

    - by Aditya Patawari
    I am running an Elastic Load Balancer backed by 2 instances in different Availability Zones in US East, and I am using Multi-AZ RDS as well. Ideally this should ensure that if one AZ goes down, it does not affect the app, because everything is spread across multiple AZs. But the recent AWS outage took the app down for a long time, and I am not sure how this could happen. It would be great if someone could point out what went wrong. The major question I have is: how can I avoid this in future? I can set up app servers across different regions, or even different providers, and use DNS for load balancing, but what do I do with MySQL? Read replicas will introduce some lag, which I would want to avoid.
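    On the MySQL point, one hedged sketch: RDS can build a cross-region read replica, which at least keeps a warm copy of the data outside the affected region (the identifiers, account number, and regions below are placeholders):

        # Replicate the US East database into another region.
        aws rds create-db-instance-read-replica \
            --db-instance-identifier myapp-replica-west \
            --source-db-instance-identifier arn:aws:rds:us-east-1:123456789012:db:myapp-db \
            --region us-west-2

    The lag concern still stands - this bounds data loss rather than eliminating it, and promoting the replica to master during an outage remains a manual failover decision.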


  • Port 80 not accessible on Amazon EC2

    - by Jasper
    I have started an Amazon EC2 instance (Linux Red Hat), and Apache as well. But when I try http://MyPublicHostName, I get no response. I have ensured that my security group allows access to port 80, and I can reach port 22 for sure, as I am logged into the instance via SSH. Within the Amazon EC2 Linux instance, when I do:

        $ wget http://localhost

    I do get a response, which confirms that Apache and port 80 are indeed running fine. Since Amazon starts instances in a VPC, do I have to do anything there? In fact I cannot even ping the instance, although I can SSH to it! Any advice? EDIT: Note that I had edited the /etc/hosts file earlier to make the 389-ds (LDAP) installation work. My /etc/hosts file looks like this (IP addresses shown as w.x.y.z):

        127.0.0.1   localhost.localdomain localhost
        w.x.y.z     ip-w-x-y-z.us-west-1.compute.internal
        w.x.y.z     ip-w-x-y-z.localdomain
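    A hedged checklist from the instance side, since the security group already passes 80 (stock Red Hat tooling; ping failing is normal unless the group also allows ICMP echo):

        # Is httpd bound to 0.0.0.0:80, or only to 127.0.0.1?
        sudo netstat -tlnp | grep :80

        # Red Hat images often ship iptables rules that drop inbound 80.
        sudo iptables -L -n

        # Open the port to test (not persistent across a rule reload).
        sudo iptables -I INPUT -p tcp --dport 80 -j ACCEPT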


  • Are my web server permissions for uploading correct?

    - by user1699176
    I'm on Debian, and I have my website in the directory /srv/www/mysite.com/public_html. I set the owner www-data:www-data on /srv/www. I have root disabled and created a sudo user, which has id 1000:1000. I would also like to use this user to upload to /srv/www, so I added my sudo user to the www-data group. I originally got a message saying that I didn't have permission to upload a file to that directory. After playing around with multiple permissions for a while, I was finally able to upload properly, but I'm not sure this setup is correct. I'm hesitant to change it for now, since it actually works, so I thought I'd ask for advice. I think what I ended up doing was this:

        sudo chown -R www-data:www-data /srv/www
        sudo chmod g+s /srv/www
        sudo usermod -aG www-data myuser
        sudo chgrp -R www-data /srv/www
        sudo chmod -R g+w /srv/www

    When I was finally able to successfully upload a file (with FileZilla), it showed the owner as myuser myuser. Shouldn't it have been www-data myuser? My question is whether this is correct and whether there are any potential security issues. For example, I wasn't sure if I was actually supposed to use "myuser" to own the /srv/www directory instead:

        sudo chown -R myuser:myuser /srv/www

    or maybe:

        sudo chown -R www-data:myuser /srv/www

    If you need more info, let me know. Thanks.
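    A hedged explanation of the myuser myuser result: the setgid bit (g+s) never changes the owner of new files, only their group, and it was applied to /srv/www itself rather than recursively, so files created deeper in the tree fall back to the uploader's primary group. One common sketch for this layout, assuming Apache only needs to read the site:

        # User owns the tree; the server's group can read it; setgid on every
        # directory makes new files inherit the www-data group automatically.
        sudo chown -R myuser:www-data /srv/www
        sudo find /srv/www -type d -exec chmod 2750 {} \;
        sudo find /srv/www -type f -exec chmod 640 {} \;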


  • Hosting a website on my own server - what are the hardware requirements and costs? [closed]

    - by KuKu
    As I am about to finish my dream website, I need to host it on some server. I checked the Amazon AWS free tier and found it pretty complex. As I made the full website in Java (JSP + Servlet + MySQL + Node.js), it is expensive to host, so I thought: why not host it on my own server, so that I depend only on my own resources, not on anyone else? And I know that in future I would need to pay more and more to a hosting company (because of image uploads and the number of incoming users). So my question is: at the initial stage, what kind of hardware will I need, and what would that hardware cost? I already have a 12 Mbps broadband connection - will it be sufficient? It has a static IP address as well.


  • Deny IIS6 web requests based on URL parameters?

    - by user21146
    I've got a legacy app running a third-party ecommerce system under IIS6. Some spammers recently discovered a bad security vulnerability in one of the store's forms, which is allowing them to send arbitrary emails from our system. Unfortunately, this store "feature" is built into the default.aspx page's code-behind, and I have no way to disable it without shutting down the store. How can I filter out URL requests with a given querystring parameter? I.e., I want to filter out requests to:

        http://www.mysite.com/store/?id=SendSpam

    based on the "SendSpam" string.
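    IIS6 has no built-in query-string filtering, so a hedged sketch using a third-party rewrite filter: ISAPI_Rewrite 3 understands Apache mod_rewrite syntax, and a rule like the following could reject the request outright (the parameter value comes from the question; the rule itself is an assumption about your URL layout):

        # Return 403 Forbidden for any /store/ request carrying id=SendSpam.
        RewriteEngine on
        RewriteCond %{QUERY_STRING} (^|&)id=SendSpam($|&) [NC]
        RewriteRule ^store/ - [F]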


  • How many visitors could my server handle?

    - by coolboycsaba
    I have a website, and I want to host it on my own computer, but I'm wondering if the machine is good enough. The website checks whether the user is logged in and then displays 15 items (title, description) from a MySQL database, along with each item's rating (stored in another database), its comments (another database), and an image. It also displays some stats (number of items, comments). My specs are:
    - AMD Athlon 64 X2 Dual Core Processor 5600+, 2.90 GHz
    - RAM: 4.00 GB
    - Windows 7 64-bit
    So what do you think - how many visitors and items could it handle (at once, or daily)? My internet connection is good: around 7-10 Mb upload and the same download speed.


  • How to fight back against attacks on my web service

    - by user12145
    My Apache web service is getting a large quantity of requests over the course of days, each one with a somewhat random login, trying to gain access. I identified about 60 such IPs (a few samples below), all belonging to Google. Is there a way to find more information about the origin of the attacker, or should I just block these IPs? Secondly, should I block the identified IPs' subnets (74.125.46.*) as a preventive measure?

        72.14.194.65
        64.233.172.20
        74.125.75.19
        72.14.194.33
        74.125.46.87
        74.125.44.91
        74.125.46.91
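    A hedged sketch for both halves of the question: whois shows who actually announces the addresses (ranges owned by Google often indicate traffic proxied through a Google service, such as Translate, rather than Google itself), and iptables can drop a whole subnet if blocking is the decision:

        # Who owns this source address, and what do they call the range?
        whois 74.125.46.87

        # Drop an entire /24 (this also blocks legitimate traffic from it).
        sudo iptables -A INPUT -s 74.125.46.0/24 -j DROP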


  • No internet through a web browser but Skype still works

    - by kim3er
    We get our office broadband through a BT Business Broadband WiFi router, and we have a mixture of Macs and Windows PCs/laptops connecting to it at any one time. All devices can connect to the wireless signal at pretty much full strength. However, only two computers (one Windows, one OS X) can consistently reach the internet. The other three (one Windows, two OS X), while they can always connect to the WiFi, exhibit one of three behaviours:
    1. No internet at all.
    2. Programs like Skype work, but no internet through a browser.
    3. The internet works, but with intermittent lag when switching between different sites - I'm assuming while trying to resolve different addresses.
    Ignoring point 1 for the moment, my gut is telling me DNS. It is an up-to-20MB line that usually gets between 13MB and 15MB downstream, and the router is capable of dealing with the number of wireless devices we're throwing at it. Has anyone got suggestions for how I might further diagnose this problem (preferably in OS X)? Rich
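    A hedged OS X starting point for the DNS theory: compare the resolvers the affected Mac picked up from DHCP against a public one used as a control (8.8.8.8 is Google's public resolver; both tools ship with OS X):

        # Which DNS servers has this Mac been handed?
        scutil --dns | grep nameserver

        # Does the configured resolver answer? Does a public one?
        dig google.com                  # uses the configured resolver
        dig @8.8.8.8 google.com         # bypasses it

    If the second query works where the first stalls, the router's DNS proxy is the suspect - which would also square with Skype's peer-to-peer traffic still flowing while browsers fail.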

