Search Results

Search found 3025 results on 121 pages for 'amazon ec2'.

Page 41/121

  • Which do I select - Windows Azure or Amazon EC2 - for hosting unmanaged C++ code?

    - by sharptooth
    We have a server solution written entirely in unmanaged Visual C++. It contains complicated methods for really heavy data processing. The whole thing runs to millions of lines of code, so rewriting it all in some other language is not an option. We could write some extra code or make isolated changes, but rewriting everything is out of the question. Now we'd like to put it in the cloud. Which platform do we choose - Amazon EC2 or Windows Azure - and why?

    Read the article

  • Replacing DropBox with: Amazon S3 + SSL + GPG/TrueCrypt + Mounting on OSX ??

    - by Matt Rogish
    So, right now we're using DropBox to share various data files around between approximately 10 Mac OS X systems. However, we already have an S3 account, and putting everyone on the lowest DropBox plan at $10/mo seems too expensive. We'd also like to avoid any kind of local storage (sharing a disk on a desktop or something), since we're a geographically distributed team. So, I am contemplating something that would allow us to replace DropBox with our own home-grown solution. We are all fairly technical people and/or smart enough to follow some steps, so if it's not as "user friendly" as DropBox we're all comfortable with that. There are plenty of docs out there that have bits and pieces of what I want, but some of the tools don't seem to fit the requirements: (1) transport security via SSL to the bucket, (2) encryption of bucket contents, and (3) bi-directional syncing. Most of the scripts I can find on the internet use "duplicity", which appears to fail #1 (it doesn't look like duplicity supports SSL to S3; the docs don't say, but the protocol looks like plain old HTTP: http://www.nongnu.org/duplicity/duplicity.1.html#sect6 ). Many scripts use gpg to encrypt files. This seems like it could work, however I have to make sure that each OS X client is able to use the same key to encrypt and decrypt files (key management is left to me). FTP and other client-based apps don't seem to support this at all. Finally, most of the scripts use one-way replication, e.g. using Amazon S3 as a simple backup store. As we'd be using Amazon S3 as the "repository", they fail this one too. Whew. So, I'd love a single tool that does all this, but after an exhaustive search I don't think one exists. In my mind, the magical tool would be some combination of TrueCrypt and rsync. I'd be happy just knowing which tools out there can fulfill my three requirements; after that I can stitch together the rest. Any thoughts? THANKS!
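
    A minimal sketch of one possible building block, assuming boto3 and a local gpg binary are available (the bucket name and recipient key are placeholders); it covers SSL transport and client-side encryption for the upload direction only, not bi-directional syncing:

        import os
        import subprocess

        import boto3

        BUCKET = "example-team-share"        # hypothetical bucket name
        GPG_RECIPIENT = "team@example.com"   # hypothetical shared GPG key

        # boto3 talks to S3 over HTTPS by default, which covers transport security.
        s3 = boto3.client("s3")

        def push_encrypted(path):
            """Encrypt a file with the shared GPG key and upload it over SSL."""
            encrypted = path + ".gpg"
            subprocess.run(
                ["gpg", "--yes", "--recipient", GPG_RECIPIENT,
                 "--output", encrypted, "--encrypt", path],
                check=True,
            )
            s3.upload_file(encrypted, BUCKET, os.path.basename(encrypted))
            os.remove(encrypted)

        if __name__ == "__main__":
            push_encrypted("notes.txt")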

    Read the article

  • What is the correct approach I should use for an application that requires Amazon S3 uploads and SimpleDB data management?

    - by Luis Oscar
    I am developing an application for iOS, and that is going smoothly; the problem is that I am very new at server-side things. I am totally confused about how to correctly use Amazon Web Services for this purpose. What I want to do is very simple. I want my application to be able to query a servlet hosted in EC2 to retrieve pictures and data, based on some criteria, from S3 and SimpleDB respectively. The application should also be able to upload pictures into an S3 bucket and register the information in SimpleDB. My main concerns are security and costs. So far I was using the Amazon Token Vending Machine, but I haven't been successful when trying to customize it, and while researching I discovered that in the long run it is very expensive. The ultimate goal is to handle a "social" picture service for my iOS application: being able to register new users, authenticate those users, and see what permissions they have to which pictures from the bucket. And all this without having to worry about third parties accessing the private pictures of my users. Sorry for this question, but I am really clueless about how to handle this... I have tried reading many articles, but all this server stuff looks very scary.
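
    A hedged sketch of the token-vending idea mentioned above, assuming the EC2 side can run boto3 (the bucket layout and policy are illustrative): the server hands the app short-lived credentials scoped to one user's prefix, so long-term keys never ship inside the app.

        import json

        import boto3

        sts = boto3.client("sts")

        def credentials_for_user(username):
            """Return temporary, scoped-down credentials for one app user."""
            policy = {
                "Version": "2012-10-17",
                "Statement": [{
                    "Effect": "Allow",
                    "Action": ["s3:GetObject", "s3:PutObject"],
                    # hypothetical bucket layout: one prefix per user
                    "Resource": "arn:aws:s3:::example-photo-bucket/%s/*" % username,
                }],
            }
            resp = sts.get_federation_token(
                Name=username[:32],
                Policy=json.dumps(policy),
                DurationSeconds=3600,
            )
            # Contains AccessKeyId, SecretAccessKey, SessionToken, Expiration.
            return resp["Credentials"]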

    Read the article

  • How to set up matlabpool for multiple processors?

    - by JohnIdol
    I just set up an Extra Large heavy-computation EC2 instance to throw at my genetic algorithms problem, hoping to speed things up. This instance has 8 Intel Xeon processors (around 2.4 GHz each) and 7 GB of RAM. On my own machine I have an Intel Core Duo, and MATLAB is able to work with my two cores just fine by running: matlabpool open 2 On the EC2 instance, though, MATLAB is only capable of detecting 1 out of 8 processors, and if I try running: matlabpool open 8 I get an error saying that the ClusterSize is 1 since there's only 1 core on my CPU. True, there is only 1 core on each CPU, but I have 8 CPUs on the given EC2 instance! So the difference between my machine and the EC2 instance is that I have my 2 cores on a single processor locally, while the EC2 instance has 8 distinct processors. My question is, how do I get MATLAB to work with those 8 processors? I found this paper, but it seems related to setting up MATLAB with multiple EC2 instances (not to multiple processors on the same instance, EC2 or not), which is not my problem. Any help appreciated!

    Read the article

  • What is the best instance type to use for hosting a website on EC2?

    - by Josh
    Amazon offers two instance types on EC2: 1) On-Demand and 2) Reserved. After reading the docs on these, I don't really understand the difference from an end-user perspective. More specifically, I'd like to know the answer to this question: is one or the other better for web applications? Based on their names and descriptions, it seems as though on-demand instances may get wiped away from the server altogether if they're not in use, which means that they would need to be restarted when a request finally does come in. That seems like a pretty bad thing for a website. Am I just misinterpreting the docs? Thanks!

    Read the article

  • /etc/init.d Character Encoding Issue

    - by Ryan Rosario
    I have a script in /etc/init.d on an EC2 image that, on machine startup, pulls in source code via SVN, builds it, and then runs it using Ant. The source code is Java. Within this code is a call to the Weka library which writes a file to disk. On most Ubuntu AMIs, and my home machines' versions of Ubuntu, there is no issue. The problem is that with certain versions/AMIs of Ubuntu, Unicode characters in the file are replaced with question marks ('?'). If I run the job manually on the trouble instance, Unicode is output to file correctly, but not when run from /etc/init.d. What might be causing this problem and how can I fix it so that Unicode characters appear correctly in files written from /etc/init.d processes?

    Read the article

  • How to automatically restart Tomcat7 on system reboots?

    - by coding crow
    I have installed Tomcat 7 on Ubuntu 12.04 LTS, which runs on an Amazon EC2 instance. Now I want Tomcat to restart automatically on system reboot. I read this blog, which suggests adding the script below to /etc/init.d/tomcat7:

        # Tomcat auto-start
        #
        # description: Auto-starts tomcat
        # processname: tomcat
        # pidfile: /var/run/tomcat.pid

        case $1 in
        start)
            sh /usr/share/tomcat7/bin/startup.sh
            ;;
        stop)
            sh /usr/share/tomcat7/bin/shutdown.sh
            ;;
        restart)
            sh /usr/share/tomcat7/bin/shutdown.sh
            sh /usr/share/tomcat7/bin/startup.sh
            ;;
        esac
        exit 0

    and then issuing the following commands:

        sudo chmod 755 /etc/init.d/tomcat7
        sudo ln -s /etc/init.d/tomcat7 /etc/rc1.d/K99tomcat
        sudo ln -s /etc/init.d/tomcat7 /etc/rc2.d/S99tomcat
        sudo /etc/init.d/tomcat7 restart

    My questions: /etc/init.d/tomcat7 already has a script in it, so where do we paste the suggested script? And is the suggested procedure correct?

    Read the article

  • Lost sudo/su on Amazon EC2 instance

    - by barrycarter
    I have an Amazon EC2 instance. I can log in just fine, but neither "su" nor "sudo" works now (they worked fine previously): "su" requests a password, but I log in using ssh keys, and I don't think the root user even has a password. "sudo <anything>" does this:

        sudo: /etc/sudoers is owned by uid 222, should be 0
        sudo: no valid sudoers sources found, quitting

    I probably did "chown ec2-user /etc/sudoers" (or, more likely, "chown -R ec2-user /etc", because I was sick of rsync failing), so this is my fault. How do I recover? I stopped the instance and tried the "View/Change User Data" option on the AWS EC2 console, but this didn't help. EDIT: I realize I could kill this instance and create a new one, but I was hoping to avoid something that extreme.

    Read the article

  • SimpleDB to ActiveResource. Rails

    - by Victor P
    I'm looking for a way to map an ActiveResource to SimpleDB. I want to avoid plugins/gems, as all the ones I have used are outdated, buggy, or unmaintained. It doesn't seem hard, and I wonder if any of you have successfully implemented a Rails app with SimpleDB as an ActiveResource. How did you do it? Thanks.

    Read the article

  • Requires a valid Date or x-amz-date header?

    - by Jordan Messina
    I'm getting the following error when attempting to upload a file to S3:

        S3StorageError: <?xml version="1.0" encoding="UTF-8"?>
        <Error><Code>AccessDenied</Code><Message>AWS authentication requires a valid Date or x-amz-date header</Message><RequestId>7910FF83F3FE17E2</RequestId><HostId>EjycXTgSwUkx19YNkpAoY2UDDur/0d5SMvGJUicpN6qCZFa2OuqcpibIR3NJ2WKB</HostId></Error>

    I'm using Django with Django-Storages and ImageKit. My S3 settings in settings.py look as follows:

        locale.setlocale(locale.LC_TIME, 'en_US')
        DEFAULT_FILE_STORAGE = 'backends.s3.S3Storage'
        AWS_ACCESS_KEY_ID = '************************'
        AWS_SECRET_ACCESS_KEY = '*****************************'
        AWS_STORAGE_BUCKET_NAME = 'static.blabla.com'
        AWS_HEADERS = {
            'x-amz-date': datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT'),
            'Expires': 'Thu, 15 Apr 2200 20:00:00 GMT',
        }
        from S3 import CallingFormat
        AWS_CALLING_FORMAT = CallingFormat.SUBDOMAIN

    Thanks for any help you can give!
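
    One likely cause, offered as a guess: the x-amz-date value above is computed once, when settings.py is first imported, so every later request is signed with a stale date. A minimal sketch of a fix is to drop the static date header and let the storage backend stamp each request itself, keeping only genuinely constant headers:

        # settings.py (sketch): no x-amz-date here; the S3 backend/boto adds a
        # fresh Date header when it signs each individual request.
        AWS_HEADERS = {
            'Expires': 'Thu, 15 Apr 2200 20:00:00 GMT',  # static values are fine
        }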

    Read the article

  • Retrieve binary data from S3 storage through AWS.NET in C#

    - by BerggreenDK
    I've tested most of the included samples in the AWS SDK for .NET and they all work fine. I can PUT objects, LIST objects, and DELETE objects in a bucket, but... let's say I delete the original and want to sync the files that are missing locally? I would like to make a GET object request (by key/name and bucket, of course). I can find the object, but how do I read the binary data from S3 through the API? Do I have to write my own SOAP wrapper for this, or is there some kind of sample for this out "here"? :o) In hope of a sample. It does not have to tolerate exceptions etc. I just need to see the main parts that connect, retrieve, and store the file back in my ASP.NET or C# project. Anyone???

    Read the article

  • Setting up Sendgrid using AWS

    - by user2793297
    I currently have a site where the data is hosted on Firebase and the static files are hosted on AWS (I registered my domain using NameCheap, but am routing to AWS using Route 53 and S3). I now want to use SendGrid to send emails, but they are saying that I need to set up an SMTP server. I can't find anywhere what the best way to do this is. Can I get suggestions please on the best solution? I want to use SendGrid to send transactional email such as "Welcome to the site!", "Forgot Password", etc.
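
    SendGrid exposes an SMTP relay, so nothing extra has to be hosted on AWS; a minimal Python sketch, assuming the relay at smtp.sendgrid.net on port 587 and placeholder credentials:

        import smtplib
        from email.mime.text import MIMEText

        SENDGRID_USER = "apikey"            # placeholder; use your SendGrid credentials
        SENDGRID_PASSWORD = "SG.xxxxxxxx"   # placeholder API key / password

        msg = MIMEText("Welcome to the site!")
        msg["Subject"] = "Welcome"
        msg["From"] = "noreply@example.com"
        msg["To"] = "user@example.com"

        with smtplib.SMTP("smtp.sendgrid.net", 587) as server:
            server.starttls()          # upgrade the connection before authenticating
            server.login(SENDGRID_USER, SENDGRID_PASSWORD)
            server.send_message(msg)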

    Read the article

  • Image upload and Manipulation in Django

    - by Saransh Mohapatra
    I am trying to upload images, then create a thumbnail of each, and store both in S3. After the file has been uploaded, I first push it to S3 and then try to create the thumbnail, but that doesn't work because by then PIL is not able to recognise the image. And secondly, if I create the thumbnail first, then while uploading the original image I get an EOF error. I think Django only allows the uploaded file to be read once... Please tell me a way to do this. Thanks in advance.
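
    The EOF symptom usually means the uploaded file's stream has already been read to the end. A rough sketch of one ordering that avoids it, assuming boto3 and PIL and using illustrative names: read the bytes once into memory and feed copies to both S3 and PIL.

        from io import BytesIO

        import boto3
        from PIL import Image

        s3 = boto3.client("s3")
        BUCKET = "example-media-bucket"  # hypothetical bucket name

        def handle_upload(uploaded_file, key):
            data = uploaded_file.read()          # read the upload stream exactly once

            # The original goes up from the in-memory copy.
            s3.put_object(Bucket=BUCKET, Key=key, Body=data)

            # The thumbnail is built from a fresh BytesIO, so PIL never sees a spent stream.
            img = Image.open(BytesIO(data)).convert("RGB")
            img.thumbnail((128, 128))
            buf = BytesIO()
            img.save(buf, format="JPEG")
            s3.put_object(Bucket=BUCKET, Key="thumbs/" + key, Body=buf.getvalue())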

    Read the article

  • Using IAM for user authentication

    - by mdavis6890
    I've read lots and lots of posts that touch on what I think should be a very common use case, but without finding exactly what I want, or a simple reason why it can't be done. I have some files on S3. I want to be able to grant certain users access to certain files, via a front end that I build. So far, I've made it work this way: I built the front end in Django, using its built-in Users and Groups. I have a model for Buckets, in which I mirror my S3 buckets, and a m2m relationship from groups to buckets representing the S3 permissions. The user logs in and authenticates against Django's users. I grab from Django the list of buckets that the user is allowed to see, then use boto to grab a list of links to files from those buckets and display them to the user. This works, but isn't ideal, and also just doesn't feel right. I've got to keep a mirror of the buckets, and I also have to maintain my own list of users/passwords and permissions, when AWS already has all of that built in. What I really want is to simply create the users in IAM and use group permissions in IAM to control access to the S3 buckets. No duplication of data or function. My app would request a UN/PW from the user and use that to connect to IAM/S3 to pull the list of buckets and files, then display links to the user. Simple. How can I, or why can't I? Am I looking at this the wrong way? What's the "right" way to address this (I assume) very common use case?
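
    A rough sketch of the IAM-only direction, assuming each app user is given their own IAM access key pair (IAM API access uses keys rather than a console password, so "UN/PW" becomes key ID plus secret): the app simply probes which buckets the supplied keys can actually read.

        import boto3
        from botocore.exceptions import ClientError

        def visible_buckets(access_key, secret_key, candidate_buckets):
            """Return the subset of buckets this IAM user can list objects in."""
            s3 = boto3.client(
                "s3",
                aws_access_key_id=access_key,
                aws_secret_access_key=secret_key,
            )
            allowed = []
            for bucket in candidate_buckets:
                try:
                    s3.list_objects_v2(Bucket=bucket, MaxKeys=1)
                    allowed.append(bucket)
                except ClientError:
                    # AccessDenied (or a missing bucket) means not visible to this user.
                    pass
            return allowed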

    Read the article

  • AWS for Android SDK host name error

    - by feelingtheblanks
    I'm trying to upload to AWS S3 using their AWS SDK for Android, but both the sample project within the SDK and my own project give the following error on devices, while the emulator runs without problems. So there's no problem with my AWS account. "Host name may not be null." Upload code:

        s3Client.createBucket(Constants.getBucket());
        PutObjectRequest por = new PutObjectRequest(Constants.getBucket(),
                record.getFile().getName(), record.getFile());
        s3Client.putObject(por);

    Any help is appreciated.

    Read the article

  • Using AWS S3 for photo storage

    - by Sam
    I'm going to be using S3 to store user-uploaded photos. Obviously, I won't be serving the image files to user agents without resizing them down. However, no single size will do, as some thumbnails will be smaller than other, larger previews. So, I was thinking of making a standard set of dimensions, scaling from a lowest of 16x16 up to a highest of 1024x1024. Is this a good way to solve this problem? What if I need a new size later on? How would you solve this?
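
    A small sketch of the on-demand alternative (resize to whatever whitelisted size is requested, then cache the result), which sidesteps the "new size later on" problem; PIL is assumed and the names are illustrative:

        from io import BytesIO

        from PIL import Image

        ALLOWED_SIZES = {16, 32, 64, 128, 256, 512, 1024}  # whitelist blocks arbitrary resize requests

        def resized_jpeg(original_bytes, size):
            """Return JPEG bytes of the original scaled to fit within size x size."""
            if size not in ALLOWED_SIZES:
                raise ValueError("unsupported size")
            img = Image.open(BytesIO(original_bytes)).convert("RGB")
            img.thumbnail((size, size))
            out = BytesIO()
            img.save(out, format="JPEG", quality=85)
            return out.getvalue()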

    Read the article

  • Uploading Files to AWS S3 from an Android App

    - by Abhishek Kaushik
    Edited 7th June 2014. My Android app needs to have a feature where clients can upload their files. I want AWS S3 as my storage. Moreover, I don't want to use the SECRET_KEY and ACCESS_KEY_ID on the client side. What is the best way to do this? Can someone provide working code too? I read that I can request a signed URL from AWS and then have the client upload directly to that URL. How do I achieve this?
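
    A minimal server-side sketch of that signed-URL approach, assuming boto3 on whatever backend vends the URL (bucket, key, and expiry are placeholders); the Android client then performs a plain HTTP PUT to the returned URL, so no keys live in the app.

        import boto3

        s3 = boto3.client("s3")

        def upload_url(bucket, key, expires_seconds=900):
            """Return a pre-signed URL the mobile client can PUT the file to."""
            return s3.generate_presigned_url(
                ClientMethod="put_object",
                Params={"Bucket": bucket, "Key": key},
                ExpiresIn=expires_seconds,
            )

        # Example: hand this URL to the app; it uploads with a normal HTTP PUT.
        print(upload_url("example-client-uploads", "user123/report.pdf"))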

    Read the article

  • Any need to make a backup of data on Amazon S3?

    - by Chrille
    I'm hosting 200 GB of product images on S3 (this is my primary file host). Do I need to back that data up somewhere else, or is S3 safe as it is? I have been experimenting with mounting the S3 bucket on an EC2 instance and then making a nightly rsync backup. The problem is that it's about 3 million files, so it takes a while just to generate the diff that rsync needs; the backup actually takes about 3 days to complete. Any ideas on how to do this better (if it's even necessary)?
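
    One hedged alternative to the mount-and-rsync route is a server-side copy into a second bucket, so none of the 200 GB has to pass through an instance; a sketch assuming boto3 and placeholder bucket names:

        import boto3

        s3 = boto3.client("s3")
        SRC = "example-product-images"         # hypothetical source bucket
        DST = "example-product-images-backup"  # hypothetical backup bucket

        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=SRC):
            for obj in page.get("Contents", []):
                # CopyObject runs inside S3 itself; the data never leaves AWS.
                s3.copy_object(
                    Bucket=DST,
                    Key=obj["Key"],
                    CopySource={"Bucket": SRC, "Key": obj["Key"]},
                )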

    Read the article

  • Virtualmin Webmin does not respond

    - by Miranda
    I have installed Virtualmin on a CentOS remote server, but it does not seem to work (https://115.146.95.118:10000/); at least the Webmin page does not respond. I have opened these ports:

        http       ALLOW 80:80 from 0.0.0.0/0
                   ALLOW 443:443 from 0.0.0.0/0
        ssh        ALLOW 22:22 from 0.0.0.0/0
        virtualmin ALLOW 20000:20000 from 0.0.0.0/0
                   ALLOW 10000:10009 from 0.0.0.0/0

    And restarting Webmin does not solve it:

        /etc/rc.d/init.d/webmin restart
        Stopping Webmin server in /usr/libexec/webmin
        Starting Webmin server in /usr/libexec/webmin

    And I have tried to use Amazon EC2 this time, still couldn't get it to work (http://ec2-67-202-21-21.compute-1.amazonaws.com:10000/):

        [ec2-user@ip-10-118-239-13 ~]$ netstat -an | grep :10000
        tcp    0    0 0.0.0.0:10000    0.0.0.0:*    LISTEN
        udp    0    0 0.0.0.0:10000    0.0.0.0:*

        [ec2-user@ip-10-118-239-13 ~]$ sudo iptables -L -n
        Chain INPUT (policy ACCEPT)
        target  prot opt source     destination
        ACCEPT  udp  --  0.0.0.0/0  0.0.0.0/0   udp dpt:20
        ACCEPT  udp  --  0.0.0.0/0  0.0.0.0/0   udp dpt:21
        ACCEPT  udp  --  0.0.0.0/0  0.0.0.0/0   udp dpt:53
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:20000
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:10000
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:443
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:80
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:993
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:143
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:995
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:110
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:20
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:21
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:53
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:587
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:25
        ACCEPT  tcp  --  0.0.0.0/0  0.0.0.0/0   tcp dpt:22

        Chain FORWARD (policy ACCEPT)
        target  prot opt source     destination

        Chain OUTPUT (policy ACCEPT)
        target  prot opt source     destination

    Since I need more than 10 reputation to post an image, you can find the screenshots of the security group settings at the Webmin Support Forum. I have tried:

        sudo iptables -A INPUT -p tcp -m tcp --dport 10000 -j ACCEPT

    It did not change anything.

        [ec2-user@ip-10-118-239-13 ~]$ sudo yum install openssl perl-Net-SSLeay perl-Crypt-SSLeay
        Loaded plugins: fastestmirror, priorities, security, update-motd
        Loading mirror speeds from cached hostfile
         * amzn-main: packages.us-east-1.amazonaws.com
         * amzn-updates: packages.us-east-1.amazonaws.com
        amzn-main    | 2.1 kB  00:00
        amzn-updates | 2.3 kB  00:00
        Setting up Install Process
        Package openssl-1.0.0j-1.43.amzn1.i686 already installed and latest version
        Package perl-Net-SSLeay-1.35-9.4.amzn1.i686 already installed and latest version
        Package perl-Crypt-SSLeay-0.57-16.4.amzn1.i686 already installed and latest version
        Nothing to do

        [ec2-user@ip-10-118-239-13 ~]$ nano /etc/webmin/miniserv.conf
        GNU nano 2.0.9    File: /etc/webmin/miniserv.conf
        port=10000
        root=/usr/libexec/webmin
        mimetypes=/usr/libexec/webmin/mime.types
        addtype_cgi=internal/cgi
        realm=Webmin Server
        logfile=/var/webmin/miniserv.log
        errorlog=/var/webmin/miniserv.error
        pidfile=/var/webmin/miniserv.pid
        logtime=168
        ppath=
        ssl=1
        env_WEBMIN_CONFIG=/etc/webmin
        env_WEBMIN_VAR=/var/webmin
        atboot=1
        logout=/etc/webmin/logout-flag
        listen=10000
        denyfile=\.pl$
        log=1
        blockhost_failures=5
        blockhost_time=60
        syslog=1
        session=1
        server=MiniServ/1.585
        userfile=/etc/webmin/miniserv.users
        keyfile=/etc/webmin/miniserv.pem
        passwd_file=/etc/shadow
        passwd_uindex=0
        passwd_pindex=1
        passwd_cindex=2
        passwd_mindex=4
        passwd_mode=0
        preroot=virtual-server-theme
        passdelay=1
        sessiononly=/virtual-server/remote.cgi
        preload=
        mobile_preroot=virtual-server-mobile
        mobile_prefixes=m. mobile.
        anonymous=/virtualmin-mailman/unauthenticated=anonymous
        ssl_cipher_list=ECDHE-RSA-AES256-SHA384:AES256-SHA256:AES256-SHA256:RC4:HIGH:MEDIUM:+TLSv1:!MD5:!SSLv2:+SSLv3:!ADH:!aNULL:!eNULL:!NULL:!DH:!ADH:!EDH:!AESGCM

    Read the article

  • How can I tell if my Amazon Windows instance was an SQL Server AMI?

    - by Aligma
    I want to purchase some reserved instances, because I have several instances already created and running 24 hours a day. When I go to purchase a Windows instance, I can see 3 options: Windows, Windows with SQL Server Standard, and Windows with SQL Server Web. I don't know which of these was used to create the original instance. Is there a way I can find out? My assumptions: the instance type is important because, as far as I understand, the way to purchase a reserved instance is to first have a running instance, and then purchase a matching reserved instance. The reserved instance is not itself a new machine, but a kind of contract between you and Amazon to pay for an instance for 1 or 3 years at a discounted rate. The contracted, reserved instance will "offset" one matching running instance where they have the same size and platform. Please feel free to correct me if these assumptions are incorrect.
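
    One way to check from the API side, sketched with boto3 and assuming a reasonably recent EC2 API (newer responses carry PlatformDetails; older ones may only expose Platform):

        import boto3

        ec2 = boto3.client("ec2")

        resp = ec2.describe_instances(InstanceIds=["i-0123456789abcdef0"])  # placeholder instance ID
        instance = resp["Reservations"][0]["Instances"][0]

        # e.g. "Windows", "Windows with SQL Server Standard", "Windows with SQL Server Web"
        print(instance.get("PlatformDetails", instance.get("Platform")))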

    Read the article

  • What SDK should I choose on Amazon Web Services to build an API for a server app?

    - by Nguyen Minh Binh
    I am a newbie to Amazon Web Services. I have a task to set up and then build a web service that provides APIs to Mac OS, iOS, and Android clients. Some APIs and the database need to be kept secure. I see that AWS supports multiple platforms such as Java, .NET, PHP, etc. It also supports many database management systems. There are also two dedicated SDKs for Android and iOS apps. So, what should I choose (Java, .NET, PHP, ...) to carry out my task? Does AWS support all web service protocols? Does it support secure web services? Thanks a lot.

    Read the article

  • Is it possible to choose the Amazon Glacier Vault when backing up from Synology NAS?

    - by Jorrit Reedijk
    I am using Amazon Glacier Backup on a Synology NAS (RackStation), but the Synology backup software for Glacier does not allow me to choose the vault to back up to. This means I cannot give the vault my own name, and I also cannot back up to a previously created vault (I am hitting this problem now after replacing a Synology RackStation). Is there any way to choose which vault you want your Synology to back up to? Edit: I have now posted this question to the Synology forum as well: http://forum.synology.com/enu/viewtopic.php?f=174&t=86557&e=0 If I can solve the problem I'll post the solution here too.

    Read the article
