Search Results

Search found 4473 results on 179 pages for 'mason cloud'.


  • How to evaluate "enterprise" platforms?

    - by Ran Biron
    Hi all, I'm tasked with evaluating an "enterprise" platform for the next-gen version of a product. We're currently considering two "types" of platforms - RAD (workflow engine, integrated UI, small cores of "technology plugins" to the workflows, automatic persisting of state...) like SalesForce.com / Service-Now.com and "cloud based" (EC2 / AppEngine...). While I have a few ideas on where to start, I'd like your opinions - how would you evaluate platforms for an enterprise suite of products? What factors would you consider? How would you eliminate weak options quickly enough to be able to concentrate on the few strong ones? Also interesting is how would you compare enterprise RAD (proven technology, quick to develop - but tends to look "the same as the competition") to cloud-based technology (lots of "buzz", not that many competitors - easy to justify to management, but probably lacking (?) enterprise tools and experience). As said before - I have a few ideas, but would like to see some answers before I post mine so I wouldn't drive the discussion to a specific place. RB.

    Read the article

  • Javascript - use to load xml data from URL

    - by spenf
    I have this url with some xml data in it: http://64.182.231.116/~spencerf/union_college/Upperclass_Sample_Menu.xml And I would like to load this xml data into my JavaScript code so I can parse it. I am using the parse.com JavaScript SDK, in their Cloud Code. Here is the code I have tried:

        Parse.Cloud.define("next", function(request, response) {
            response.success("Hello world!");
            $.ajax({
                url: 'http://64.182.231.116/~spencerf/union_college/Upperclass_Sample_Menu.xml', // name of file you want to parse
                dataType: "xml", // type of file you are trying to read
                success: parse, // name of the function to call upon success
                error: function(){alert("Error: Something went wrong");}
            });
        });

    But when I run this I get an error: $ is not defined at main.js:
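    A hedged sketch of a jQuery-free approach: Cloud Code runs on Parse's servers, where the DOM, $ and alert() don't exist, but the legacy SDK ships its own HTTP client, Parse.Cloud.httpRequest. Something along these lines (the "next" name and URL come from the question; the parsing step is left as a stub):

        Parse.Cloud.define("next", function(request, response) {
            Parse.Cloud.httpRequest({
                url: 'http://64.182.231.116/~spencerf/union_college/Upperclass_Sample_Menu.xml',
                success: function(httpResponse) {
                    // httpResponse.text holds the raw XML; parse it here
                    // (e.g. with an XML module pulled into the cloud code)
                    response.success(httpResponse.text);
                },
                error: function(httpResponse) {
                    response.error("Request failed with code " + httpResponse.status);
                }
            });
        });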

    Read the article

  • Benchmarking hosting providers IO with Bonnie

    - by Derek Organ
    OK, because of a bunch of projects I'm working on I have access to dedicated servers at 3 hosting providers. As an experiment, and for educational purposes, I decided to see if I could benchmark how good the IO is with each. A bit of research led me to Bonnie++, so I installed it on each server and ran this simple command:

        /usr/sbin/bonnie -d /tmp/foo

    The 3 machines at the different hosting providers are all dedicated to me: one is a VPS, and the other two are on some cloud platform (e.g. VMWare / Xen) using some kind of clustered SAN for storage. This might be a naive thing to do, but here are the results I found.

    HOST A
        Version 1.03c       ------Sequential Output------ --Sequential Input- --Random-
                            -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
        Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
        xxxxxxxxxxxxxxxx 1G 45081  88 56244  14 19167   4 20965  40 67110   6  67.2   0
                            ------Sequential Create------ --------Random Create--------
                            -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
                      files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                         16 15264  28 +++++ +++ +++++ +++ +++++ +++ +++++ +++ +++++ +++
        xxxxxxxx,1G,45081,88,56244,14,19167,4,20965,40,67110,6,67.2,0,16,15264,28,+++++,+++,+++++,+++,+++++,+++,+++++,+++,+++++,+++

    HOST B
        Version 1.03d       ------Sequential Output------ --Sequential Input- --Random-
                            -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
        Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
        xxxxxxxxxxxx     4G 43070  91 64510  15 19092   0 29276  47 39169   0 448.2   0
                            ------Sequential Create------ --------Random Create--------
                            -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
                      files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                         16 24799  52 +++++ +++ +++++ +++ 25443  54 +++++ +++ +++++ +++
        xxxxxxx,4G,43070,91,64510,15,19092,0,29276,47,39169,0,448.2,0,16,24799,52,+++++,+++,+++++,+++,25443,54,+++++,+++,+++++,+++

    HOST C
        Version 1.03c       ------Sequential Output------ --Sequential Input- --Random-
                            -Per Chr- --Block-- -Rewrite- -Per Chr- --Block-- --Seeks--
        Machine        Size K/sec %CP K/sec %CP K/sec %CP K/sec %CP K/sec %CP  /sec %CP
        xxxxxxxxxxxxx 1536M 15598  22 85698  13 258969 20 16194  22 723655 21 +++++ +++
                            ------Sequential Create------ --------Random Create--------
                            -Create-- --Read--- -Delete-- -Create-- --Read--- -Delete--
                      files  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP  /sec %CP
                         16 14142  22 +++++ +++ 18621  22 13544  22 +++++ +++ 17363  21
        xxxxxxxx,1536M,15598,22,85698,13,258969,20,16194,22,723655,21,+++++,+++,16,14142,22,+++++,+++,18621,22,13544,22,+++++,+++,17363,21

    OK, so first off, what is the best way to read the figures, and are there any issues with really comparing these numbers? Is this in any way a true representation of IO speed? If not, is there any way for me to test that? Note: these 3 machines are using either Ubuntu or Debian (I presume that doesn't really matter).
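    One caveat, offered as a hedged suggestion rather than a definitive diagnosis: Bonnie++ only defeats the page cache when the test size is at least twice the machine's RAM (it picks 2x RAM by default, which is why the Size column differs per host), and Host C's 723655 K/sec block reads still look cache- or SAN-assisted rather than like real disk speed. Pinning the size explicitly makes runs easier to compare; the 4096 MB value below is an assumption to adapt to your RAM:

        # -d test dir, -s test size in MB (use >= 2x RAM), -n files for create tests,
        # -u user to run as (bonnie++ refuses to run as root without it)
        /usr/sbin/bonnie++ -d /tmp/foo -s 4096 -n 16 -u nobody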

    Read the article

  • Postfix issues sending mail to addresses under domain located on server

    - by iamthewit
    I recently installed Virtualmin on my nice shiny new Rackspace cloud server. Everything went seamlessly, but I've been having some issues getting emails to send properly. The problem seems to be that the server can not send mail to email addresses whose domain is hosted on my server. For example, on my server I run multiple virtual domains; let's call this one test.com. When I run the mail command from the shell (mail [email protected]) I get the following back from my maillog:

        Oct 6 14:55:18 test postfix/pickup[8737]: DC1131612CC: uid=0 from=
        Oct 6 14:55:18 test postfix/cleanup[8769]: DC1131612CC: [email protected]
        Oct 6 14:55:18 test postfix/qmgr[8738]: DC1131612CC: [email protected], size=353, nrcpt=1 (queue active)
        Oct 6 14:55:18 test postfix/error[8771]: DC1131612CC: [email protected], relay=none, delay=0, delays=0/0/0/0, dsn=5.0.0, status=bounced (User unknown in virtual alias table)
        Oct 6 14:55:18 test postfix/cleanup[8769]: DD07D1612D1: [email protected]
        Oct 6 14:55:18 test postfix/bounce[8772]: DC1131612CC: sender non-delivery notification: DD07D1612D1
        Oct 6 14:55:18 test postfix/qmgr[8738]: DD07D1612D1: from=<, size=2268, nrcpt=1 (queue active)
        Oct 6 14:55:18 test postfix/qmgr[8738]: DC1131612CC: removed
        Oct 6 14:55:18 test postfix/local[8773]: DD07D1612D1: [email protected], relay=local, delay=0.03, delays=0/0/0/0.03, dsn=2.0.0, status=sent (delivered to command: /usr/bin/procmail-wrapper -o -a $DOMAIN -d $LOGNAME)
        Oct 6 14:55:18 test postfix/qmgr[8738]: DD07D1612D1: removed

    When I run mail [email protected] to an outside domain, the message is sent and received perfectly fine. I'm a bit of a noob when it comes to servers, but I pick things up fairly quickly, so please excuse any incorrect terminology and my general noobiness. Any help would be greatly appreciated; I've been googling for quite a while but I haven't found a solution yet. I'll add a copy of my main.cf file in a response below. Cheers guys. Here is the reformatted postconf output (do you want the reformatted main.cf file too, or is this enough?):

        alias_database = hash:/etc/postfix/aliases
        alias_maps = hash:/etc/postfix/aliases
        broken_sasl_auth_clients = yes
        command_directory = /usr/sbin
        config_directory = /etc/postfix
        daemon_directory = /usr/libexec/postfix
        debug_peer_level = 2
        home_mailbox = Maildir/
        html_directory = no
        mailbox_command = /usr/bin/procmail-wrapper -o -a $DOMAIN -d $LOGNAME
        mailq_path = /usr/bin/mailq.postfix
        manpage_directory = /usr/share/man
        myhostname = server.test.com
        newaliases_path = /usr/bin/newaliases.postfix
        readme_directory = /usr/share/doc/postfix-2.3.3/README_FILES
        sample_directory = /usr/share/doc/postfix-2.3.3/samples
        sender_bcc_maps = hash:/etc/postfix/bcc
        sendmail_path = /usr/sbin/sendmail.postfix
        setgid_group = postdrop
        smtpd_recipient_restrictions = permit_mynetworks permit_sasl_authenticated reject_unauth_destination
        smtpd_sasl_auth_enable = yes
        smtpd_sasl_security_options = noanonymous
        unknown_local_recipient_reject_code = 550
        virtual_alias_maps = hash:/etc/postfix/virtual
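    A hedged pointer, based on the bounce reason alone: status=bounced (User unknown in virtual alias table) means Postfix treats test.com as a virtual alias domain, so every address under it must have an entry in the map behind virtual_alias_maps. The usual fix looks like this sketch (the address and local user are placeholders, since the real ones are redacted above):

        # append the missing mapping to /etc/postfix/virtual
        echo "user@test.com    localuser" >> /etc/postfix/virtual
        postmap /etc/postfix/virtual   # rebuild the hash: table listed in virtual_alias_maps
        postfix reload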

    Read the article

  • links for 2011-01-06

    - by Bob Rhubart
    Coming to your town: Oracle Enterprise Cloud Summit -- During these full-day events, cloud experts will share real-world best practices, reference architectures, detailed customer case studies, and more. Events scheduled in cities around the world. (tags: oracle otn cloud event)

    Webcast: Security and Compliance for Private Cloud Consolidation -- Roxana Bradescu, Senior Director for Oracle Database Security Products, discusses Oracle Database Security Solutions to securely consolidate data and meet compliance requirements within private cloud computing environments. Thursday, January 13, 2011. 10am PST | 1pm EST (tags: oracle cloud security)

    Answering Questions about Mobile Devices | The AppsLab -- "How do the numbers of Android and iOS users compare? How often are people switching? Where are all these BlackBerry and Nokia users? Do they plan to jump to Android or iOS? What about webOS? Is it relevant?" Some answers in this AppsLab survey. (tags: oracle otn enterprise2.0 mobilecomputing iphone blackberry android)

    Webcast: Achieve 24/7 Cloud Availability Without Expensive Redundancy -- Ashish Ray and Matthew Baier discuss Oracle's Maximum Availability Architecture and Oracle Database 11g. (tags: oracle cloud highavailability webcast)

    Converting a PV vm back into an HVM vm (Wim Coekaerts Blog) -- "I wanted to convert one of my VMs that was based on a paravirt kernel into a vm that just boots as a regular hardware virt VM with a standard x86-64 kernel... It took me a little while to figure out the fastest way so now that I have it pretty much down I wanted to share the steps." - Wim Coekaerts (tags: oracle otn virtualization oraclevm)

    @OTN_Garage: Resources for VirtualBox 4.0 -- Rick "@OTN_Garage" Ramsey shares links to several resources for those with a VirtualBox jones. (tags: oracle otn virtualization virtualbox)

    'Federal Service Bus' Helps Belgian Government Speak a Common Language - SOA in Action Blog -- "The first SOA-enabled application was developed in less than two months and was fully operational in approximately 10 weeks. In addition, new FSB modules are reusable for other Belgian e-government applications, saving both time and taxpayer dollars." - Joe McKendrick (tags: soa oracle)

    Show Notes: Architects in the Cloud (ArchBeat Podcast) -- The complete 4-part interview with Stephen G. Bennett and Archie Reed, the authors of "Silver Clouds, Dark Linings: A Concise Guide to Cloud Computing," is now available. (tags: oracle otn cloud podcast archbeat)

    Read the article

  • Where should I go with hosting my site: VPS, GAE, another option?

    - by Jonathan Hayward
    My website, http://JonathansCorner.com/, began life before 1994 as www.imsa.edu/~jhayward/ and has been through various iterations and improvements to content, HTML, and the like, but remains a literature site that is, from a web administrator's perspective, fairly simple and primitive: a fair amount of static HTML and supporting files, a little bit of CGI and URI rewriting, and .htaccess files providing Expires: headers and the like. An associated site demos various CGI scripts that fall under the category of "and other creations"; the site as a whole has the purpose of sharing my creative works, and so far a fairly rudimentary use of Apache functionality, supported by Unix tools to, for instance, update the RSS feed and the "starting point" link on the home page, has served that purpose fairly well. I looked around here on web hosting, and found the note on web host recommendations a good note for "What are some of people's favorite web hosts overall," but I wanted to ask a more focused question of "What are the best web hosts for criteria XYZ":

    - I am looking at a VPS so I will have root, be able to install stuff and edit Apache's config files etc., running Gentoo or other Linux, BSD, or the like.
    - I would like a system that is secure enough that the host's vulnerabilities are mostly the ones that come along with what I am trying to do: that is, I won't be trying to administer and secure an ancient Linux like some have complained about at 1and1.
    - I would like good uptime/reliability and competent support staff: if the level 1 help desk is going to tell me to go to "My Computer" on a Linux box, I'd like to be able to get past them.
    - Ideally I would like a site hosted somewhere that will have low latency for U.S. visitors in particular.
    - I would like a hosting solution that will be with a stable business, one that will probably be around, and one unlikely to vanish without warning.

    With those things specified, I would be interested in knowing what the less expensive options are. (I expect that some of the things I've specified will knock out all of the cheapest options, but I'm still interested in price.) With all that stated, I'd like to back up a bit and look at whether I am asking the right question. I am concerned that the above is a very good way of asking, "How can I keep my site in line with the wave of the past?" I am wondering if it might be wiser to adapt my site to newer technologies instead of trying to keep it on older ones. For instance, while I would hardly portray my site as a way to show off the full power of Google App Engine, the main site at least should be a straightforward port if I were to do that. And beyond Google App Engine, my knowledge of cloud solutions is basic. If it is a better and more future-proof solution to port my site to another kind of solution, I would be interested in knowing where those future-proof solutions lie. So I would be interested in wisdom. If the question I asked in detail is still a good question to be asking, what would people suggest? Or if I should seriously consider porting my site to a newer basic option, what should I try there? Any thoughts would be appreciated.

    Read the article

  • Announcing the release of the Windows Azure SDK 2.1 for .NET

    - by ScottGu
    Today we released the v2.1 update of the Windows Azure SDK for .NET. This is a major refresh of the Windows Azure SDK and it includes some great new features and enhancements. These new capabilities include:

    - Visual Studio 2013 Preview Support: The Windows Azure SDK now supports using the new VS 2013 Preview
    - Visual Studio 2013 VM Image: Windows Azure now has a built-in VM image that you can use to host and develop with VS 2013 in the cloud
    - Visual Studio Server Explorer Enhancements: Redesigned with improved filtering and auto-loading of subscription resources
    - Virtual Machines: Start and stop VMs with suspend billing directly from within Visual Studio
    - Cloud Services: New Emulator Express option with reduced footprint and Run as Normal User support
    - Service Bus: New high availability options, Notification Hub support, improved VS tooling
    - PowerShell Automation: Lots of new PowerShell commands for automating Web Sites, Cloud Services, VMs and more

    All of these SDK enhancements are now available to start using immediately and you can download the SDK from the Windows Azure .NET Developer Center. Visual Studio’s Team Foundation Service (http://tfs.visualstudio.com/) has also been updated to support today’s SDK 2.1 release, and the SDK 2.1 features can now be used with it (including with automated builds + tests). Below are more details on the new features and capabilities released today.

    Visual Studio 2013 Preview Support
    Today’s Windows Azure SDK 2.1 release adds support for the recent Visual Studio 2013 Preview. The 2.1 SDK also works with Visual Studio 2010 and Visual Studio 2012, and works side by side with the previous Windows Azure SDK 1.8 and 2.0 releases. To install the Windows Azure SDK 2.1 on your local computer, choose the “install the sdk” link from the Windows Azure .NET Developer Center. Then, choose which version of Visual Studio you want to use it with; clicking the third link will install the SDK with the latest VS 2013 Preview. If you don’t already have the Visual Studio 2013 Preview installed on your machine, this will also install Visual Studio Express 2013 Preview for Web.

    Visual Studio 2013 VM Image Hosted in the Cloud
    One of the requests we’ve heard from several customers has been for the ability to host Visual Studio within the cloud (avoiding the need to install anything locally on your computer). With today’s SDK update we’ve added a new VM image to the Windows Azure VM Gallery that has Visual Studio Ultimate 2013 Preview, SharePoint 2013, SQL Server 2012 Express and the Windows Azure 2.1 SDK already installed on it. This provides a really easy way to create a development environment in the cloud with the latest tools. With the shutdown and suspend billing feature we shipped on Windows Azure last month, you can spin up the image only when you want to do active development, and then shut down the virtual machine and not have to worry about usage charges while the virtual machine is not in use. You can create your own VS image in the cloud by using the New->Compute->Virtual Machine->From Gallery menu within the Windows Azure Management Portal, and then selecting the “Visual Studio Ultimate 2013 Preview” template.

    Visual Studio Server Explorer: Improved Filtering/Management of Subscription Resources
    With the Windows Azure SDK 2.1 release you’ll notice significant improvements in the Visual Studio Server Explorer. The explorer has been redesigned so that all Windows Azure services are now contained under a single Windows Azure node.
    From the top level node you can now manage your Windows Azure credentials, import a subscription file, or filter Server Explorer to only show services from particular subscriptions or regions. (Note: the Web Sites and Mobile Services nodes will appear outside the Windows Azure node until the final release of VS 2013. If you have installed the ASP.NET and Web Tools Preview Refresh, though, the Web Sites node will appear inside the Windows Azure node even with the VS 2013 Preview.) Once your subscription information is added, Windows Azure services from all your subscriptions are automatically enumerated in the Server Explorer. You no longer need to manually add services to Server Explorer individually. This provides a convenient way of viewing all of your cloud services, storage accounts, service bus namespaces, virtual machines, and web sites from one location.

    Subscription and Region Filtering Support
    Using the Windows Azure node in Server Explorer, you can also now filter your Windows Azure services in the Server Explorer by the subscription or region they are in. If you have multiple subscriptions but need to focus your attention on just a few subscriptions for some period of time, this is a handy way to hide the services from other subscriptions until they become relevant. You can do the same sort of filtering by region. To enable this, just select “Filter Services” from the context menu on the Windows Azure node, then choose the subscriptions and/or regions you want to filter by; for example, showing only services from a pay-as-you-go subscription within the East US region. Visual Studio will then automatically filter the items that show up in the Server Explorer appropriately. With storage accounts and service bus namespaces, you sometimes need to work with services outside your subscription. To accommodate that scenario, those services allow you to attach an external account (from the context menu). You’ll notice that external accounts have a slightly different icon in Server Explorer to indicate they are from outside your subscription.

    Other Improvements
    We’ve also improved the Server Explorer by adding additional properties and actions to the services exposed. You now have access to most of the properties on a cloud service, deployment slot, role or role instance, as well as the properties on storage accounts, virtual machines and web sites. Just select the object of interest in Server Explorer and view the properties in the property pane. We also now have full support for creating/deleting/updating storage tables, blobs and queues directly within Server Explorer. Simply right-click on the appropriate storage account node and you can create them directly within Visual Studio.

    Virtual Machines: Start/Stop within Visual Studio
    Virtual Machines now have context menu actions that allow you to start, shut down, restart and delete a Virtual Machine directly within the Visual Studio Server Explorer. The shutdown action enables you to shut down the virtual machine and suspend billing when the VM is not in use, and easily restart it when you need it. This is especially useful in Dev/Test scenarios where you can start a VM – such as a SQL Server – during your development session and then shut it down / suspend billing when you are not developing (and no longer be billed for it). You can also now directly remote desktop into VMs using the “Connect using Remote Desktop” context menu command in VS Server Explorer.
    Cloud Services: Emulator Express with Run as Normal User Support
    You can now launch Visual Studio and run your cloud services locally as a Normal User (without having to elevate to an administrator account) using a new Emulator Express option included as a preview feature with this SDK release. Emulator Express is a version of the Windows Azure Compute Emulator that runs in a restricted mode – one instance per role – that doesn’t require administrative permissions and uses 40% fewer resources than the full Windows Azure Emulator. Emulator Express supports both web and worker roles. To run your application locally using the Emulator Express option, simply change the following settings in the Windows Azure project:

    1. On the shortcut menu for the Windows Azure project, choose Properties, and then choose the Web tab.
    2. Check the setting for IIS (Internet Information Services). Make sure that the option is set to IIS Express, not the full version of IIS. Emulator Express is not compatible with full IIS.
    3. On the Web tab, choose the option for Emulator Express.

    Service Bus: Notification Hubs
    With the Windows Azure SDK 2.1 release we are adding support for Windows Azure Notification Hubs as part of our official Windows Azure SDK, inside of Microsoft.ServiceBus.dll (previously the Notification Hub functionality was in a preview assembly). You are now able to create, update and delete Notification Hubs programmatically, manage your device registrations, and send push notifications to all your mobile clients across all platforms (Windows Store, Windows Phone 8, iOS, and Android). Learn more about Notification Hubs on MSDN, or watch the Notification Hubs //BUILD/ presentation.

    Service Bus: Paired Namespaces
    One of the new features included with today’s Windows Azure SDK 2.1 release is support for Service Bus “Paired Namespaces”. Paired Namespaces enable you to better handle situations where a Service Bus service namespace becomes unavailable (for example, due to connectivity issues or an outage) and you are unable to send or receive messages to the namespace hosting the queue, topic, or subscription. Previously, to handle this scenario you had to manually set up separate namespaces that could act as a backup, then implement manual failover and retry logic which was sometimes tricky to get right. Service Bus now supports Paired Namespaces, which enable you to connect two namespaces together. When you activate the secondary namespace, messages are stored in the secondary queue for delivery to the primary queue at a later time. If the primary container (namespace) becomes unavailable for some reason, automatic failover enables the messages in the secondary queue. For detailed information about paired namespaces and high availability, see the new topic Asynchronous Messaging Patterns and High Availability.

    Service Bus: Tooling Improvements
    In this release, the Windows Azure Tools for Visual Studio contain several enhancements and changes to the management of Service Bus messaging entities using Visual Studio’s Server Explorer. The most noticeable change is that the Service Bus node is now integrated into the Windows Azure node, and supports integrated subscription management. Additionally, there has been a change to the code generated by the Windows Azure Worker Role with Service Bus Queue project template. This code now uses an event-driven “message pump” programming model using the QueueClient.OnMessage method.
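    For reference, a minimal C# sketch of that “message pump” shape, assuming Microsoft.ServiceBus.dll 2.1; this is not the template’s literal code, and the connection string, queue name and ProcessMessage helper are placeholders:

        using Microsoft.ServiceBus.Messaging;

        // inside a worker role's OnStart/Run, for example:
        var client = QueueClient.CreateFromConnectionString(connectionString, "myqueue");
        client.OnMessage(msg =>
        {
            try
            {
                ProcessMessage(msg);  // placeholder for your handler
                msg.Complete();       // done: remove the message from the queue
            }
            catch
            {
                msg.Abandon();        // unlock the message so it can be retried
            }
        });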
    PowerShell: Tons of New Automation Commands
    Since my last blog post on the previous Windows Azure SDK 2.0 release, we’ve updated Windows Azure PowerShell (which is a separate download) five times. You can find the full change log here. We’ve added new cmdlets in the following areas:

    - China instance and Windows Azure Pack support
    - Environment Configuration
    - VMs
    - Cloud Services
    - Web Sites
    - Storage
    - SQL Azure
    - Service Bus

    China Instance and Windows Azure Pack
    We now support the following cmdlets for the China instance and Windows Azure Pack, respectively:
    - China Instance: Web Sites, Service Bus, Storage, Cloud Service, VMs, Network
    - Windows Azure Pack: Web Sites, Service Bus
    We will have full cmdlet support for these two Windows Azure environments in PowerShell in the near future.

    Virtual Machines: Stop/Start Virtual Machines
    Similar to the Start/Stop VM capability in VS Server Explorer, you can now stop your VM and suspend billing. If you want to keep the original behavior of keeping your stopped VM provisioned, you can pass in the -StayProvisioned switch parameter.

    Virtual Machines: VM endpoint ACLs
    We’ve added and updated a bunch of cmdlets for you to configure fine-grained network ACLs on your VM endpoints. You can use the following cmdlets to create an ACL config and apply it to a VM endpoint:
    - New-AzureAclConfig
    - Get-AzureAclConfig
    - Set-AzureAclConfig
    - Remove-AzureAclConfig
    - Add-AzureEndpoint -ACL
    - Set-AzureEndpoint -ACL
    (A sketch of adding an ACL rule to an existing endpoint of a VM appears after this post.) Other improvements for Virtual Machine management include:
    - A -NoWinRMEndpoint parameter on New-AzureQuickVM and Add-AzureProvisioningConfig to disable Windows Remote Management
    - A -DirectServerReturn parameter on Add-AzureEndpoint and Set-AzureEndpoint to enable/disable direct server return
    - A Set-AzureLoadBalancedEndpoint cmdlet to modify load balanced endpoints

    Cloud Services: Remote Desktop and Diagnostics
    Remote Desktop and Diagnostics are popular debugging options for Cloud Services. We’ve introduced cmdlets to help you configure these two Cloud Service extensions from Windows Azure PowerShell.
    Windows Azure Cloud Services Remote Desktop extension:
    - New-AzureServiceRemoteDesktopExtensionConfig
    - Get-AzureServiceRemoteDesktopExtension
    - Set-AzureServiceRemoteDesktopExtension
    - Remove-AzureServiceRemoteDesktopExtension
    Windows Azure Cloud Services Diagnostics extension:
    - New-AzureServiceDiagnosticsExtensionConfig
    - Get-AzureServiceDiagnosticsExtension
    - Set-AzureServiceDiagnosticsExtension
    - Remove-AzureServiceDiagnosticsExtension
    (A sketch of enabling Remote Desktop for a Cloud Service appears after this post.)

    Web Sites: Diagnostics
    With our last SDK update, we introduced the Get-AzureWebsiteLog -Tail cmdlet to stream the logs of your Web Sites. Recently, we’ve also added cmdlets to configure Web Site application diagnostics:
    - Enable-AzureWebsiteApplicationDiagnostic
    - Disable-AzureWebsiteApplicationDiagnostic
    (Sketches of enabling application diagnostics to the file system and to a Windows Azure Storage Table appear after this post.)

    SQL Database
    Previously, you had to know the SQL Database server admin username and password if you wanted to manage the databases on that SQL Database server. Recently, we’ve made the experience much easier by not requiring the admin credential if the database server is in your subscription: you can simply specify the -ServerName parameter to tell Windows Azure PowerShell which server you want to use for the following cmdlets:
    - Get-AzureSqlDatabase
    - New-AzureSqlDatabase
    - Remove-AzureSqlDatabase
    - Set-AzureSqlDatabase
    We’ve also added an -AllowAllAzureServices parameter to New-AzureSqlDatabaseServerFirewallRule so that you can easily add a firewall rule to whitelist all Windows Azure IP addresses. Besides the above experience improvements, we’ve also added cmdlets to get the database server quota and set the database service objective. Check out the following cmdlets for details:
    - Get-AzureSqlDatabaseServerQuota
    - Get-AzureSqlDatabaseServiceObjective
    - Set-AzureSqlDatabase -ServiceObjective

    Storage and Service Bus
    Other new cmdlets include:
    - Storage: CRUD cmdlets for Azure Tables and Queues
    - Service Bus: cmdlets for managing authorization rules on your Service Bus Namespace, Queue, Topic, Relay and NotificationHub

    Summary
    Today’s release includes a bunch of great features that enable you to build even better cloud solutions. All of the above features/enhancements are shipped and available to use immediately as part of the 2.1 release of the Windows Azure SDK for .NET. If you don’t already have a Windows Azure account, you can sign up for a free trial and start using all of the above features today. Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

    Hope this helps,
    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
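    The three PowerShell examples referenced above were screenshots in the original post; what follows is a hedged reconstruction rather than the original commands. Service, VM, endpoint, site and storage account names are placeholders, and the exact switch names are worth double-checking against your module version:

        # 1. Add an ACL rule to an existing VM endpoint
        $acl = New-AzureAclConfig
        Set-AzureAclConfig -AddRule -ACL $acl -Order 0 -Action Permit `
            -RemoteSubnet "10.0.0.0/8" -Description "Allow corp subnet"
        Get-AzureVM -ServiceName "mysvc" -Name "myvm" |
            Set-AzureEndpoint -Name "web" -Protocol tcp -LocalPort 80 -PublicPort 80 -ACL $acl |
            Update-AzureVM

        # 2. Enable Remote Desktop on a Cloud Service
        $cred = Get-Credential   # the account Remote Desktop will log in with
        Set-AzureServiceRemoteDesktopExtension -ServiceName "mysvc" -Credential $cred

        # 3. Enable Web Site application diagnostics: file system, then table storage
        Enable-AzureWebsiteApplicationDiagnostic -Name "mysite" -File -LogLevel Verbose
        Enable-AzureWebsiteApplicationDiagnostic -Name "mysite" -Storage `
            -LogLevel Information -StorageAccountName "mystorageaccount"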

    Read the article

  • Cloud 9 and Grunt.js

    - by Michael Ryan Soileau
    I'm running grunt.js on Cloud9. Most everything is working correctly, except when I try to set this option:

        watch: { options: { livereload: true } },

    If I add that, the terminal states: Fatal error: listen EACCES. I'm guessing I need to use the sudo command to run that, and since c9 doesn't let you run sudo, the command fails. But why is livereload a feature that requires permission? And is there any way around it?
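    A hedged workaround: grunt-contrib-watch lets livereload be a port number instead of true, and Cloud9 workspaces (at the time) only allowed processes to listen on the single port exposed via process.env.PORT, so the EACCES is the workspace blocking livereload's default port (35729), not something sudo would fix. A minimal Gruntfile sketch, with the file patterns as placeholders:

        module.exports = function (grunt) {
          grunt.initConfig({
            watch: {
              all: {
                files: ['**/*.js', '**/*.css'],
                options: {
                  // bind livereload to the one port Cloud9 allows
                  // (note: this can clash with a web server in the same workspace)
                  livereload: parseInt(process.env.PORT, 10) || 8080
                }
              }
            }
          });
          grunt.loadNpmTasks('grunt-contrib-watch');
        };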

    Read the article

  • PHP ODBC MDB Access on a Cloud Server

    - by Senica Gonzalez
    Hey! Hopefully a quick question... I have a .MDB file stored on my webserver and I'm trying to connect to it. I have no way of "registering" it with a name in ODBC. Is the only way to connect to it by specifying the absolute path of the .mdb file?

        $mdbFilename = "./db/Scora.mdb";
        $connection = odbc_connect("Driver={Microsoft Access Driver (*.mdb)};Dbq=$mdbFilename","","");
        if (!$connection) {
            echo "Couldn't make a connection!";
        }
        $sql = "SELECT ID FROM ScoraRegistrations";
        $sql_result = odbc_prepare($connection,$sql);
        odbc_execute($sql_result);
        odbc_result_all($sql_result,"border=1");
        odbc_free_result($sql_result);
        odbc_close($connection);

    It never connects. Any thoughts?
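    Two hedged things to check, sketched below: the Access ODBC driver wants a full filesystem path in Dbq= (a relative ./ path resolves against the PHP process's working directory, which often isn't the script's directory), and that driver only exists on Windows, so on a Linux cloud server odbc_connect() will fail no matter what path you pass. The realpath() call and error reporting here are illustrative:

        <?php
        // resolve the relative path to an absolute one before handing it to ODBC
        $mdbFilename = realpath("./db/Scora.mdb");   // e.g. C:\inetpub\site\db\Scora.mdb
        $connection = odbc_connect(
            "Driver={Microsoft Access Driver (*.mdb)};Dbq=$mdbFilename", "", "");
        if (!$connection) {
            // surface the driver's actual complaint instead of a generic message
            die("Couldn't make a connection: " . odbc_errormsg());
        }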

    Read the article

  • Android GPS cloud of confusion!

    - by Anthony Forloney
    I am trying to design my first Android application with the use of GPS. As of right now, I have a drawable button that, when clicked, alerts a Toast message of the longitude and latitude. I have tried to use telnet localhost 5554 and then geo fix #number #number to feed in values, but no results display, just 0 0. I have also tried DDMS's way of sending GPS coordinates and I get the same thing. My question is: what exactly is the code equivalent of the geo fix and DDMS ways of sending coordinates? I have used Location, LocationManager and LocationListener, but I am not sure which is the right choice. Could anyone explain the code equivalent to me, just so I can get a better understanding of how to fix my application? Code is given, just in case the error exists in the code:

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            Button button = (Button) findViewById(R.id.track);
            button.setOnClickListener(this);
            LocationManager location = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
            Location loc = location.getLastKnownLocation(location.GPS_PROVIDER);
            updateWithNewLocation(loc);
        }

        private final LocationListener locationListener = new LocationListener() {
            public void onLocationChanged(Location location) {
                updateWithNewLocation(location);
            }
        };

        private void updateWithNewLocation(Location l) {
            longitude = l.getLongitude();
            latitude = l.getLatitude();
            provider = l.getProvider();
        }

        public void onClick(View v) {
            Toast.makeText(this, "Your location is " + longitude + " and " + latitude + " provided by: " + provider, Toast.LENGTH_SHORT).show();
        }
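    A hedged observation about the code above: the LocationListener is created but never registered, so nothing ever delivers the geo fix / DDMS coordinates to it, and getLastKnownLocation() just returns whatever stale fix existed at startup (often 0/0 on a fresh emulator). The code equivalent of "start receiving those updates" is requestLocationUpdates(), e.g. in onCreate(); the time/distance values below are illustrative:

        LocationManager manager =
                (LocationManager) getSystemService(Context.LOCATION_SERVICE);
        manager.requestLocationUpdates(
                LocationManager.GPS_PROVIDER, // the provider geo fix / DDMS feeds
                1000,                         // minimum ms between updates
                1,                            // minimum meters between updates
                locationListener);            // the listener defined above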

    Read the article

  • Mail in the cloud? (web service for email)

    - by bambax
    I have multiple websites that need to send mail (after successful signup, etc.); I maintain a Postfix instance just for this (it never receives any incoming mail). This Postfix instance is a pain: it is sometimes broken, is regularly attacked, produces enormous amounts of logs, etc. These problems could certainly be dealt with by an experienced Postfix admin, but that's not me. Is there a service that would be just like Twilio, but for email? IE: a web service for sending mail.

    Read the article

  • Can't seem to sign AWS Cloud Front URL properly

    - by Joe Corkery
    Hi everybody, I found a lot of detailed examples online on how to sign an Amazon CloudFront URL for private content. Unfortunately, whenever I implement these examples my URL doesn't seem to work. The resource path is correct because I can download the file when it is set for world read, but the URL doesn't work when set just for authorized users. The PHP code I am using is below. If anybody has any insights as to what I might be doing wrong (I'm guessing it is something obvious that I am just not seeing right now), it would be greatly appreciated.

        function urlCloudFront($resource) {
            $AWS_CF_KEY = 'APKA...';
            $priv_key = file_get_contents(path_to_pem_file);
            $pkeyid = openssl_get_privatekey($priv_key);
            $expires = strtotime("+ 3 hours");
            $policy_str = '{"Statement":[{"Resource":"'.$resource.'","Condition":{"DateLessThan":{"AWS:EpochTime":'.$expires.'}}}]}';
            $policy_str = trim( preg_replace( '/\s+/', '', $policy_str ) );
            $res = openssl_sign($policy_str, $signature, $pkeyid, OPENSSL_ALGO_SHA1);
            $signature_base64 = (base64_encode($signature));
            $repl = array('+' => '-','=' => '_','/' => '~');
            $signature_base64 = strtr($signature_base64,$repl);
            $url = $resource . '?Expires=' .$expires. '&Signature=' . $signature_base64 . '&Key-Pair-Id='. $AWS_CF_KEY;
            print '<p><A href="' .$url. '">Download VIDA (CloudFrount)</A>';
        }

        urlCloudFront("http://mydistcloud.cloudfront.net/mydir/myfile.tar.gz");

    Thanks.

    Read the article

  • Custom Online Backup Solution Advice

    - by Martín Marconcini
    I have to implement a way for our customers to back up their SQL 2000/2005/2008 databases online. The application they use is a C#/.NET 3.5 Winforms application that connects to a SQL Server (can be 2000/2005/2008, sometimes Express editions). The SQL Server is on the same LAN. Our application has a very specific UI and we must code each form following those guidelines. There's lots of GDI+ to give it the look and feel we want. For that reason, using a 3rd party application is not a very good idea. We need to charge the customer on a monthly/annual basis for the service. Preferably, the customer doesn't need to care about bandwidth and storage space. It must be transparent. Given the above requirements, my first thoughts are:

    Solution 1: Code some sort of basic FTP functionality with a behind-the-scenes SQL backup mechanism, then hire a hosting service and compress-transfer the .BAK to the hosting (a sketch of the backup-and-compress step follows this post). Maintain a series of folders (for each customer). They won't see what's happening. They will just see a list of their files and a big "Backup now" button that will perform the SQL backup, compress it and upload it (and update the file list) ;)
    Pros: Not very complicated to implement, simple to use, fairly simple to configure (could have a dedicated ftp user/pass).
    Cons: Finding an "FTP only" hosting plan is probably not going to be easy; they usually come with a bunch of stuff. FTP is not always the best protocol. More?

    Solution 2: Similar to 1, but instead of FTP, find a cloud computing service like Amazon S3, Mosso or similar.
    Pros: Cloud storage is fast, reliable, etc. It's kind of easy to implement (specially if there are APIs like AWS or Mosso).
    Cons: I have been unable to come up with a service optimized for resellers where I can have multiple sub-accounts (one for each customer). Billing is going to be a nightmare because these services bill per GB, and with one account it's impossible to differentiate each customer.

    Solution 3: Similar to 2, but letting the user create their own account on Amazon S3 (for example).
    Pros: You forget about billing and such.
    Cons: A mess for the customer, who has to open the Amazon (or whatever) account and will be charged by them, not by you. You can't really charge the customer (since you're just not doing anything).

    Solution 4: Use one of the many online backup solutions built on cloud storage tech.
    Pros: Many of these include SQL Server backup, and a lot of features that we'd otherwise have to implement. Plus web access and stuff like that will come included.
    Cons: Still have the billing problem described in number 2. Few of these companies (if any) offer "reseller" accounts. You have to eventually use their software (some offer certain branding).

    Any better approach?

    Summary: You have a software product (.NET Winapp). You want your users to be able to back up their SQL Server databases online (and be able to retrieve the backups if needed). You ideally would like to charge the customer for this service (i.e. XX € a year).
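    As referenced in Solution 1, a minimal C#/.NET 3.5 sketch of the backup-and-compress step. The connection string, database name and path are placeholders, and note that BACKUP DATABASE writes the .BAK on the SQL Server machine, so the app needs to reach that path (e.g. via a share) before compressing and uploading:

        using System.Data.SqlClient;
        using System.IO;
        using System.IO.Compression;

        static void BackupAndCompress(string connStr, string dbName, string bakPath)
        {
            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();
                // plain T-SQL works across SQL Server 2000/2005/2008, Express included
                var cmd = new SqlCommand(
                    "BACKUP DATABASE [" + dbName + "] TO DISK = @path WITH INIT", conn);
                cmd.Parameters.AddWithValue("@path", bakPath);
                cmd.CommandTimeout = 0;   // backups can run long
                cmd.ExecuteNonQuery();
            }
            // gzip the .BAK before the FTP/S3/Mosso upload
            using (var input = File.OpenRead(bakPath))
            using (var output = File.Create(bakPath + ".gz"))
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                var buffer = new byte[81920];
                int read;
                while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                    gzip.Write(buffer, 0, read);
            }
        }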

    Read the article

  • Windows Azure Table Storage Error when in the cloud

    - by Dan Jones
    Hey, I downloaded the simple table storage sample from the Cloudy in Seattle blog. It works perfectly when aimed at local storage, but when I change it to point to Azure storage I get the following error. Screenshot (on SkyDrive): http://cid-00341536d0f91b53.skydrive.live.com/self.aspx/.Public/error.png Has anyone seen this before? Thanks guys

    Read the article

  • Cloud HUGE data storage options?

    - by ToughPal
    Hi, does anyone have a good suggestion on how to do video recording? We have a camera that can record and then stream live video to a server. This means we could have 1000s of cameras sending data 24x7 for recording. We will store data for 7 / 14 / 30 days depending on the package. If a camera is sending data to the server, it will store 1.5GB per day. So that means traffic of 1.5GB / day / camera, or 45GB / month / camera in total (data plus bandwidth for one camera). Please let me know the most cost-effective way to get this data stored. Thanks!

    Read the article

  • Starting socket server in ruby on rails on cloud environments (heroku)

    - by ElTren
    Hi, I'm using Heroku, and I can push a Ruby on Rails app just fine. I'm trying to convert this to a socket server; basically I would need to bind to an open port, and in this case I know Heroku only does 80, 22 and 443. Is it possible to bind to port 80 on those environments? Also, how would I set up the entry point for this socket server? All I know is that script/server boots up the app when run. Do I have to put the function call there? How can a socket server start instead of the Rails app on top of whatever webserver Heroku has?
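    For what it's worth, a minimal Ruby sketch of a raw socket server that binds the way Heroku expects, with the big caveat that Heroku's routing (at the time) only forwarded HTTP traffic to the single port it injects via ENV['PORT'], so a hand-picked port 80 bind or a non-HTTP protocol won't be reachable from outside. The entry point (e.g. a Procfile-style line like web: ruby server.rb) is an assumption for stacks that allow one:

        require 'socket'

        port = (ENV['PORT'] || 3000).to_i   # Heroku assigns this; 3000 is a local fallback
        server = TCPServer.new('0.0.0.0', port)

        loop do
          client = server.accept            # single-threaded for brevity
          client.puts 'hello from the socket server'
          client.close
        end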

    Read the article

  • Installing Plugins from Cloud p2 repository in Eclipse IDE

    - by user1495036
    I have been recently reading a lot about p2 for a requirement of mine. Most of the p2 documentation online points to p2 for RCP. My requirement is for a plugin repo. I have a plugin that is used within the Eclipse IDE. I don't want to change the repo location, but based on the Eclipse version, when the user looks for Install New Software or Check for Updates it needs to download the respective plugins. My repo currently contains all the plugins for all the versions, but I need to give a different URL to each user based on the version. For example, I am using Eclipse 3.7 (Indigo). I install the plugin through Install New Software by adding the p2 repo URL. Now the user decides, for some requirement, to move to Eclipse 3.6. I want him to connect to the same p2 repo URL and download the plugins created for Eclipse 3.6. This is definitely possible using p2 Discovery, or I could categorize the downloads using a composite repository, but I don't want to do either of these. I just want to know: is there any API that I can hook into, so that before processing the URL and finding the updates, I can check the version of Eclipse and redirect it based on the version to an internal URL? This is possible in RCP; I want to know if I can do it in the Eclipse p2 UI. All the p2 UI looks to be internal classes. Any directives would be appreciated. Malai

    Read the article

  • Google Cloud Messaging

    - by atar
    I downloaded the GCM project from git and I was able to start the server (App Engine for Java) and also compile the client APK on the device. On startup the device successfully registers with the server and I can see the registration id when I go to http://myserverip:8080/_ah/admin/. The problem is that when I hit the 'Send Message' button there is a message 'Single message queued for registration id ' and it doesn't do anything beyond that. Below is a screenshot of the error I noticed on the terminal. Am I missing something? Any help is appreciated. Thanks.
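    A hedged way to take the demo UI out of the equation: send one message directly with the gcm-server library (com.google.android.gcm.server) and inspect the Result, since the demo's 'queued' message can silently swallow errors. The API key and registration id below are placeholders:

        import com.google.android.gcm.server.Message;
        import com.google.android.gcm.server.Result;
        import com.google.android.gcm.server.Sender;

        public class GcmSendCheck {
            public static void main(String[] args) throws Exception {
                Sender sender = new Sender("YOUR_SERVER_API_KEY");
                Message message = new Message.Builder()
                        .addData("message", "ping")  // payload the client app reads
                        .build();
                Result result = sender.send(message, "DEVICE_REGISTRATION_ID", 5); // 5 retries
                // a null error name means success; values like MismatchSenderId or
                // InvalidRegistration name the actual problem
                System.out.println(result.getMessageId() + " / " + result.getErrorCodeName());
            }
        }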

    Read the article

  • Why is IaaS important in Azure…

    - by Steve Loethen
    Three weeks ago, Microsoft released the next phase of Azure. I have had several clients waiting on this release; they have been waiting, and are now more receptive to looking at the cloud. Customers expressed fear of the unknown, and a fear of lack of control, even when giving up that control also means a huge degree of flexibility to innovate without having to worry about the underlying infrastructure. I think IaaS will be that "gateway drug" to get customers who have been hesitant to take another look at the cloud. The dialog can change from the cloud being this big scary unknown to a resource for workloads. The conversations should have always been, and can now be even more strongly, geared toward the following points:

    1) The cloud is not unicorns and glitter; the cloud is resources. Compute, storage, DBs, service bus, cache... Like many of the resources we have on-premise. Not magic, just another resource with advantages and obstacles like any other resource.

    2) The cloud should be part of the conversation for any new project. All of the same criteria should be applied, on-premise or off: cost, security, reliability, scalability, speed to deploy, cost of licenses, need to customize the image, complex workloads. We have been having these discussions for years when we talk about on-premise projects. We make decisions on OSs, databases, ESBs, configuration and products based on a myriad of factors. We use the same factors, but now we have an additional set of resources to consider in our process.

    3) The cloud is a great solution looking for some interesting problems. It is our job to recognize the right problems that fit into the cloud, weigh the factors, and decide what to do.

    IaaS makes this discussion easier, offers more choices, and often choices that many enterprises will find a better fit than PaaS. Looking forward to helping clients realize the power of the cloud.

    Read the article

  • Install and upgrade strategies for Oracle Enterprise Manager 12c - Upcoming Webcasts with live demos

    - by Anand Akela
    At Oracle Open World 2011, we launched Oracle Enterprise Manager 12c, the only complete cloud management solution for your enterprise cloud. With the new release of Oracle Enterprise Manager Cloud Control 12c, the installation and upgrade process has been enhanced to provide a fast and smooth install experience. In the upcoming webcasts, Oracle Enterprise Manager experts will discuss the installation and upgrade strategies for Oracle Enterprise Manager Cloud Control 12c. These webcasts will include live demonstrations of the install and upgrade processes.

    In the webcast on November 17th, we will cover the installation steps and provide recommendations to set up a new Oracle Enterprise Manager Cloud Control 12c environment. We'll also provide a live demonstration of the complete installation process.

    Upgrading your Oracle Enterprise Manager environment can be a challenging and complex task, especially with large environments consisting of hundreds or thousands of targets. In the webcast on November 18th, we'll describe key facts that administrators must know before upgrading their Enterprise Manager system, as well as introduce the different approaches for an upgrade. We'll also walk you through the key steps for upgrading an existing Enterprise Manager 11g (or 10g) Grid Control to Oracle Enterprise Manager Cloud Control 12c.

    In addition to the live webcasts on the Oracle Enterprise Manager Cloud Control 12c install and upgrade processes, please consider attending the replay of the Oracle Enterprise Manager Ops Center webcast with live Q&A.

    Schedule and registration links of upcoming webcasts:
    - Oracle Enterprise Manager Ops Center: Global Systems Management Made Easy (Replay) -- November 17, 10 a.m. PT; December 1, 10 a.m. PT
    - Oracle Enterprise Manager Cloud Control 12c Installation Overview -- November 17, 8 a.m. PT
    - Upgrade Smoothly to Oracle Enterprise Manager Cloud Control 12c -- November 18, 8 a.m. PT

    For more information, please go to the Oracle Enterprise Manager web page or follow us at: Twitter | Facebook | YouTube | Linkedin

    Read the article

  • Oracle Enterprise Manager content at Collaborate 12 - the only user-driven and user-run Oracle conference

    - by Anand Akela
    From April 22-26, 2012, Oracle takes Las Vegas. Thousands of Oracle professionals will descend upon the Mandalay Bay Convention Center for a week's worth of education sessions, networking opportunities and more, at the only user-driven and user-run Oracle conference - COLLABORATE 12. This is one of the best opportunities for you to learn more about Oracle technology, including Oracle Enterprise Manager. Here is a summary of an impressive line-up of Oracle Enterprise Manager related content at COLLABORATE 12.

    Customer Presentations
    - Stability in Real World with SQL Plan Management
    - Upgrading to Oracle Enterprise Manager 12c - Best Practices
    - Making OEM Sing and Dance with EMCLI
    - Oracle Real Application Testing: A look under the hood
    - Optimizing Oracle E-Business Suite on Exadata
    - Experiences with OracleVM 3 and Grid Control in an Oracle BIEE environment
    - Right Cloud -- How to Avoid the False Cloud by using Oracle Technologies
    - Forgetting something? Standarize your database monitoring environment with Enterprise Manager 11g
    - Implementing E-Business Suite R12 in a Federal Cloud - Lessons Learned
    - Cloud Computing Boot Camp: New DBA Features in Oracle Enterprise Manager Cloud Control 12c
    - Oracle Enterprise Manager 12c, Whats Changed, Whats New?
    - Monitoring a WebCenter Content Deployment with Enterprise Manager
    - Enterprise Manager 12c Cloud Control: New Features and Best Practices (for IOUG registrants only)

    Oracle Presentations
    - Roadmap Session: Total Cloud Control with Oracle Enterprise Manager 12c
    - Real World Performance (complimentary for IOUG registrants only)
    - Database-as-a-Service: Enterprise Cloud in Three Simple Steps
    - Bullet-proof Your Enterprise, SOA & Cloud Investments Using Oracle Enterprise Gateway
    - What’s New for Oracle WebLogic Management: Capabilities that Scripting Cannot Provide
    - Exadata Boot Camp: Complete Oracle Exadata Management with Oracle Enterprise Manager

    Stay connected with Oracle Enterprise Manager: Twitter | Facebook | YouTube | Linkedin | Newsletter

    Read the article

  • Guest Blog: Secure your applications based on your business model, not your application architecture, by Yaldah Hakim

    - by Darin Pendergraft
    Today's businesses are looking for new ways to engage their customers and embrace mobile applications, while staying in compliance, improving security and driving down costs. For many, the solution to that problem is to host their applications with a Cloud Services provider, but concerns that a hosted application will be less secure continue to cause doubt. Oracle is recognized by Gartner as a leader in the User Provisioning and Identity and Access Governance magic quadrants, and has helped thousands of companies worldwide to secure their enterprise applications and identities. Now those same world class IDM capabilities are available as a managed service, both for enterprise applications as well as Oracle hosted applications.

    --- Listen to our IDM in the cloud podcast to hear Yvonne Wilson, Director of the IDM Practice in Cloud Service, explain how Oracle Managed Services provides IDM as a service ---

    Selecting Oracle Managed Cloud Services to deploy and manage Oracle Identity Management Services is a smart business decision for a variety of reasons. Oracle hosted Identity Management infrastructure is deployed securely, resilient to failures, and supported by Oracle experts. In addition, Oracle Managed Cloud Services monitors customer solutions from several perspectives to ensure they continue to work smoothly over time. Customers gain the benefit of Oracle Identity Management expertise to achieve predictable and effective results for their organization.

    Customers can select Oracle to host and manage any number of Oracle IDM products as a service, as well as other Oracle security products, providing a flexible, cost effective alternative to onsite hardware and software costs.

    Security is a major concern for all organizations, making it increasingly important to partner with a company like Oracle to ensure consistency and a layered approach to security and compliance when selecting a cloud provider. Oracle Cloud Service makes this possible for our customers by taking away the headache and complexity of managing Identity Management infrastructure and other security solutions.

    For more information: http://www.oracle.com/us/solutions/cloud/managed-cloud-services/overview/index.html
    Twitter: https://twitter.com/OracleCloudZone
    Facebook: http://www.facebook.com/OracleCloudComputing

    Read the article

  • SAPPHIRE 2012: "80% of our clients are SMEs," SAP discusses the evolution of its Cloud solutions to meet their needs

    SAPPHIRE 2012: "80% of our clients are SMEs." SAP discusses its Cloud solutions and how they are evolving to meet those clients' needs. SAP does not have the image of a vendor that caters to small businesses. And yet 80% of its clients are SMEs. The figure comes from Chris Horak, vice president in charge of Cloud solutions, in an interview with Developpez.com at SAPPHIRE NOW 2012. It is true that the SME category covers a wide range of organizations, from small to very large. A solution like Business One, which today has 30,000 customers, does however target the smallest organizations. Suited to companies with between...

    Read the article

  • Oracle announces Oracle Cloud Office and Oracle Open Office 3.3 to compete with Google Docs and Office Web Apps

    Oracle announces Oracle Cloud Office and Oracle Open Office 3.3, to compete with Google Docs and Office Web Apps. Oracle has just announced the arrival of Oracle Cloud Office and Open Office 3.3, its two complete, open-standards-based office productivity suites for the desktop, the Web and mobile devices. Built on the ODF (Open Document Format) and open Web standards, Oracle Office lets users share files from any system. The suite is "at the same time compatible with legacy Microsoft Office documents and the most modern Web 2.0 publishing systems."

    Read the article
