Search Results

Search found 9847 results on 394 pages for 'cloud backup'.


  • Postfix issues sending mail to addresses under a domain hosted on the same server

    - by iamthewit
    I recently installed Virtualmin on my nice shiny new Rackspace cloud server. Everything went seamlessly, but I've been having some issues getting emails to send properly. The problem seems to be that the server cannot send mail to email addresses whose domain is hosted on my server. For example, on my server I run multiple virtual domains; let's call this one test.com. When I run the mail command from the shell (mail [email protected]) I get the following back from my maillog:

        Oct 6 14:55:18 test postfix/pickup[8737]: DC1131612CC: uid=0 from=
        Oct 6 14:55:18 test postfix/cleanup[8769]: DC1131612CC: [email protected]
        Oct 6 14:55:18 test postfix/qmgr[8738]: DC1131612CC: [email protected], size=353, nrcpt=1 (queue active)
        Oct 6 14:55:18 test postfix/error[8771]: DC1131612CC: [email protected], relay=none, delay=0, delays=0/0/0/0, dsn=5.0.0, status=bounced (User unknown in virtual alias table)
        Oct 6 14:55:18 test postfix/cleanup[8769]: DD07D1612D1: [email protected]
        Oct 6 14:55:18 test postfix/bounce[8772]: DC1131612CC: sender non-delivery notification: DD07D1612D1
        Oct 6 14:55:18 test postfix/qmgr[8738]: DD07D1612D1: from=<, size=2268, nrcpt=1 (queue active)
        Oct 6 14:55:18 test postfix/qmgr[8738]: DC1131612CC: removed
        Oct 6 14:55:18 test postfix/local[8773]: DD07D1612D1: [email protected], relay=local, delay=0.03, delays=0/0/0/0.03, dsn=2.0.0, status=sent (delivered to command: /usr/bin/procmail-wrapper -o -a $DOMAIN -d $LOGNAME)
        Oct 6 14:55:18 test postfix/qmgr[8738]: DD07D1612D1: removed

    When I run mail [email protected] the message is sent and received perfectly fine. I'm a bit of a noob when it comes to servers, but I pick things up fairly quickly, so please excuse any incorrect terminology and my general noobiness. Any help would be greatly appreciated; I've been googling for quite a while but I haven't found a solution yet. I'll add a copy of my main.cf file in a response below. Cheers, guys. Here is the reformatted postconf; do you want the reformatted main.cf file too, or is this enough?

        alias_database = hash:/etc/postfix/aliases
        alias_maps = hash:/etc/postfix/aliases
        broken_sasl_auth_clients = yes
        command_directory = /usr/sbin
        config_directory = /etc/postfix
        daemon_directory = /usr/libexec/postfix
        debug_peer_level = 2
        home_mailbox = Maildir/
        html_directory = no
        mailbox_command = /usr/bin/procmail-wrapper -o -a $DOMAIN -d $LOGNAME
        mailq_path = /usr/bin/mailq.postfix
        manpage_directory = /usr/share/man
        myhostname = server.test.com
        newaliases_path = /usr/bin/newaliases.postfix
        readme_directory = /usr/share/doc/postfix-2.3.3/README_FILES
        sample_directory = /usr/share/doc/postfix-2.3.3/samples
        sender_bcc_maps = hash:/etc/postfix/bcc
        sendmail_path = /usr/sbin/sendmail.postfix
        setgid_group = postdrop
        smtpd_recipient_restrictions = permit_mynetworks permit_sasl_authenticated reject_unauth_destination
        smtpd_sasl_auth_enable = yes
        smtpd_sasl_security_options = noanonymous
        unknown_local_recipient_reject_code = 550
        virtual_alias_maps = hash:/etc/postfix/virtual
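    The bounce reason ("User unknown in virtual alias table") usually means the recipient address is simply missing from the map behind virtual_alias_maps. A minimal sketch of the likely fix, with a hypothetical mailbox name (Virtualmin normally maintains these entries itself, so check what it generated first):

        # /etc/postfix/virtual -- the right-hand side must be a real local
        # user or alias on this box
        [email protected]    someuser.test

        # rebuild the hash map and reload Postfix
        postmap /etc/postfix/virtual
        postfix reload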

    Read the article

  • Restore Files from Backups on Windows Home Server

    - by Mysticgeek
    If you use Windows Home Server to back up the machines on your network, you're in luck if you accidentally delete important files or they become corrupted. Today we take a look at getting your data back from backups on your home server.

    Open Windows Home Server Console and select the Computers and Backup tab. Right-click on the computer you need to restore files for and select View Backups. This will open a list of your recent backups. Highlight the one you want to open, then click the Open button in the Restore or View Files section.

    If this is the first time you're restoring a file, you'll be asked to verify installation of the device software. Check the box next to Always trust software from Microsoft Corporation and click Install. Now wait while the backup data is retrieved.

    After the backup data has been retrieved, an Explorer window opens showing drive (Z:), which contains the backup data. It's just as if you were opening a drive on your local machine. Now you can browse through the backup and find the files you're missing. You can open the files directly, or drag them onto your machine to the location where you want to restore them.

    Restoring your data is actually a very easy process with Windows Home Server. Of course, you'll want to make sure the computers on your network are being backed up to WHS. If you need help with that, check out our article on how to configure your computer to back up to WHS. If you want to back up your home server shares, check out our article on how to back up WHS folders to an external drive.

    Read the article

  • links for 2011-01-06

    - by Bob Rhubart
    Coming to your town: Oracle Enterprise Cloud Summit
    During these full-day events, cloud experts will share real-world best practices, reference architectures, detailed customer case studies, and more. Events scheduled in cities around the world. (tags: oracle otn cloud event)

    Webcast: Security and Compliance for Private Cloud Consolidation
    Roxana Bradescu, Senior Director for Oracle Database Security Products, discusses Oracle Database Security Solutions to securely consolidate data and meet compliance requirements within private cloud computing environments. Thursday, January 13, 2011. 10am PST | 1pm EST (tags: oracle cloud security)

    Answering Questions about Mobile Devices | The AppsLab
    "How do the numbers of Android and iOS users compare? How often are people switching? Where are all these BlackBerry and Nokia users? Do they plan to jump to Android or iOS? What about webOS? Is it relevant?" Some answers in this AppsLab survey. (tags: oracle otn enterprise2.0 mobilecomputing iphone blackberry android)

    Webcast: Achieve 24/7 Cloud Availability Without Expensive Redundancy
    Ashish Ray and Matthew Baier discuss Oracle's Maximum Availability Architecture and Oracle Database 11g. (tags: oracle cloud highavailability webcast)

    Converting a PV vm back into an HVM vm (Wim Coekaerts Blog)
    "I wanted to convert one of my VMs that was based on a paravirt kernel into a vm that just boots as a regular hardware virt VM with a standard x86-64 kernel... It took me a little while to figure out the fastest way, so now that I have it pretty much down I wanted to share the steps." - Wim Coekaerts (tags: oracle otn virtualization oraclevm)

    @OTN_Garage: Resources for VirtualBox 4.0
    Rick "@OTN_Garage" Ramsey shares links to several resources for those with a VirtualBox jones. (tags: oracle otn virtualization virtualbox)

    'Federal Service Bus' Helps Belgian Government Speak a Common Language - SOA in Action Blog
    "The first SOA-enabled application was developed in less than two months and was fully operational in approximately 10 weeks. In addition, new FSB modules are reusable for other Belgian e-government applications, saving both time and taxpayer dollars." - Joe McKendrick (tags: soa oracle)

    Show Notes: Architects in the Cloud (ArchBeat Podcast)
    The complete 4-part interview with Stephen G. Bennett and Archie Reed, the authors of "Silver Clouds, Dark Linings: A Concise Guide to Cloud Computing," is now available. (tags: oracle otn cloud podcast archbeat)

    Read the article

  • BizTalk 2009 - SQL Server Job Configuration

    - by StuartBrierley
    Following the installation of BizTalk Server 2009 on my development laptop, I used the BizTalk Server Best Practices Analyser, which highlighted the fact that two of the SQL Server Agent jobs that BizTalk relies on were not running successfully. Upon investigation it turned out that these jobs needed to be configured before they would run successfully. To configure these jobs, open SQL Server Management Studio, expand SQL Server Agent > Jobs and double-click on the appropriate job. Select Steps and then edit the appropriate entries.

    Backup BizTalk Server (BizTalkMgmtDb)

    This job consists of three steps: BackupFull, MarkAndBackupLog and ClearBackupHistory.

    BackupFull:

        exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */

    The frequency here is set/left as daily and the name is left as BTS. You must provide a full destination path for the backup files to be stored. There are also two optional parameters: a flag that controls whether the job forces a full backup if a partial backup fails, and a parameter to control the time of day to run the full backup (the default is midnight UTC). For example:

        exec [dbo].[sp_BackupAllFull_Schedule] 'd' /* Frequency */, 'BTS' /* Name */, '<destination path>' /* location of backup files */, 0, 22

    MarkAndBackupLog:

        exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */

    You must provide a destination path for the log backups. Optionally, you can also add an extra parameter that tells the procedure to use local time:

        exec [dbo].[sp_MarkAll] 'BTS' /* Log mark name */, '<destination path>' /* location of backup files */, 1

    ClearBackupHistory:

        exec [dbo].[sp_DeleteBackupHistory] @DaysToKeep=7

    This will clear out the instances in the MarkLog table older than 7 days.

    DTA Purge and Archive (BizTalkDTADb)

    This job consists of a single step, Archive and Purge:

        exec dtasp_BackupAndPurgeTrackingDatabase
            0,    --@nLiveHours tinyint,
            1,    --@nLiveDays tinyint = 0,
            30,   --@nHardDeleteDays tinyint = 0,
            null, --@nvcFolder nvarchar(1024) = null,
            null, --@nvcValidatingServer sysname = null,
            0     --@fForceBackup int = 0

    Any completed instance that is older than the live days plus live hours will be deleted, as will any associated data. Any data older than HardDeleteDays will be deleted; this means that those long-running orchestration instances that would otherwise never be purged will at some point have their data cleared down while allowing the instance to continue, thus preventing the DTA database from growing indefinitely. This value should always be greater than the soft purge window. The @nvcFolder parameter is the path for the backup files; if it is null the job will not run, failing with the error:

        DTA Purge and Archive (BizTalkDTADb) Job failed
        SQL Server Management Studio, job activity monitor, view history
        The @nvcFolder parameter cannot be null. Archive and Purge step

    How long you choose to keep instances in the Tracking Database is really up to you. For development I have set this up as:

        exec dtasp_BackupAndPurgeTrackingDatabase 0, 1, 30, '<destination path>', null, 0

    On a live server you may want to adjust these figures:

        exec dtasp_BackupAndPurgeTrackingDatabase 0, 15, 20, '<destination path>', null, 0

    Read the article

  • Backup VM : Copy virtual disk xxx.vmdk The virtual disk is either corrupted or not a supported format

    - by boiteavinc
    Hi, I'm French. I'm a student and I use ESXi 4 for my studies. When I try to back up my VMs with ghettoVCBg2.pl and the vSphere Management Assistant, I get the following error message in the vSphere Client: "Copy virtual disk xxx.vmdk. The virtual disk is either corrupted or not a supported format". I have no error in the VCB log:

        03-23-2010 08:31:56 -- debug: Main: Login by vi-fastpass to: esxi
        03-23-2010 08:31:56 -- debug: copyTask: Task START
        03-23-2010 08:31:56 -- debug: copyTask: waiting for next job and sleep ...
        03-23-2010 08:32:00 -- info: Initiate backup for AD_DNS_DHCP found on esxi
        03-23-2010 08:32:09 -- debug: AD_DNS_DHCP original powerState: poweredOn
        03-23-2010 08:32:09 -- debug: Creating Snapshot "ghettoVCBg2-snapshot-2010-03-23" for AD_DNS_DHCP
        03-23-2010 08:33:19 -- info: AD_DNS_DHCP has 1 VMDK(s)
        03-23-2010 08:33:19 -- debug: backupVMDK: Backing up "Raptor1 AD_DNS_DHCP/AD_DNS_DHCP.vmdk" to "Backup_VM VM/AD_DNS_DHCP/AD_DNS_DHCP-2010-03-23/$
        03-23-2010 08:33:19 -- debug: backupVMDK: Signal copyThread to start
        03-23-2010 08:33:19 -- debug: backupVMDK: Backup progress: Elapsed time 0 min
        03-23-2010 08:33:19 -- debug: copyTask: Wake up and follow the white rabbit, with status: doCopy
        03-23-2010 08:33:19 -- debug: CopyThread: Start backing up VMDK(s) ...
        03-23-2010 08:33:25 -- debug: copyTask: send copySuccess message ...
        03-23-2010 08:33:25 -- debug: copyTask: waiting for next job and sleep ...
        03-23-2010 08:34:20 -- debug: backupVMDK: Successfully completed backup for Raptor1 AD_DNS_DHCP/AD_DNS_DHCP.vmdk Elapsed time: 1 min
        03-23-2010 08:34:22 -- debug: Removing Snapshot "ghettoVCBg2-snapshot-2010-03-23" for AD_DNS_DHCP
        03-23-2010 08:34:24 -- debug: checkVMBackupRotation: Starting ...
        03-23-2010 08:34:26 -- debug: Purging Backup_VM VM/AD_DNS_DHCP/AD_DNS_DHCP-2010-03-23--1 due to rotation max
        03-23-2010 08:34:28 -- info: Backup completed for AD_DNS_DHCP!
        03-23-2010 08:34:28 -- debug: Main: Disconnect from: esxi
        03-23-2010 08:34:28 -- debug: Main: Calling final clean up
        03-23-2010 08:34:28 -- debug: cleanUP: Thread clean up starting ...
        03-23-2010 08:34:28 -- debug: cleanUp: Send exit to copyThread
        03-23-2010 08:34:28 -- debug: copyTask: Wake up and follow the white rabbit, with status: exit
        03-23-2010 08:34:28 -- debug: copyTask: die ...
        03-23-2010 08:34:28 -- debug: cleanUp: Join passed
        03-23-2010 08:34:28 -- info: ============================== ghettoVCBg2 LOG END ==============================

    My ghetto conf file is:

        VM_BACKUP_DATASTORE = "Backup_VM"
        VM_BACKUP_DIRECTORY = "VM"
        VM_BACKUP_ROTATION_COUNT = "3"
        DISK_BACKUP_FORMAT = "thin"
        ADAPTER_FORMAT = "lsilogic"
        POWER_VM_DOWN_BEFORE_BACKUP = "0"
        VM_SNAPSHOT_MEMORY = "1"
        VM_SNAPSHOT_QUIESCE = "1"
        LOG_LEVEL = "info"
        VM_VMDK_FILES = "all"

    I tried several DISK_BACKUP_FORMAT values and I get the same error. Despite the error, I do get files on the NFS share. But when I try to open the vmx file with VMware Workstation, I get this error:

        Cannot open the disk 'G:\Backup ESX\VM\AD_DNS_DHCP\AD_DNS_DHCP-2010-03-23--1\AD_DNS_DHCP.vmdk' or one of the snapshot disks it depends on.
        Reason: The called function cannot be performed on partial chains. Please open the parent virtual disk.

    I have no snapshots on my VM on ESXi. Can you help me?
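    The "partial chains" message from Workstation typically means it can only see part of a snapshot chain. A hedged sketch of one way to check and collapse the chain on the ESXi host itself (the datastore paths are hypothetical; adjust them to your layout):

        # check the on-disk descriptor/chain for consistency
        vmkfstools -x check "/vmfs/volumes/Raptor1/AD_DNS_DHCP/AD_DNS_DHCP.vmdk"

        # clone the disk into a standalone thin copy, collapsing any snapshot
        # chain, then back up the clone instead
        vmkfstools -i "/vmfs/volumes/Raptor1/AD_DNS_DHCP/AD_DNS_DHCP.vmdk" \
            -d thin "/vmfs/volumes/Backup_VM/AD_DNS_DHCP-clone.vmdk"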

    Read the article

  • Where should I go with hosting my site: VPS, GAE, another option?

    - by Jonathan Hayward
    My website, http://JonathansCorner.com/, began life before 1994 as www.imsa.edu/~jhayward/ and has been through various iterations and improvements to content, HTML, and the like, but remains a literature site that is, from a web administrator's perspective, fairly simple and primitive: a fair amount of static HTML and supporting files, a little bit of CGI and URI rewriting, and .htaccess files providing Expires: headers and the like. An associated site demos various CGI scripts that fall under the category of "and other creations"; the site as a whole has the purpose of sharing my creative works, and so far a fairly rudimentary use of Apache functionality, supported by Unix tools to, for instance, update the RSS feed and the "starting point" link on the home page, has served that purpose fairly well.

    I looked around here on web hosting and found the note on web host recommendations as a good note for "What are some of people's favorite web hosts overall," but I wanted to ask a more focused question of "What are the best web hosts for criteria XYZ:"

    I am looking at a VPS so I will have root, be able to install stuff and edit Apache's config files, etc., running Gentoo or another Linux, BSD, or the like. I would like a system that is secure enough that the host's vulnerabilities are mostly the ones that come along with what I am trying to do: that is, I won't be trying to administer and secure an ancient Linux like some have complained about at 1and1. I would like good uptime/reliability and competent support staff: if the level 1 help desk is going to tell me to go to "My Computer" on a Linux box, I'd like to be able to get past them. Ideally, I would like a site hosted somewhere with low latency for U.S. visitors in particular. I would like a hosting solution from a stable business, one that will probably be around and is unlikely to vanish without warning. With those things specified, I would be interested in knowing what the less expensive options are. (I expect that some of the things I've specified will knock out all of the cheapest options, but I'm still interested in price.)

    With all that stated, I'd like to back up a bit and look at whether I am asking the right question. I am concerned that the above is a very good way of asking, "How can I keep my site in line with the wave of the past?" I am wondering if it might be wiser to adapt my site to newer technologies instead of trying to keep it on older ones. For instance, while I would hardly portray my site as a way to show off the full power of Google App Engine, the main site at least should be a straightforward port if I were to go that route. And beyond Google App Engine, my knowledge of cloud solutions is basic. If it is a better and more future-proof solution to port my site to another kind of solution, I would be interested in knowing where those future-proof solutions lie.

    So I would be interested in wisdom. If the question I asked in detail is still a good question to be asking, what would people suggest? Or if I should seriously consider porting my site to a newer basic option, what should I try there? Any thoughts would be appreciated.

    Read the article

  • How do I resolve the error "The file exists" when restoring a cube backup?

    - by Ant
    I'm trying to restore a cube backup (a .abf file) using SQL Server Management Studio, but I'm getting the error message:

        The following system error occurred: The file exists. . (Microsoft SQL Server 2005 Analysis Services)

    (Yes, there really are two dots.) Does anyone know how to resolve this so I can restore the backup? Here are the steps I'm using:

    1. Open Microsoft SQL Server Management Studio
    2. Make the connection to the AS server
    3. Right-click on the Databases node in the server tree view
    4. Choose Restore...
    5. Type in a new database name in Restore database
    6. Select the backup file in From backup file
    7. Enter the correct password
    8. Optionally tick Allow database overwrite (it happens both ways)
    9. Press OK -- get the above error message
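    If the Restore dialog keeps failing, it can help to take the dialog out of the equation and run the restore as an XMLA script, which also makes the overwrite behaviour explicit. A hedged sketch (the path and database name are hypothetical), run from an XMLA query window connected to the Analysis Services instance:

        <Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <File>C:\Backups\MyCube.abf</File>
          <DatabaseName>MyCubeRestored</DatabaseName>
          <AllowOverwrite>true</AllowOverwrite>
        </Restore>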

    Read the article

  • Volume Shadow Copy based backup that works with TrueCrypt?

    - by jasonh
    I have enabled whole-drive encryption on my external drive to comply with my company's requirements about data security. I want to be able to make a backup of files on the drive, even while they're in use. Is there any program out there that uses Volume Shadow Copy for backup and also works with a TrueCrypt-encrypted drive? I have tried Windows Backup and Macrium Reflect, and both act as if the drive isn't connected, even though TrueCrypt has mounted it. It would be really nice if there were something free, but whatever it is also has to be usable for commercial purposes, since it's my company laptop I'm trying to back up.

    Read the article

  • How can I create a simple Exchange 2010 backup solution?

    - by bduncanj
    I'm sure this question's been asked a dozen times in one form or another, but after much searching there doesn't appear to be an obvious, simple recovery solution for a single Exchange box.

    We're using Exchange 2010 on a single server; the server hosts the AD, and nothing else on the network uses the AD. The intent is to run this server as you would an externally hosted Exchange server: access only via HTTP (RPC mode or OWA), all other ports blocked. I have a daily backup running, using the Windows Server 2008 Volume Shadow Copy Service to back up the Exchange data to an external hard disk. My question is, how do I perform a bare-metal recovery of this server?

    1. Do I need to explicitly include the Active Directory information in this nightly backup, or will it be there by virtue of the fact that this system is the primary AD server and the Windows backup service knows this?
    2. I understand I can re-install Server 2008 onto my new hardware (in the case of hardware failure) and then run Exchange 2010 setup.exe with a /recover argument, referencing the backup volume.
    3. It is acceptable to have some downtime during this recovery process. But is there anything else I should be aware of?

    Thanks! Duncan
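    For reference, a hedged sketch of the Windows Server Backup route (the target drive letter is hypothetical): -allCritical captures the system state, which on a domain controller includes AD, and -vssFull lets Exchange truncate its transaction logs after a successful run:

        rem nightly bare-metal-capable backup to an attached disk
        wbadmin start backup -backupTarget:E: -allCritical -vssFull -quiet

        rem after reinstalling Windows on replacement hardware and restoring,
        rem Exchange 2010 can be rebuilt from the settings held in AD with:
        setup.com /m:RecoverServer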

    Read the article

  • Version Control for developers new to source control

    - by Daisetsu
    I've been writing code for a few years now, and our backup strategy has been to zip the entire code directory every few days and put it somewhere else on the hard drive, or occasionally upload it to an online file hosting service. Unfortunately, the file hosting service was cancelled without telling me, and we lost years of backups. It's come down to the point where I finally have to learn to use version control. The only problems are:

    1. My boss really doesn't like SVN; he tried it and it had a high learning curve (at least its client did).
    2. We need a reliable place to host it (we can pay a reasonable amount).

    Can someone suggest the best version control system and client for a newbie, one which won't be too annoying? Second, what is a good remote version control service?
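    As a hedged illustration of how small the learning curve can be, here is roughly what day one with Git plus a hosted service looks like (the remote URL is hypothetical; several hosts offer paid private repositories):

        # one-time setup inside the project directory
        git init
        git add .
        git commit -m "Initial import"

        # connect a hosted remote and push the history to it
        git remote add origin [email protected]:team/project.git
        git push -u origin master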

    Read the article

  • Unable to get into Windows 7 or Ubuntu after a system file backup on a dual-boot system

    - by John Jiang
    I installed Ubuntu on my Dell XPS 9000, which originally had Windows 7 installed. After I did a system file backup as prompted by Windows 7 and restarted my computer, I received a message on a black screen right after restarting, roughly like this: ubuntu grub error, " " symbol cannot be found. I was wondering how I can get back into Windows 7 and uninstall Ubuntu altogether. I actually had the same booting problem after a system backup before, and the solution then was to install two extra Ubuntus, but I'd prefer not to do that. I'd highly appreciate any help!
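    One common way out, sketched here under the assumption that you have a Windows 7 install or repair disc: boot it, open the recovery Command Prompt, and rewrite the boot sector so the machine boots straight into Windows, after which the Ubuntu partitions can be removed in Disk Management:

        bootrec /fixmbr
        bootrec /fixboot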

    Read the article

  • Backing up locally modified and new source files

    - by eran
    I'm wondering how other programmers back up changes that are not under source control yet, be they new files or modified ones. I'm mostly referring to medium-sized jobs: hardly worth the effort of making a private branch, but taking more than a day to complete. This is not a vendor-specific question; I'd like to see if various products have different solutions to the problem. I'd appreciate answers referring to SVN and distributed SCMs, though. I'm mostly wondering about the latter (Mercurial, Git, etc.): it's great that you have your own local repo, but do you back it up on a regular basis along with your source files? Note: I'm not asking about a general backup strategy; for that, we have IT. I'm seeking the best way to keep locally modified stuff backed up before it is checked back into the main repo.
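    For the Git case specifically, a hedged sketch: git bundle packs the whole local repository (committed work on all branches) into a single file that any ordinary backup job can pick up; the paths and date here are hypothetical:

        # snapshot all branches and history into one file
        git bundle create /backups/project-2011-01-06.bundle --all

        # later, verify it or recreate the repo from it
        git bundle verify /backups/project-2011-01-06.bundle
        git clone /backups/project-2011-01-06.bundle project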

    Read the article

  • What's the best way to deal with backups for my PHP/MySQL application?

    - by spirytus
    I'm creating a PHP application for my client and am now thinking about the best way to do backups, automatically if possible. I don't have much experience in this area, and in case something goes wrong, or if I need to migrate, I would like a fast way of getting it all back online. I understand "something goes wrong" is a very wide term, but let's say that someone hacks my site and wipes out the database and all the files. My app is written in PHP/MySQL and I have access to cPanel (hosted with justhost.com if that makes any difference :). I used Joomla before, and it has JoomlaPack, which does a complete backup almost automatically; in case the site fails, it's easy to revert, or migrate if necessary. Is there anything like that for my configuration that would make reverting/migration easy?
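    Without a CMS plugin, the usual building block is a scheduled database dump plus a file archive; cPanel's cron jobs can run something like this nightly. A hedged sketch with hypothetical credentials and paths (the resulting archives should then be copied off-server):

        # dump the database
        mysqldump -u dbuser -p'secret' mydb | gzip > /home/jack/backups/db-backup.sql.gz

        # archive the application files
        tar czf /home/jack/backups/files-backup.tar.gz /home/jack/public_html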

    Read the article

  • Announcing the release of the Windows Azure SDK 2.1 for .NET

    - by ScottGu
    Today we released the v2.1 update of the Windows Azure SDK for .NET. This is a major refresh of the Windows Azure SDK and it includes some great new features and enhancements. These new capabilities include:

    - Visual Studio 2013 Preview Support: The Windows Azure SDK now supports using the new VS 2013 Preview
    - Visual Studio 2013 VM Image: Windows Azure now has a built-in VM image that you can use to host and develop with VS 2013 in the cloud
    - Visual Studio Server Explorer Enhancements: Redesigned with improved filtering and auto-loading of subscription resources
    - Virtual Machines: Start and stop VMs with suspended billing directly from within Visual Studio
    - Cloud Services: New Emulator Express option with reduced footprint and Run as Normal User support
    - Service Bus: New high availability options, Notification Hub support, improved VS tooling
    - PowerShell Automation: Lots of new PowerShell commands for automating Web Sites, Cloud Services, VMs and more

    All of these SDK enhancements are now available to start using immediately, and you can download the SDK from the Windows Azure .NET Developer Center. Visual Studio's Team Foundation Service (http://tfs.visualstudio.com/) has also been updated to support today's SDK 2.1 release, and the SDK 2.1 features can now be used with it (including with automated builds + tests). Below are more details on the new features and capabilities released today.

    Visual Studio 2013 Preview Support

    Today's Windows Azure SDK 2.1 release adds support for the recent Visual Studio 2013 Preview. The 2.1 SDK also works with Visual Studio 2010 and Visual Studio 2012, and works side by side with the previous Windows Azure SDK 1.8 and 2.0 releases. To install the Windows Azure SDK 2.1 on your local computer, choose the "install the sdk" link from the Windows Azure .NET Developer Center, then choose which version of Visual Studio you want to use it with; clicking the third link will install the SDK with the latest VS 2013 Preview. If you don't already have the Visual Studio 2013 Preview installed on your machine, this will also install Visual Studio Express 2013 Preview for Web.

    Visual Studio 2013 VM Image Hosted in the Cloud

    One of the requests we've heard from several customers has been for the ability to host Visual Studio within the cloud (avoiding the need to install anything locally on your computer). With today's SDK update we've added a new VM image to the Windows Azure VM Gallery that has Visual Studio Ultimate 2013 Preview, SharePoint 2013, SQL Server 2012 Express and the Windows Azure 2.1 SDK already installed on it. This provides a really easy way to create a development environment in the cloud with the latest tools. With the recent shutdown and suspend-billing feature we shipped on Windows Azure last month, you can spin up the image only when you want to do active development, and then shut down the virtual machine and not have to worry about usage charges while the virtual machine is not in use. You can create your own VS image in the cloud by using the New->Compute->Virtual Machine->From Gallery menu within the Windows Azure Management Portal, and then selecting the "Visual Studio Ultimate 2013 Preview" template.

    Visual Studio Server Explorer: Improved Filtering/Management of Subscription Resources

    With the Windows Azure SDK 2.1 release you'll notice significant improvements in the Visual Studio Server Explorer. The explorer has been redesigned so that all Windows Azure services are now contained under a single Windows Azure node. From the top-level node you can now manage your Windows Azure credentials, import a subscription file, or filter Server Explorer to only show services from particular subscriptions or regions. Note: the Web Sites and Mobile Services nodes will appear outside the Windows Azure node until the final release of VS 2013. If you have installed the ASP.NET and Web Tools Preview Refresh, though, the Web Sites node will appear inside the Windows Azure node even with the VS 2013 Preview. Once your subscription information is added, Windows Azure services from all your subscriptions are automatically enumerated in the Server Explorer; you no longer need to manually add services to Server Explorer individually. This provides a convenient way of viewing all of your cloud services, storage accounts, service bus namespaces, virtual machines, and web sites from one location.

    Subscription and Region Filtering Support

    Using the Windows Azure node in Server Explorer, you can also now filter your Windows Azure services in the Server Explorer by the subscription or region they are in. If you have multiple subscriptions but need to focus your attention on just a few for some period of time, this is a handy way to hide the services from other subscriptions until they become relevant. You can do the same sort of filtering by region. To enable this, just select "Filter Services" from the context menu on the Windows Azure node, then choose the subscriptions and/or regions you want to filter by: for example, only services from a pay-as-you-go subscription within the East US region. Visual Studio will then automatically filter the items that show up in the Server Explorer appropriately. With storage accounts and service bus namespaces, you sometimes need to work with services outside your subscription. To accommodate that scenario, those services allow you to attach an external account (from the context menu). You'll notice that external accounts have a slightly different icon in Server Explorer to indicate they are from outside your subscription.

    Other Improvements

    We've also improved the Server Explorer by adding additional properties and actions to the services exposed. You now have access to most of the properties on a cloud service, deployment slot, role or role instance, as well as the properties on storage accounts, virtual machines and web sites; just select the object of interest in Server Explorer and view the properties in the property pane. We also now have full support for creating/deleting/updating storage tables, blobs and queues directly within Server Explorer: simply right-click on the appropriate storage account node and you can create them directly within Visual Studio.

    Virtual Machines: Start/Stop within Visual Studio

    Virtual Machines now have context menu actions that allow you to start, shut down, restart and delete a Virtual Machine directly within the Visual Studio Server Explorer. The shutdown action enables you to shut down the virtual machine and suspend billing when the VM is not in use, and easily restart it when you need it. This is especially useful in dev/test scenarios where you can start a VM - such as a SQL Server - during your development session and then shut it down / suspend billing when you are not developing (and no longer be billed for it). You can also now remote desktop directly into VMs using the "Connect using Remote Desktop" context menu command in VS Server Explorer.

    Cloud Services: Emulator Express with Run as Normal User Support

    You can now launch Visual Studio and run your cloud services locally as a normal user (without having to elevate to an administrator account) using a new Emulator Express option included as a preview feature with this SDK release. Emulator Express is a version of the Windows Azure Compute Emulator that runs in a restricted mode (one instance per role), doesn't require administrative permissions, and uses 40% fewer resources than the full Windows Azure Emulator. Emulator Express supports both web and worker roles. To run your application locally using the Emulator Express option, simply change the following settings in the Windows Azure project:

    1. On the shortcut menu for the Windows Azure project, choose Properties, and then choose the Web tab.
    2. Check the setting for IIS (Internet Information Services). Make sure that the option is set to IIS Express, not the full version of IIS; Emulator Express is not compatible with full IIS.
    3. On the Web tab, choose the option for Emulator Express.

    Service Bus: Notification Hubs

    With the Windows Azure SDK 2.1 release we are adding support for Windows Azure Notification Hubs as part of our official Windows Azure SDK, inside of Microsoft.ServiceBus.dll (previously the Notification Hub functionality was in a preview assembly). You are now able to create, update and delete Notification Hubs programmatically, manage your device registrations, and send push notifications to all your mobile clients across all platforms (Windows Store, Windows Phone 8, iOS, and Android). Learn more about Notification Hubs on MSDN, or watch the Notification Hubs //BUILD/ presentation.

    Service Bus: Paired Namespaces

    One of the new features included with today's Windows Azure SDK 2.1 release is support for Service Bus "Paired Namespaces". Paired Namespaces enable you to better handle situations where a Service Bus service namespace becomes unavailable (for example, due to connectivity issues or an outage) and you are unable to send or receive messages to the namespace hosting the queue, topic, or subscription. Previously, to handle this scenario you had to manually set up separate namespaces that could act as a backup, then implement manual failover and retry logic, which was sometimes tricky to get right. Service Bus now supports Paired Namespaces, which enables you to connect two namespaces together. When you activate the secondary namespace, messages are stored in the secondary queue for delivery to the primary queue at a later time. If the primary container (namespace) becomes unavailable for some reason, automatic failover enables the messages in the secondary queue. For detailed information about paired namespaces and high availability, see the new topic Asynchronous Messaging Patterns and High Availability.

    Service Bus: Tooling Improvements

    In this release, the Windows Azure Tools for Visual Studio contain several enhancements and changes to the management of Service Bus messaging entities using Visual Studio's Server Explorer. The most noticeable change is that the Service Bus node is now integrated into the Windows Azure node, and supports integrated subscription management. Additionally, there has been a change to the code generated by the Windows Azure Worker Role with Service Bus Queue project template: this code now uses an event-driven "message pump" programming model based on the QueueClient.OnMessage method.

    PowerShell: Tons of New Automation Commands

    Since my last blog post on the previous Windows Azure SDK 2.0 release, we've updated Windows Azure PowerShell (which is a separate download) five times. You can find the full change log here. We've added new cmdlets in the following areas:

    - China instance and Windows Azure Pack support
    - Environment configuration
    - VMs
    - Cloud Services
    - Web Sites
    - Storage
    - SQL Azure
    - Service Bus

    China Instance and Windows Azure Pack

    We now support the following cmdlets for the China instance and Windows Azure Pack, respectively:

    - China instance: Web Sites, Service Bus, Storage, Cloud Service, VMs, Network
    - Windows Azure Pack: Web Sites, Service Bus

    We will have full cmdlet support for these two Windows Azure environments in PowerShell in the near future.

    Virtual Machines: Stop/Start Virtual Machines

    Similar to the Start/Stop VM capability in VS Server Explorer, you can now stop your VM and suspend billing. If you want to keep the original behavior of keeping your stopped VM provisioned, you can pass in the -StayProvisioned switch parameter.

    Virtual Machines: VM Endpoint ACLs

    We've added and updated a bunch of cmdlets for you to configure fine-grained network ACLs on your VM endpoints. You can use the following cmdlets to create ACL configs and apply them to a VM endpoint: New-AzureAclConfig, Get-AzureAclConfig, Set-AzureAclConfig, Remove-AzureAclConfig, Add-AzureEndpoint -ACL, and Set-AzureEndpoint -ACL. (A sketch of adding an ACL rule to an existing endpoint follows at the end of this post.) Other improvements for Virtual Machine management include:

    - Added a -NoWinRMEndpoint parameter to New-AzureQuickVM and Add-AzureProvisioningConfig to disable Windows Remote Management
    - Added a -DirectServerReturn parameter to Add-AzureEndpoint and Set-AzureEndpoint to enable/disable direct server return
    - Added a Set-AzureLoadBalancedEndpoint cmdlet to modify load-balanced endpoints

    Cloud Services: Remote Desktop and Diagnostics

    Remote Desktop and Diagnostics are popular debugging options for Cloud Services. We've introduced cmdlets to help you configure these two Cloud Service extensions from Windows Azure PowerShell.

    Windows Azure Cloud Services Remote Desktop extension: New-AzureServiceRemoteDesktopExtensionConfig, Get-AzureServiceRemoteDesktopExtension, Set-AzureServiceRemoteDesktopExtension, Remove-AzureServiceRemoteDesktopExtension.

    Windows Azure Cloud Services Diagnostics extension: New-AzureServiceDiagnosticsExtensionConfig, Get-AzureServiceDiagnosticsExtension, Set-AzureServiceDiagnosticsExtension, Remove-AzureServiceDiagnosticsExtension.

    Web Sites: Diagnostics

    With our last SDK update, we introduced the Get-AzureWebsiteLog -Tail cmdlet to stream the logs of your Web Sites. Recently, we've also added cmdlets to configure Web Site application diagnostics: Enable-AzureWebsiteApplicationDiagnostic and Disable-AzureWebsiteApplicationDiagnostic, which can send application diagnostics to the file system or to a Windows Azure Storage Table.

    SQL Database

    Previously, you had to know the SQL Database server admin username and password if you wanted to manage databases on that SQL Database server. Recently, we've made the experience much easier by not requiring the admin credential if the database server is in your subscription: you can simply specify the -ServerName parameter to tell Windows Azure PowerShell which server you want to use for the following cmdlets: Get-AzureSqlDatabase, New-AzureSqlDatabase, Remove-AzureSqlDatabase, and Set-AzureSqlDatabase. We've also added an -AllowAllAzureServices parameter to New-AzureSqlDatabaseServerFirewallRule so that you can easily add a firewall rule to whitelist all Windows Azure IP addresses. Besides the above experience improvements, we've also added cmdlets to get the database server quota and set the database service objective; check out Get-AzureSqlDatabaseServerQuota, Get-AzureSqlDatabaseServiceObjective, and Set-AzureSqlDatabase -ServiceObjective for details.

    Storage and Service Bus

    Other new cmdlets include:

    - Storage: CRUD cmdlets for Azure Tables and Queues
    - Service Bus: cmdlets for managing authorization rules on your Service Bus Namespace, Queue, Topic, Relay and NotificationHub

    Summary

    Today's release includes a bunch of great features that enable you to build even better cloud solutions. All the above features/enhancements are shipped and available to use immediately as part of the 2.1 release of the Windows Azure SDK for .NET. If you don't already have a Windows Azure account, you can sign up for a free trial and start using all of the above features today. Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

    Hope this helps,

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
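    The screenshots from the original post don't carry over here, so as a stand-in, a hedged PowerShell sketch of the VM endpoint ACL scenario described above (the service, VM, endpoint and subnet values are hypothetical):

        # build an ACL that only permits a given subnet
        $acl = New-AzureAclConfig
        Set-AzureAclConfig -AddRule -ACL $acl -Order 1 -Action Permit `
            -RemoteSubnet "10.0.0.0/8" -Description "corp network only"

        # apply it to an existing endpoint on a VM
        Get-AzureVM -ServiceName "mysvc" -Name "myvm" |
            Set-AzureEndpoint -Name "web" -Protocol tcp -LocalPort 80 `
                -PublicPort 80 -ACL $acl |
            Update-AzureVM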

    Read the article

  • Cloud 9 and Grunt.js

    - by Michael Ryan Soileau
    I'm running Grunt.js on Cloud9. Most everything is working correctly, except when I try to set this option:

        watch: { options: { livereload: true } }

    If I add that, the terminal prints:

        Fatal error: listen EACCES

    I'm guessing I need to use the sudo command to run that, and since c9 doesn't let you run sudo, the command fails. But why is livereload a feature that requires permission? And is there any way around it?
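    For what it's worth, EACCES on listen usually means the port can't be bound at all, not that root is required: livereload defaults to port 35729, which a hosted workspace may not allow. A hedged sketch of pointing it at a different port instead (the port number here is hypothetical; grunt-contrib-watch accepts true, a port number, or an options object):

        // Gruntfile.js excerpt
        watch: {
          options: {
            livereload: 8082  // bind livereload to a port the host permits
          }
        }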

    Read the article

  • PHP ODBC MDB Access on a Cloud Server

    - by Senica Gonzalez
    Hey! Hopefully a quick question... I have a .MDB file stored on my web server and I'm trying to connect to it. I have no way of "registering" it with a name in ODBC. Is the only way to connect to it by specifying the absolute path of the .mdb file?

        $mdbFilename = "./db/Scora.mdb";
        $connection = odbc_connect("Driver={Microsoft Access Driver (*.mdb)};Dbq=$mdbFilename", "", "");
        if (!$connection) {
            echo "Couldn't make a connection!";
        }
        $sql = "SELECT ID FROM ScoraRegistrations";
        $sql_result = odbc_prepare($connection, $sql);
        odbc_execute($sql_result);
        odbc_result_all($sql_result, "border=1");
        odbc_free_result($sql_result);
        odbc_close($connection);

    It never connects. Any thoughts?
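    A hedged sketch of one likely fix, assuming the server is Windows with the Access ODBC driver installed (the driver generally wants a full filesystem path, not a relative one; on a Linux host this driver doesn't exist at all, and something like mdbtools would be needed instead):

        // resolve the relative path to an absolute one before connecting
        $mdbFilename = realpath("./db/Scora.mdb");
        $connection = odbc_connect(
            "Driver={Microsoft Access Driver (*.mdb)};Dbq=$mdbFilename", "", "");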

    Read the article

  • Remote Backup User Data on iPhone

    - by Eric
    I wrote a few iPhone apps using Core Data for persistent storage. Everything is working great, but I would like to add the ability for users to back up their data to a PC (via Wi-Fi to a PC app) or to a web server. This is new to me and I can't seem to figure out where to begin researching the problem. I don't want to overcomplicate the issue if there is an easy way to implement this. Is anyone familiar enough with what I am looking to do to point me in the right direction or give me a high-level overview of what I should be considering? The data is all text and would be perfectly stored in .csv files, if that matters.

    Read the article

  • VSS InitializeForBackup fails with return code E_UNEXPECTED

    - by suresh
    #include "vss.h" #include "vswriter.h" #include <VsBackup.h> #include <stdio.h> #define CHECK_PRINT(result) printf("%s\n",result==S_OK?"S_OK":"error") int main(int argc, char* argv[]) { BSTR xml; LPTSTR errorText; IVssBackupComponents *VssHandle; HRESULT result = CreateVssBackupComponents(&VssHandle); CHECK_PRINT(result); result = VssHandle->InitializeForBackup(); printf("unexpected%x\n",result); system("pause"); return 0; } in the above program intializeforbackup fails with error code E_UNEXPECTED. The VSS service is running . In the event log it shows as "Volume Shadow Copy Service error: Unexpected error calling routine CoCreateInstance. hr = 0x800401f0.".. Any solutions for the InitializeForBackup to return S_OK?

    Read the article

  • How to back up a database to disk using JPA?

    - by Nitesh Panchal
    Hello. Which query should I write in JPQL to back up a database to disk? If it's not available in JPQL, even a native SQL query will do.

    Also, I would like to bring one issue to the attention of the stackoverflow developers: this site doesn't work properly in Opera (Opera 9.63). Whenever I write a question and click "Post Your Question", the button click event doesn't fire at all; maybe the server-side event doesn't fire or something. However, no such problem occurs in IE and Firefox.
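    JPQL itself has no backup concept, so this has to be delegated to the database engine through a native query. A hedged sketch, assuming an embedded Apache Derby database (other engines have their own mechanisms, e.g. mysqldump for MySQL, which runs outside the JVM entirely):

        // trigger Derby's built-in online backup to a directory on disk
        em.createNativeQuery(
            "CALL SYSCS_UTIL.SYSCS_BACKUP_DATABASE('/backups/mydb')")
          .executeUpdate();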

    Read the article

  • Recover a backup copy of an Ubuntu Linux installation on a USB stick using dd

    - by Werner
    Hi, I installed Ubuntu 10.04 on a USB stick in persistent-install mode, so I could boot my laptop or my desktop computer from the stick. At one point I needed the 8 GB stick for another purpose, so I copied it to my desktop from Mac OS X by doing:

        dd if=/dev/disks3s of=/Users/jack/Desktop/usb_copy

    Now I am trying to do the opposite, after having used the stick (which was reformatted to NTFS in the meantime), just doing:

        dd if=/Users/jack/Desktop/usb_copy of=/dev/disks3s

    Although I can see that almost all of the files are there, I can no longer boot from it. It is also strange that the file permissions look odd, something like _user. What can I do? Thanks
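    For what it's worth, a device name with a trailing "s" segment (like disk3s3) refers to a single slice (partition), so an image made from it carries no partition table or boot sector. A hedged sketch of imaging the whole stick instead (disk3 is hypothetical; check diskutil list first):

        # unmount (not eject) the stick, then image the whole device
        diskutil unmountDisk /dev/disk3
        sudo dd if=/dev/rdisk3 of=/Users/jack/Desktop/usb_copy bs=1m

        # restoring later is the same command with if/of swapped
        diskutil unmountDisk /dev/disk3
        sudo dd if=/Users/jack/Desktop/usb_copy of=/dev/rdisk3 bs=1m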

    Read the article

  • Glob function (C) and backup files (file~)

    - by arnoras
    I'm using the glob function for an autocompletion feature. It's difficult to explain, so let me show you the problem:

        matched = ~/.tcsh
        glob(matched, 0, NULL, &pglob);

    glob puts all matched files in a char ** and when I print it I have:

        case[0] = .tcshrc
        case[1] =

    I should have .tcshrc~ in case[1], but there's nothing there. I've seen a flag, GLOB_TILDE, used like this:

        glob(matched, GLOB_TILDE, NULL, &pglob);

    But it doesn't change anything! Can someone help me?
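    A minimal, self-contained sketch of how the call is normally written (the pattern here is hypothetical; GLOB_TILDE is what expands the leading ~, and a trailing * is needed for .tcshrc~ to match at all):

        #define _GNU_SOURCE  /* GLOB_TILDE is a GNU/BSD extension */
        #include <glob.h>
        #include <stdio.h>

        int main(void) {
            glob_t pglob;
            /* expand ~ and match both .tcshrc and .tcshrc~ */
            if (glob("~/.tcshrc*", GLOB_TILDE, NULL, &pglob) == 0) {
                for (size_t i = 0; i < pglob.gl_pathc; i++)
                    printf("case[%zu] = %s\n", i, pglob.gl_pathv[i]);
                globfree(&pglob);
            }
            return 0;
        }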

    Read the article

  • Android GPS cloud of confusion!

    - by Anthony Forloney
    I am trying to design my first Android application with the use of GPS. As of right now, I have a drawable button that, when clicked, shows a Toast message with the longitude and latitude. I have tried to use telnet localhost 5554 and then geo fix #number #number to feed in values, but no results display, just 0 0. I have also tried the DDMS way of sending GPS coordinates and I get the same thing. My question is: what exactly is the code equivalent to geo fix and the DDMS way of sending coordinates? I have used Location, LocationManager and LocationListener, but I am not sure which is the right choice. Could anyone explain the code equivalent to me, just so I can get a better understanding of how to fix my application not working? The code is given below, in case the error exists in the code:

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);
            Button button = (Button) findViewById(R.id.track);
            button.setOnClickListener(this);
            LocationManager location = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
            Location loc = location.getLastKnownLocation(location.GPS_PROVIDER);
            updateWithNewLocation(loc);
        }

        private final LocationListener locationListener = new LocationListener() {
            public void onLocationChanged(Location location) {
                updateWithNewLocation(location);
            }
        };

        private void updateWithNewLocation(Location l) {
            longitude = l.getLongitude();
            latitude = l.getLatitude();
            provider = l.getProvider();
        }

        public void onClick(View v) {
            Toast.makeText(this, "Your location is " + longitude + " and " + latitude
                    + " provided by: " + provider, Toast.LENGTH_SHORT).show();
        }
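    One thing the snippet never does is register the listener, so onLocationChanged can never fire; geo fix and DDMS both inject a new fix, which the app only sees through a registered listener (or on a later getLastKnownLocation call). A hedged sketch of the missing registration (it also requires the ACCESS_FINE_LOCATION permission in the manifest):

        // in onCreate, after obtaining the LocationManager
        LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER,
                0L,    // minimum time between updates, in ms
                0.0f,  // minimum distance between updates, in meters
                locationListener);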

    Read the article

  • Mail in the cloud? (web service for email)

    - by bambax
    I have multiple websites that need to send mail (after successful signup, etc.); I maintain a Postfix instance just for this (it never receives any incoming mail). This Postfix instance is a pain: it is sometimes broken, is regularly attacked, produces enormous amounts of logs, etc. These problems could certainly be dealt with by an experienced Postfix admin, but that's not me. Is there a service that would be just like Twilio, but for email, i.e. a web service for sending mail?

    Read the article

  • Can't seem to sign an AWS CloudFront URL properly

    - by Joe Corkery
    Hi everybody, I found a lot of detailed examples online on how to sign an Amazon CloudFront URL for private content. Unfortunately, whenever I implement these examples my URL doesn't seem to work. The resource path is correct, because I can download the file when it is set for world read, but the URL doesn't work when it is set just for authorized users. The PHP code I am using is below. If anybody has any insights as to what I might be doing wrong (I'm guessing it is something obvious that I am just not seeing right now), it would be greatly appreciated.

        function urlCloudFront($resource) {
            $AWS_CF_KEY = 'APKA...';
            $priv_key = file_get_contents(path_to_pem_file);
            $pkeyid = openssl_get_privatekey($priv_key);
            $expires = strtotime("+ 3 hours");
            $policy_str = '{"Statement":[{"Resource":"'.$resource.'","Condition":{"DateLessThan":{"AWS:EpochTime":'.$expires.'}}}]}';
            $policy_str = trim(preg_replace('/\s+/', '', $policy_str));
            $res = openssl_sign($policy_str, $signature, $pkeyid, OPENSSL_ALGO_SHA1);
            $signature_base64 = base64_encode($signature);
            $repl = array('+' => '-', '=' => '_', '/' => '~');
            $signature_base64 = strtr($signature_base64, $repl);
            $url = $resource . '?Expires=' . $expires . '&Signature=' . $signature_base64 . '&Key-Pair-Id=' . $AWS_CF_KEY;
            print '<p><A href="' . $url . '">Download VIDA (CloudFront)</A>';
        }

        urlCloudFront("http://mydistcloud.cloudfront.net/mydir/myfile.tar.gz");

    Thanks.

    Read the article
