Search Results

Search found 20140 results on 806 pages for 'remote management'.

  • Pushing a local Mercurial repository to a remote server, or cloning at the server from local

    - by Samaursa
    I have a local repository that I have now decided to push to a remote server (for example, a host that allows Mercurial repositories, and I am also trying to push to Bitbucket). The repository has a lot of files and is a little more than 200 MB. Locally, I am able to clone the repository without problems. Now I have a lot of changes in this repository, and I have wasted a couple of days trying to figure out how to get the remote server to clone my repository. I cannot get hg serve to work outside of the LAN; I have tried everything. So instead, I created a new, empty repository at the remote servers (both at the host and at Bitbucket) and am now pushing the complete local repository to these remote locations. So far it has been unsuccessful: the push operation gets stuck on "searching for changes" and does not give me any other useful output. I have let it go for about an hour with no change. Now my question is, what am I doing wrong as far as hg serve is concerned? I can access it locally but not remotely (through DynDNS, which I have configured properly, and the router forwards the ports correctly), so that I can get the server to clone the repository the first time, after which I will be pushing to it. My second question is, assuming the clone at the server does not work (for example, if I were to push my current repository to Bitbucket), is creating an empty repository at the server and then pushing a local repository to it OK? Is that the source of the "searching for changes" problem? Any help in this regard would be greatly appreciated.
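
    For what it's worth, seeding an empty remote repository with a push is a normal workflow. A minimal sketch of both halves (the repository URL is a placeholder; the hg options shown are standard):

        # seed the empty remote; the first push sends the full history, which
        # can sit on "searching for changes" for a while on a 200 MB repo
        cd /path/to/local/repo
        hg verify
        hg push https://bitbucket.org/youruser/yourrepo

        # for hg serve to be reachable from outside the LAN, bind all interfaces
        # and watch the verbose log to confirm requests actually arrive
        hg serve --address 0.0.0.0 --port 8000 -v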

  • Secure Copy File from remote server via scp and os module in Python

    - by user1063572
    I'm pretty new to Python and programming. I'm trying to copy a file between two computers via a Python script. However, the code

        os.system("ssh " + hostname + " scp " + filepath + " " + user + "@" + localhost + ":" + cwd)

    won't work. I think it needs a password, as described in "How do I copy a file to a remote server in python using scp or ssh?". I don't get any error logs; the file just doesn't show up in my current working directory. However, every other command with os.system("ssh " + hostname + command) or os.popen("ssh " + hostname + command) does work, where command is e.g. ls. When I try ssh hostname scp file user@local:directory on the command line, it works without entering a password. I tried to combine os.popen commands with the getpass and pxssh modules to establish an SSH connection to the remote server and use it to send commands directly (I only tested it with a simple command):

        import pxssh
        import getpass
        ssh = pxssh.pxssh()
        ssh.force_password = True
        hostname = raw_input("Hostname: ")
        user = raw_input("Username: ")
        password = getpass.getpass("Password: ")
        ssh.login(hostname, user, password)
        test = os.popen("hostname")
        print test

    But I'm not able to put commands through to the remote server (print test shows that hostname is the local machine, not the remote server), although I'm sure the connection is established. I thought it would be easier to establish a connection than to always prefix "ssh " + hostname onto the bash commands. I also tried some of the workarounds in "How do I copy a file to a remote server in python using scp or ssh?", but I must admit, due to lack of experience, I didn't get them to work. Thanks a lot for helping me.
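
    A minimal sketch of the same copy using subprocess instead of os.system, so the exit status and any stderr become visible (the variables match the question's placeholders; it assumes the passwordless SSH setup that already works on the command line):

        import subprocess

        # run scp on the remote host, copying the file back to this machine
        status = subprocess.call(
            ["ssh", hostname, "scp", filepath, user + "@" + localhost + ":" + cwd])
        if status != 0:
            print "remote scp failed with exit status", status  # Python 2 print

    As for pxssh, commands have to be sent over the established connection itself, e.g. ssh.sendline("hostname") followed by ssh.prompt() and reading ssh.before; os.popen always runs on the local machine.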

  • Retrieve remote XML data with PHP

    - by rrrfusco
    I'm currently reading a regularly updated XML file with PHP (SimpleXML). I'd like to reduce calls to the remote server by reading a cached copy on my web server and only retrieving a fresh copy of the remote file after some time has passed. Is this an accepted practice for reading and then parsing remote XML files? Can anyone offer suggestions on how to go about this in PHP, or perhaps there are some PEAR classes that deal with this?
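
    Caching a remote feed locally is indeed common practice. A minimal sketch of a time-based cache (the file name, URL, and 15-minute TTL are arbitrary placeholders):

        <?php
        $cacheFile = 'feed-cache.xml';
        $remoteUrl = 'http://example.com/feed.xml';  // placeholder
        $ttl       = 900;                            // refresh after 15 minutes

        // re-fetch only when the cache is missing or stale
        if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
            $fresh = file_get_contents($remoteUrl);
            if ($fresh !== false) {  // keep serving the stale copy if the fetch fails
                file_put_contents($cacheFile, $fresh);
            }
        }
        $xml = simplexml_load_file($cacheFile);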

  • Remote service

    - by Moshik
    Hi, I have prepared a remote service, and I would like to know if it is possible, somehow via this service, to call other activities. Usually what I've seen so far is how external apps call the remote service and use its methods, but I want the reverse: the remote service calling an external app's activity. Thanks, Moshik.
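
    A minimal sketch of one way a service can launch an activity in another app (the package and class names are placeholders); starting an activity from a Service context requires the NEW_TASK flag, and the target activity must be exported or have a matching intent filter:

        // inside the remote service
        Intent intent = new Intent();
        intent.setComponent(new ComponentName(
                "com.example.externalapp",                 // target app's package (placeholder)
                "com.example.externalapp.MainActivity"));  // target activity (placeholder)
        intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);    // required outside an Activity context
        startActivity(intent);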

  • What things must I know about OpenAL memory management?

    - by mystify
    I am playing sound with OpenAL, and it seems to increase the memory footprint dramatically for every little sound I play. It seems that OpenAL never frees memory itself, and that playing a source causes the memory footprint to grow. I couldn't find any good resources about OpenAL memory management, but I bet I must do a lot of the cleanup myself. Maybe someone knows a resource for that?
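
    For what it's worth, OpenAL does leave buffer and source lifetime to the application: nothing is freed until you delete the handles yourself. A minimal cleanup sketch (source and buffer are assumed to be ALuint handles created by your own loading code):

        alSourceStop(source);
        alSourcei(source, AL_BUFFER, 0);  // detach the buffer from the source
        alDeleteSources(1, &source);
        alDeleteBuffers(1, &buffer);      // only valid once no source still references it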

  • How do I view executed queries within SQL Server Management Studio?

    - by Brandon
    I am new to SQL Server Management Studio and am wondering if there is a way to see what queries have been run against a database. Surely there is a way to see these. In the Activity Monitor there is a "Recent Expensive Queries" pane, but I'm guessing that isn't all of the queries, since I'm not seeing the ones I have run. I am running SQL Server 2008, version 10.0.1600.22.
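
    One common approach besides Activity Monitor is to query the plan-cache DMVs; a minimal sketch (it only shows statements whose plans are still cached, so it is not a complete history; SQL Server Profiler is the tool for capturing every statement as it runs):

        SELECT TOP (50)
               qs.last_execution_time,
               st.text AS query_text
        FROM   sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY qs.last_execution_time DESC;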

  • We are able to connect to remote desktop but randomly keyboard does not work in remote machine, whereas mouse works perfectly

    - by user340071
    Description: The project deals with interaction with a remote desktop, connecting through a web browser (Internet Explorer). What we did: We created an ActiveX control which installs through the web browser onto the client machine and lets the user connect to different servers through the web browser. What is in the ActiveX control: In the ActiveX control we used the MSTSC library, passing the relevant parameters to it; it connects to the Remote Desktop. What problem are we facing now? We are able to connect to the remote desktop, but randomly the keyboard does not work on the remote machine, whereas the mouse works perfectly.

  • How do I back up a remote SVN repository

    - by Roaders
    Hi all, I am currently moving my SVN server from my home server to my remote server so I can access it more easily from other locations. My remote server is not backed up, so I want to regularly back it up to my home server. The remote server is Windows Server 2003; the home server is Windows Home Server. What is the best way to do this? Can I get my home server to fetch a dump of the remote server every night? Bandwidth isn't a huge consideration, but if I could just copy any new check-ins to an SVN server on my home server, that would be fine. Any suggestions welcome. Thanks
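
    Two minimal sketches (paths and URLs are placeholders). A nightly scheduled task on the remote server can take a full dump, or svnsync on the home server can mirror only new revisions, which matches the "just copy new check-ins" idea (the mirror repository must already exist and allow revision property changes via a pre-revprop-change hook):

        rem full dump, run on the remote server
        svnadmin dump C:\svn\repo > C:\backups\repo.dump

        rem or incremental mirroring, run from the home server
        svnsync init file:///C:/svn/mirror http://remote.example.com/svn/repo
        svnsync sync file:///C:/svn/mirror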

  • DrawImage in XP Mode or Remote Desktop

    - by simplecoder
    I'm displaying a PNG with a transparent background that looks good in Windows 7, but then I run my app in XP Mode or remote desktop to a Windows XP machine and the PNG looks incorrect. I noticed that if I disable "Integration Mode" or run the app on XP without remote desktop, the image looks fine. How do I get DrawImage to render the PNG correctly in XP Mode or remote desktop?

    [Image: inside Windows 7]
    [Image: inside XP Mode or remote desktop]

    Here's my code:

        protected override void OnPaint(PaintEventArgs e)
        {
            Image image = Image.FromFile("hello.png", false);
            Bitmap bmp = new Bitmap(image);
            Rectangle destRect = new Rectangle(0, 0, image.Width, image.Height);
            e.Graphics.DrawImage(image, destRect, 0, 0, image.Width, image.Height, GraphicsUnit.Pixel);
            base.OnPaint(e);
        }
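
    One workaround sometimes suggested for alpha-PNG glitches over remote desktop (offered as an assumption to test, not a confirmed fix) is to pre-render the image into a premultiplied-alpha bitmap and draw that copy instead:

        // hedged sketch: render into a 32bppPArgb bitmap once, then paint the copy
        using (Image image = Image.FromFile("hello.png", false))
        using (Bitmap compatible = new Bitmap(image.Width, image.Height,
                System.Drawing.Imaging.PixelFormat.Format32bppPArgb))
        {
            using (Graphics g = Graphics.FromImage(compatible))
            {
                g.DrawImage(image, 0, 0, image.Width, image.Height);
            }
            e.Graphics.DrawImageUnscaled(compatible, 0, 0);
        }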

  • git push on a remote branch

    - by charlielee
    I have a remote project that has a branch. So I first clone the repo, then issue the following in the clone to work on the branch: git checkout -b <name> <remote_branch_name>. Then I made the changes needed on this branch and committed them: git commit -a -m "changes made". However, when I want to push back to the remote branch, it just says everything is up to date: git push → Everything up-to-date. I checked by cloning the remote repo again in a different directory: it hasn't pushed the changes over. So how do I push my changes back to the remote branch? Thanks
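
    A minimal sketch of the usual fix: a bare git push only pushes branches whose names match on both ends, so name the destination explicitly (use the bare branch name on the remote side, e.g. foo rather than origin/foo):

        # push local branch <name> to the remote branch it was created from
        git push origin <name>:<remote_branch_name>

        # or set the upstream once so a plain 'git push' works from then on
        git push --set-upstream origin <name>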

  • Unable to Connect to Management Studio Server

    - by Phil Hilliard
    I have a nasty situation. I am using Microsoft SQL Server Management Studio Express edition locally on my PC for testing, and once tested I upload database changes to a remote server. I deleted the default database on my local machine, and instead of searching hard enough for an answer to that problem, I uninstalled and reinstalled Management Studio. Since then, Management Studio has not been able to connect to the server. Is there any help (or hope for me, for that matter) out there? The following is the detailed error message:

        ===================================
        Cannot connect to LENOVO-E7A54767\SQLEXPRESS.
        ===================================
        A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (.Net SqlClient Data Provider)
        ------------------------------
        For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=-1&LinkId=20476
        ------------------------------
        Error Number: -1
        Severity: 20
        State: 0
        ------------------------------
        Program Location:
        at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
        at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
        at System.Data.SqlClient.TdsParser.Connect(ServerInfo serverInfo, SqlInternalConnectionTds connHandler, Boolean ignoreSniOpenTimeout, Int64 timerExpire, Boolean encrypt, Boolean trustServerCert, Boolean integratedSecurity, SqlConnection owningObject)
        at System.Data.SqlClient.SqlInternalConnectionTds.AttemptOneLogin(ServerInfo serverInfo, String newPassword, Boolean ignoreSniOpenTimeout, Int64 timerExpire, SqlConnection owningObject)
        at System.Data.SqlClient.SqlInternalConnectionTds.LoginNoFailover(String host, String newPassword, Boolean redirectedUserInstance, SqlConnection owningObject, SqlConnectionString connectionOptions, Int64 timerStart)
        at System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance)
        at System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance)
        at System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection)
        at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup)
        at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)
        at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
        at System.Data.SqlClient.SqlConnection.Open()
        at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.ObjectExplorer.ValidateConnection(UIConnectionInfo ci, IServerType server)
        at Microsoft.SqlServer.Management.UI.ConnectionDlg.Connector.ConnectionThreadUser()
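
    Error 26 means the named instance could not be located at all, which after a reinstall often just means the service is stopped. A quick diagnostic sketch from an elevated command prompt (service names are the SQL Express defaults):

        rem is the SQLEXPRESS instance service running?
        sc query MSSQL$SQLEXPRESS

        rem start it if it is stopped
        net start MSSQL$SQLEXPRESS

        rem named instances are located via the SQL Server Browser service
        net start SQLBrowser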

  • I can't use Archiva as a proxy for my remote repository

    - by user1553915
    I want to install Archiva to manage my Maven repositories. I added a new internal repository called "public" and a new remote repository, which is my company's repository. Then I configured the proxy connector to let the repository "public" act as a proxy of the remote one. But when I enter the address "http://localhost:8084/archiva/repository/public/.../....pom" in the web browser, I get the error "Unable to fetch artifact resource". However, if I replace the remote repository with another one, such as "http://repository.codehaus.org/", the proxy works. I don't know why this happens; is there something wrong with my remote repository address?
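
    One way to narrow this down is to check, from the Archiva host itself, that the company repository serves an artifact at the exact path Archiva will request; a diagnostic sketch (the URL and artifact path are hypothetical placeholders):

        # -I asks for headers only; anything other than 200 points at the remote repo
        curl -I http://repo.mycompany.example/maven2/com/example/app/1.0/app-1.0.pom

    If the company repository sits behind an HTTP proxy or requires credentials, the proxy connector in Archiva also needs a matching network proxy or repository credentials configured.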

  • How to send a file to a remote computer?

    - by Phsika
    I can get the file names via the code below. How can I send these files to a remote computer? The remote computer's IP is 192.168.2.105, and I can use port 51124.

        class Program
        {
            static void Main(string[] args)
            {
                string[] dosyalarinYollari = System.IO.Directory.GetFiles(
                    @"z:\20071008\1.2.392.200036.9116.2.6.1.48.1215563310.1191800303.305777\",
                    "*.dcm",
                    System.IO.SearchOption.AllDirectories);
                foreach (string s in dosyalarinYollari)
                {
                    Console.Write(s + "\n");
                    // I need to send this file s to the remote machine
                }
                Console.ReadKey();
            }
        }
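
    A minimal sketch of one way to send each file with raw sockets (it assumes a listener of your own on 192.168.2.105:51124 that knows how to receive the bytes; any framing protocol is up to you):

        using System.IO;
        using System.Net.Sockets;

        static void SendFile(string path)
        {
            // connect to the receiver from the question and stream the file across
            using (TcpClient client = new TcpClient("192.168.2.105", 51124))
            using (NetworkStream net = client.GetStream())
            using (FileStream file = File.OpenRead(path))
            {
                file.CopyTo(net);  // Stream.CopyTo requires .NET 4 or later
            }
        }

    Inside the foreach loop, calling SendFile(s) would then replace the comment.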

  • Set up a git external for a remote repo

    - by Tom
    I'd like to create a repo which pulls in a remote repo. Let's say, for example, I'd like to set up jQuery as a submodule: git://github.com/jquery/jquery.git. What would be the process of creating a repo with jQuery as a submodule and adding my own remote to push the result to? Also, once this is set up, if I push/pull to my own remote, will the external remain intact?
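
    A minimal sketch of the whole flow (the personal remote URL is a placeholder). A submodule is recorded as a path plus URL in .gitmodules, so pushing your repo keeps the link to jQuery intact without pushing jQuery's own objects:

        git init myproject && cd myproject
        git submodule add git://github.com/jquery/jquery.git jquery  # records URL + pinned commit
        git commit -m "Add jQuery as a submodule"
        git remote add origin git@example.com:me/myproject.git       # placeholder remote
        git push origin master

    Fresh clones of your repo then run git submodule update --init to fetch jQuery.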

  • send xml data from j2me app to remote server

    - by pi
    Hi, I have a J2ME application which generates XML data (from an object), and I need to send it across to a remote server/machine. How do I get this XML, or the object, through to the remote server? Thanks. Just in case it might help, the receiving application on the remote server is PHP-powered. Thanks again.
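
    A minimal sketch using MIDP's HttpConnection to POST the XML (the URL is a placeholder; xml is the string your app generated). On the PHP side the body can be read from php://input:

        import java.io.OutputStream;
        import javax.microedition.io.Connector;
        import javax.microedition.io.HttpConnection;

        HttpConnection conn = (HttpConnection) Connector.open("http://example.com/receive.php");
        conn.setRequestMethod(HttpConnection.POST);
        conn.setRequestProperty("Content-Type", "text/xml");
        OutputStream out = conn.openOutputStream();
        out.write(xml.getBytes());
        out.close();
        int status = conn.getResponseCode();  // reading the response sends the request
        conn.close();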

  • How can I modify a remote machine's input.properties file from a web page in C#.NET

    - by picnic4u
    I have a build script on a remote machine, but I want to start the build from my local machine. For this I need to update the input.properties file on the remote machine and then run the batch file that starts the build process. I have created a web page for this, so how can I modify the remote input.properties file and run the batch file in C#? Please give me some suggestions for this. Thanks in advance.
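
    A minimal sketch of the file-update half, assuming the remote folder is shared and the web application's identity has write access (all paths and the file content are placeholders). Running the batch file remotely needs some remoting mechanism; PsExec from Sysinternals is shown as one assumed option:

        using System.Diagnostics;
        using System.IO;

        // overwrite the properties file over a UNC share (placeholder path and content)
        File.WriteAllText(@"\\buildserver\builds\input.properties",
                          "version=1.2.3\ntarget=release\n");

        // kick off the remote batch file via PsExec (assumes psexec.exe is available)
        Process.Start("psexec.exe",
                      @"\\buildserver -accepteula cmd /c C:\builds\build.bat");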

  • Windows Azure: Import/Export Hard Drives, VM ACLs, Web Sockets, Remote Debugging, Continuous Delivery, New Relic, Billing Alerts and More

    - by ScottGu
    Two weeks ago we released a giant set of improvements to Windows Azure, as well as a significant update of the Windows Azure SDK. This morning we released another massive set of enhancements to Windows Azure. Today's new capabilities include:

    • Storage: Import/Export Hard Disk Drives to your Storage Accounts
    • HDInsight: General Availability of our Hadoop Service in the cloud
    • Virtual Machines: New VM Gallery, ACL support for VIPs
    • Web Sites: WebSocket and Remote Debugging Support
    • Notification Hubs: Segmented customer push notification support with tag expressions
    • TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services
    • Developer Analytics: New Relic support for Web Sites + Mobile Services
    • Service Bus: Support for partitioned queues and topics
    • Billing: New Billing Alert Service that sends email notifications when your bill hits a threshold you define

    All of these improvements are now available to use immediately (note that some features are still in preview). Below are more details about them.

    Storage: Import/Export Hard Disk Drives to Windows Azure

    I am excited to announce the preview of our new Windows Azure Import/Export Service! The Windows Azure Import/Export Service enables you to move large amounts of on-premises data into and out of your Windows Azure Storage accounts. It does this by enabling you to securely ship hard disk drives directly to our Windows Azure data centers. Once we receive the drives we'll automatically transfer the data to or from your Windows Azure Storage account. This enables you to import or export massive amounts of data more quickly and cost effectively (and not be constrained by available network bandwidth).

    Encrypted Transport

    Our Import/Export service provides built-in support for BitLocker disk encryption – which enables you to securely encrypt data on the hard drives before you send it, and not have to worry about it being compromised even if the disk is lost/stolen in transit (since the content on the transported hard drives is completely encrypted and you are the only one who has the key to it). The drive preparation tool we are shipping today makes setting up BitLocker encryption on these hard drives easy.

    How to Import/Export your first Hard Drive of Data

    You can read our Getting Started Guide to learn more about how to begin using the import/export service. You can create import and export jobs via the Windows Azure Management Portal as well as programmatically using our Server Management APIs. It is really easy to create a new import or export job using the Windows Azure Management Portal. Simply navigate to a Windows Azure storage account, and then click the new Import/Export tab now available within it (note: if you don't have this tab make sure to sign-up for the Import/Export preview). Then click the "Create Import Job" or "Create Export Job" commands at the bottom of it. This will launch a wizard that easily walks you through the steps required.

    For more comprehensive information about Import/Export, refer to the Windows Azure Storage team blog. You can also send questions and comments to the [email protected] email address. We think you'll find this new service makes it much easier to move data into and out of Windows Azure, and it will dramatically cut down the network bandwidth required when working on large data migration projects. We hope you like it.

    HDInsight: 100% Compatible Hadoop Service in the Cloud

    Last week we announced the general availability release of Windows Azure HDInsight. HDInsight is a 100% compatible Hadoop service that allows you to easily provision and manage Hadoop clusters for big data processing in Windows Azure. This release is now live in production, backed by an enterprise SLA, supported 24x7 by Microsoft Support, and is ready to use for production scenarios. HDInsight allows you to use Apache Hadoop tools, such as Pig and Hive, to process large amounts of data in Windows Azure Blob Storage. Because data is stored in Windows Azure Blob Storage, you can choose to dynamically create Hadoop clusters only when you need them, and then shut them down when they are no longer required (since you pay only for the time the Hadoop cluster instances are running, this provides a super cost effective way to use them). You can create Hadoop clusters using either the Windows Azure Management Portal or using our PowerShell and Cross Platform Command line tools.

    The import/export hard drive support that came out today is a perfect companion service to use with HDInsight – the combination allows you to easily ingest, process and optionally export a limitless amount of data. We've also integrated HDInsight with our Business Intelligence tools, so users can leverage familiar tools like Excel in order to analyze the output of jobs. You can find out more about how to get started with HDInsight here.

    Virtual Machines: VM Gallery Enhancements

    Today's update of Windows Azure brings with it a new Virtual Machine gallery that you can use to create new VMs in the cloud. You can launch the gallery by doing New->Compute->Virtual Machine->From Gallery within the Windows Azure Management Portal. The new Virtual Machine Gallery includes some nice enhancements that make it even easier to use:

    • Search: You can now easily search and filter images using the search box in the top-right of the dialog. For example, simply type "SQL" and we'll filter to show those images in the gallery that contain that substring.
    • Category Tree-view: Each month we add more built-in VM images to the gallery. You can continue to browse these using the "All" view within the VM Gallery – or now quickly filter them using the category tree-view on the left-hand side of the dialog. For example, by selecting "Oracle" in the tree-view you can now quickly filter to see the official Oracle supplied images.
    • MSDN and Supported checkboxes: With today's update we are also introducing filters that make it easy to filter out types of images that you may not be interested in. The first checkbox is MSDN: using this filter you can exclude any image that is not part of the Windows Azure benefits for MSDN subscribers (which have highly discounted pricing - you can learn more about the MSDN pricing here). The second checkbox is Supported: this filter will exclude any image that contains prerelease software, so you can feel confident that the software you choose to deploy is fully supported by Windows Azure and our partners.
    • Sort options: We sort gallery images by what we think customers are most interested in, but sometimes you might want to sort using different views. So we're providing some additional sort options, like "Newest," to customize the image list for what suits you best.
    • Pricing information: We now provide additional pricing information about images and options on how to cost effectively run them directly within the VM Gallery.

    The above improvements make it even easier to use the VM Gallery and quickly create, launch, and run Virtual Machines in the cloud.

    Virtual Machines: ACL Support for VIPs

    A few months ago we exposed the ability to configure Access Control Lists (ACLs) for Virtual Machines using Windows PowerShell cmdlets and our Service Management API. With today's release, you can now configure VM ACLs using the Windows Azure Management Portal as well. You can now do this by clicking the new Manage ACL command in the Endpoints tab of a virtual machine instance. This will enable you to configure an ordered list of permit and deny rules to scope the traffic that can access your VM's network endpoints. For example, if you were on a virtual network, you could limit RDP access to a Windows Azure virtual machine to only a few computers attached to your enterprise. Or if you weren't on a virtual network you could alternatively limit traffic from public IPs that can access your workloads. Here are the default behaviors for ACLs in Windows Azure:

    • By default (i.e. no rules specified), all traffic is permitted.
    • When using only Permit rules, all other traffic is denied.
    • When using only Deny rules, all other traffic is permitted.
    • When there is a combination of Permit and Deny rules, all other traffic is denied.

    Lastly, remember that configuring endpoints does not automatically configure them within the VM if it also has firewall rules enabled at the OS level. So if you create an endpoint using the Windows Azure Management Portal, Windows PowerShell, or REST API, be sure to also configure your guest VM firewall appropriately as well.

    Web Sites: Web Sockets Support

    With today's release you can now use Web Sockets with Windows Azure Web Sites. This feature enables you to easily integrate real-time communication scenarios within your web based applications, and is available at no extra charge (it even works with the free tier). Higher level programming libraries like SignalR and socket.io are also now supported with it. You can enable Web Sockets support on a web site by navigating to the Configure tab of a Web Site, and by toggling Web Sockets support to "on". Once Web Sockets is enabled you can start to integrate some really cool scenarios into your web applications. Check out the new SignalR documentation hub on www.asp.net to learn more about some of the awesome scenarios you can do with it.

    Web Sites: Remote Debugging Support

    The Windows Azure SDK 2.2 we released two weeks ago introduced remote debugging support for Windows Azure Cloud Services. With today's Windows Azure release we are extending this remote debugging support to also work with Windows Azure Web Sites. With live, remote debugging support inside of Visual Studio, you are able to have more visibility than ever before into how your code is operating live in Windows Azure. It is now super easy to attach the debugger and quickly see what is going on with your application in the cloud.

    Remote Debugging of a Windows Azure Web Site using VS 2013

    Enabling the remote debugging of a Windows Azure Web Site using VS 2013 is really easy. Start by opening up your web application's project within Visual Studio. Then navigate to the "Server Explorer" tab within Visual Studio, and click on the deployed web-site you want to debug that is running within Windows Azure using the Windows Azure->Web Sites node in the Server Explorer. Then right-click and choose the "Attach Debugger" option on it. When you do this Visual Studio will remotely attach the debugger to the Web Site running within Windows Azure. The debugger will then stop the web site's execution when it hits any break points that you have set within your web application's project inside Visual Studio. For example, I set a breakpoint on the "ViewBag.Message" assignment statement within the HomeController of the standard ASP.NET MVC project template. When I hit refresh on the "About" page of the web site within the browser, the breakpoint was triggered and I am now able to debug the app remotely using Visual Studio. Note how we can debug variables (including autos/watchlist/etc), as well as use the Immediate and Command Windows. In the debug session I used the Immediate Window to explore some of the request object state, as well as to dynamically change the ViewBag.Message property. When we click the "Continue" button (or press F5) the app will continue execution and the Web Site will render the content back to the browser. This makes it super easy to debug web apps remotely.

    Tips for Better Debugging

    To get the best experience while debugging, we recommend publishing your site using the Debug configuration within Visual Studio's Web Publish dialog. This will ensure that debug symbol information is uploaded to the Web Site which will enable a richer debug experience within Visual Studio. You can find this option on the Web Publish dialog on the Settings tab. When you ultimately deploy/run the application in production we recommend using the "Release" configuration setting – the release configuration is memory optimized and will provide the best production performance. To learn more about diagnosing and debugging Windows Azure Web Sites read our new Troubleshooting Windows Azure Web Sites in Visual Studio guide.

    Notification Hubs: Segmented Push Notification support with tag expressions

    In August we announced the General Availability of Windows Azure Notification Hubs - a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency from any mobile app back-end. Notification hubs can be used with any mobile app back-end (including ones built using our Mobile Services capability) and can also be used with back-ends that run in the cloud as well as on-premises. Beginning with the initial release, Notification Hubs allowed developers to send personalized push notifications to both individual users as well as groups of users by interest, by associating their devices with tags representing the logical target of the notification. For example, by registering all devices of customers interested in a favorite MLB team with a corresponding tag, it is possible to broadcast one message to millions of Boston Red Sox fans and another message to millions of St. Louis Cardinals fans with a single API call respectively.

    New support for using tag expressions to enable advanced customer segmentation

    With today's release we are adding support for even more advanced customer targeting. You can now identify customers that you want to send push notifications to by defining rich tag expressions. With tag expressions, you can now not only broadcast notifications to Boston Red Sox fans, but take that segmenting a step further and reach more granular segments. This opens up a variety of scenarios, for example:

    • Offers based on multiple preferences: e.g. send a game day vegetarian special to users tagged as both a Boston Red Sox fan AND a vegetarian
    • Push content to multiple segments in a single message: e.g. rain delay information only to users who are tagged as either a Boston Red Sox fan OR a St. Louis Cardinal fan
    • Avoid presenting subsets of a segment with irrelevant content: e.g. season ticket availability reminder to users who are tagged as a Boston Red Sox fan but NOT also a season ticket holder

    To illustrate with code, consider a restaurant chain app that sends an offer related to a Red Sox vs Cardinals game for users in Boston. Devices can be tagged by your app with location tags (e.g. "Loc:Boston") and interest tags (e.g. "Follows:RedSox", "Follows:Cardinals"), and then a notification can be sent by your back-end to "(Follows:RedSox || Follows:Cardinals) && Loc:Boston" in order to deliver an offer to all devices in Boston that follow either the RedSox or the Cardinals. This can be done directly in your server backend send logic using the code below:

        var notification = new WindowsNotification(messagePayload);
        hub.SendNotificationAsync(notification, "(Follows:RedSox || Follows:Cardinals) && Loc:Boston");

    In your expressions you can use all Boolean operators: AND (&&), OR (||), and NOT (!). Some other cool use cases for tag expressions that are now supported include:

    • Social: To "all my group except me" - group:id && !user:id
    • Events: Touchdown event is sent to everybody following either team or any of the players involved in the action: Followteam:A || Followteam:B || followplayer:1 || followplayer:2 …
    • Hours: Send notifications at specific times. E.g. tag devices with time zone and when it is 12pm in Seattle send to: GMT8 && follows:thaifood
    • Versions and platforms: Send a reminder to people still using your first version for Android - version:1.0 && platform:Android

    For help on getting started with Notification Hubs, visit the Notification Hub documentation center. Then download the latest NuGet package (or use the Notification Hubs REST APIs directly) to start sending push notifications using tag expressions. They are really powerful and enable a bunch of great new scenarios.

    TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services

    With today's Windows Azure release we are making it really easy to enable continuous delivery support with Windows Azure and Team Foundation Services. Team Foundation Services is a cloud based offering from Microsoft that provides integrated source control (with both TFS and Git support), build server, test execution, collaboration tools, and agile planning support. It makes it really easy to setup a team project (complete with automated builds and test runners) in the cloud, and it has really rich integration with Visual Studio. With today's Windows Azure release it is now really easy to enable continuous delivery support with both TFS and Git based repositories hosted using Team Foundation Services. This enables a workflow where when code is checked in, built successfully on an automated build server, and all tests pass on it – I can automatically have the app deployed on Windows Azure with zero manual intervention or work required. The steps below demonstrate how to quickly setup a continuous delivery workflow to Windows Azure with a Git-based ASP.NET MVC project hosted using Team Foundation Services.

    Enabling Continuous Delivery to Windows Azure with Team Foundation Services

    The project I'm going to enable continuous delivery with is a simple ASP.NET MVC project whose source code I'm hosting using Team Foundation Services. I did this by creating a "SimpleContinuousDeploymentTest" repository there using Git – and then used the new built-in Git tooling support within Visual Studio 2013 to push the source code to it. I can access the repository within Visual Studio 2013 and easily make commits with it (as well as branch, merge and do other tasks). Using VS 2013 I can also setup automated builds to take place in the cloud using Team Foundation Services every time someone checks in code to the repository. The cool thing about this is that I don't have to buy or rent my own build server – Team Foundation Services automatically maintains its own build server farm and can automatically queue up a build for me (for free) every time someone checks in code using the above settings. This build server (and automated testing) support now works with both TFS and Git based source control repositories.

    Connecting a Team Foundation Services project to Windows Azure

    Once I have a source repository hosted in Team Foundation Services with Automated Builds and Testing set up, I can then go even further and set it up so that it will be automatically deployed to Windows Azure when a source code commit is made to the repository (assuming the Build + Tests pass). Enabling this is now really easy. To set this up with a Windows Azure Web Site simply use the New->Compute->Web Site->Custom Create command inside the Windows Azure Management Portal. I gave the web site a name and then made sure the "Publish from source control" checkbox was selected. When we click next we'll be prompted for the location of the source repository. We'll select "Team Foundation Services". Once we do this we'll be prompted for our Team Foundation Services account that our source repository is hosted under (in this case my TFS account is "scottguthrie"). When we click the "Authorize Now" button we'll be prompted to give Windows Azure permissions to connect to the Team Foundation Services account. Once we do this we'll be prompted to pick the source repository we want to connect to. Starting with today's Windows Azure release you can now connect to both TFS and Git based source repositories. This new support allows me to connect to the "SimpleContinuousDeploymentTest" repository we created earlier. Clicking the finish button will then create the Web Site with the continuous delivery hooks setup with Team Foundation Services. Now every time someone pushes source control to the repository in Team Foundation Services, it will kick off an automated build, run all of the unit tests in the solution, and if they pass the app will be automatically deployed to our Web Site in Windows Azure. You can monitor the history and status of these automated deployments using the Deployments tab within the Web Site. This enables a really slick continuous delivery workflow, and enables you to build and deploy apps in a really nice way.

    Developer Analytics: New Relic support for Web Sites + Mobile Services

    With today's Windows Azure release we are making it really easy to enable Developer Analytics and Monitoring support with both Windows Azure Web Sites and Windows Azure Mobile Services. We are partnering with New Relic, who provide a great dev analytics and app performance monitoring offering, to enable this - and we have updated the Windows Azure Management Portal to make it really easy to configure.

    Enabling New Relic with a Windows Azure Web Site

    Enabling New Relic support with a Windows Azure Web Site is now really easy. Simply navigate to the Configure tab of a Web Site and scroll down to the "developer analytics" section that is now within it. Clicking the "add-on" button will display some additional UI. If you don't already have a New Relic subscription, you can click the "view windows azure store" button to obtain a subscription (note: New Relic has a perpetually free tier so you can enable it even without paying anything). Clicking the "view windows azure store" button will launch the integrated Windows Azure Store experience we have within the Windows Azure Management Portal. You can use this to browse from a variety of great add-on services – including New Relic. Select "New Relic" within the dialog, then click the next button, and you'll be able to choose which type of New Relic subscription you wish to purchase. For this demo we'll simply select the "Free Standard Version" – which does not cost anything and can be used forever. Once we've signed-up for our New Relic subscription and added it to our Windows Azure account, we can go back to the Web Site's configuration tab and choose to use the New Relic add-on with our Windows Azure Web Site. We can do this by simply selecting it from the "add-on" dropdown (it is automatically populated within it once we have a New Relic subscription in our account). Clicking the "Save" button will then cause the Windows Azure Management Portal to automatically populate all of the needed New Relic configuration settings to our Web Site.

    Deploying the New Relic Agent as part of a Web Site

    The final step to enable developer analytics using New Relic is to add the New Relic runtime agent to our web app. We can do this within Visual Studio by right-clicking on our web project and selecting the "Manage NuGet Packages" context menu. This will bring up the NuGet package manager. You can search for "New Relic" within it to find the New Relic agent. Note that there is both a 32-bit and 64-bit edition of it – make sure to install the version that matches how your Web Site is running within Windows Azure (note: you can configure your Web Site to run in either 32-bit or 64-bit mode using the Web Site's "Configuration" tab within the Windows Azure Management Portal). Once we install the NuGet package we are all set to go. We'll simply re-publish the web site again to Windows Azure and New Relic will now automatically start monitoring the application.

    Monitoring a Web Site using New Relic

    Now that the application has developer analytics support with New Relic enabled, we can launch the New Relic monitoring portal to start monitoring the health of it. We can do this by clicking on the "Add Ons" tab in the left-hand side of the Windows Azure Management Portal. Then select the New Relic add-on we signed-up for within it. The Windows Azure Management Portal will provide some default information about the add-on when we do this. Clicking the "Manage" button in the tray at the bottom will launch a new browser tab and single-sign us into the New Relic monitoring portal associated with our account. When we do this a new browser tab will launch with the New Relic admin tool loaded within it. We can now see insights into how our app is performing – without having to have written a single line of monitoring code. The New Relic service provides a ton of great built-in monitoring features allowing us to quickly see:

    • Performance times (including browser rendering speed) for the overall site and individual pages. You can optionally set alert thresholds to trigger if the speed does not meet a threshold you specify.
    • Information about where in the world your customers are hitting the site from (and how performance varies by region)
    • Details on the latency performance of external services your web apps are using (for example: SQL, Storage, Twitter, etc)
    • Error information including call stack details for exceptions that have occurred at runtime
    • SQL Server profiling information – including which queries executed against your database and what their performance was
    • And a whole bunch more…

    The cool thing about New Relic is that you don't need to write monitoring code within your application to get all of the above reports (plus a lot more). The New Relic agent automatically enables the CLR profiler within applications and automatically captures the information necessary to identify these. This makes it super easy to get started and immediately have a rich developer analytics view for your solutions with very little effort. If you haven't tried New Relic out yet with Windows Azure I recommend you do so – I think you'll find it helps you build even better cloud applications. Following the above steps will help you get started and deliver you a really good application monitoring solution in only minutes.

    Service Bus: Support for partitioned queues and topics

    With today's release, we are enabling support within Service Bus for partitioned queues and topics. Enabling partitioning enables you to achieve a higher message throughput and better availability from your queues and topics. Higher message throughput is achieved by implementing multiple message brokers for each partitioned queue and topic. The multiple messaging stores will also provide higher availability. You can create a partitioned queue or topic by simply checking the Enable Partitioning option in the custom create wizard for a Queue or Topic. Read this article to learn more about partitioned queues and topics and how to take advantage of them today.

    Billing: New Billing Alert Service

    Today's Windows Azure update enables a new Billing Alert Service Preview that enables you to get proactive email notifications when your Windows Azure bill goes above a certain monetary threshold that you configure. This makes it easier to manage your bill and avoid potential surprises at the end of the month. With the Billing Alert Service Preview, you can now create email alerts to monitor and manage your monetary credits or your current bill total. To set up an alert first sign-up for the free Billing Alert Service Preview. Then visit the account management page, click on a subscription you have setup, and then navigate to the new Alerts tab that is available. The alerts tab allows you to setup email alerts that will be sent automatically once a certain threshold is hit. For example, by clicking the "add alert" button I can setup a rule to send myself email anytime my Windows Azure bill goes above $100 for the month. The Billing Alert Service will evolve to support additional aspects of your bill as well as support multiple forms of alerts such as SMS. Try out the new Billing Alert Service Preview today and give us feedback.

    Summary

    Today's Windows Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier. If you don't already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today. Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

    Hope this helps,
    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

  • Partner Blog: Hub City Media Introduces iPad Application for Oracle Identity Analytics

    - by Tanu Sood
    About the Writer: Steve Giovannetti is CTO of Hub City Media, Inc., a company that specializes in implementation and product development on the Oracle Identity Management platform. Recently, Hub City Media announced the introduction of the iPad application IdentityCert for Oracle Identity Analytics. This post explores the business use cases and application of IdentityCert.

    Hub City Media (HCM) has been deploying certification solutions based on Oracle Identity Analytics since it first appeared on the market as Vaau RBACx. With each deployment we've seen the same pattern repeat time and time again:

    1. Customers suffering under the weight of manual access certification regimens deploy Oracle Identity Analytics (OIA) for automated certification.
    2. OIA improves the frequency, speed, accuracy, and participation of certifications across the organization.
    3. Then the certifiers, typically managers and supervisors, ask, "Is there any easier way to do these certifications offline?"

    The current version of OIA has a way to export certification data to a spreadsheet. For some customers, we've leveraged this feature and combined it with some of our own custom code to provide a solution based on spreadsheet exports and imports. Customers export the certification to Microsoft Excel, complete it, and then import the spreadsheet to OIA. It worked well for offline certification, but if the user accidentally altered the format of the spreadsheet, the import of the data could fail. We were close to a solution, but it wasn't reliable.

    Over the past few years, we've seen the proliferation of Apple iOS devices, specifically the iPhone and iPad, in the enterprise. As our customers were asking for offline certification, we noticed that the same population of users traditionally responsible for access certification were early adopters of the iPad. The environment seemed ideal for us to create an iPad application to support offline certifications using Oracle Identity Analytics. That's why we created IdentityCert™.

    IdentityCert allows users to view their analytics dashboard, complete user certifications, and resolve policy violations with OIA, from their iPads. The current IdentityCert analytics dashboard displays the same charts that are available in the Oracle Identity Analytics product. However, we plan to expand the number of available analytics in future releases. The main function of IdentityCert is user certification, which can be performed quickly and efficiently using a simple touch interface. Managers tap into a certification and use simple gestures to claim users and certify their access. Certifications can be securely downloaded to IdentityCert and can be completed with or without a network connection. The user can upload the completed certifications once they are connected to a cellular or wi-fi network.

    Oracle Identity Analytics can generate policy violation notifications based on detective scans of the identity warehouse or via preventative analysis of identity access requests. IdentityCert allows users to view all policy violations and resolve or delegate them to appropriate users. IdentityCert also analyzes the policy violation expression and produces more human-friendly descriptions of the policy violation, which improves the ability of users to resolve it. IdentityCert can be deployed quickly into a customer's environment. It is deployed with Hub City Media's ID Services to connect Oracle Identity Analytics securely with the iPad application.

    Oracle Identity Management 11g R2 is an important evolutionary release. Oracle's Identity Management suite has more characteristics of a cohesive platform. This platform provides an integrated set of identity services that can be used to protect, manage, and audit security within the enterprise. At HCM we take the platform concept a step further and see it as an opportunity to create unique solutions for Oracle Identity Management customers. IdentityCert is our commitment to this platform.

    You can download IdentityCert from the Apple iOS App Store today. It includes a demo dataset that you can use to explore the functions of the product without any server infrastructure. Download it. Give it a try. We would appreciate your interest and welcome any feedback.

    Resources:
    Press Release: Hub City Media Introduces iPad Application IdentityCert™ for Oracle Identity Analytics
    App Store Download: http://bit.ly/IdentityCert
    Oracle Identity Governance Suite

  • Taking the training wheels off: Accelerating the Business with Oracle IAM by Brian Mozinski (Accenture)

    - by Greg Jensen
    Today, technical requirements for IAM are evolving rapidly, and the bar is continuously raised for high performance IAM solutions as organizations look to roll out high volume use cases on the back of legacy systems. Existing solutions were often designed and architected to support offline transactions and manual processes, and business owners today demand globally scalable infrastructure to support the growth their business cases are expected to deliver. To help IAM practitioners address these challenges and make their organizations and themselves more successful, in this series we will outline:

    • Taking the training wheels off: Accelerating the Business with Oracle IAM - the explosive growth in expectations for IAM infrastructure, and the business cases they support to gain investment in new security programs.
    • "Necessity is the mother of invention": Technical solutions developed in the field - well proven tricks of the trade, used by IAM gurus to maximize your solution while addressing the requirements of global organizations.
    • The Art & Science of Performance Tuning of Oracle IAM 11gR2 - real world examples of performance tuning with Oracle IAM.
    • Nowhere to go but up: Extending the benefits of accelerated IAM - anything is possible; compelling new solutions organizations are unlocking with accelerated Oracle IAM.

    Let's get started by talking about the changing dynamics driving these discussions.

    Big companies are getting bigger every day, and increasingly organizations operate across state lines, multiple time zones, and in many countries or continents at the same time. No longer is midnight to 6am a safe time to take down the system for upgrades, to run recons, and to import or update user accounts and attributes. Further, IT organizations are operating as shared services, with SLAs at the levels their "clients" expect of telephone carriers. Workers are moved in and out of roles at a weekly, daily, or even hourly rate, and IAM is expected to support those rapid changes. End users registering for services during business hours in Singapore expect their access to be green-lighted in custom apps hosted in Portugal within the hour. Many of the expectations of asynchronous systems and batched updates are no longer adequate, and the number and types of users keep growing.

    When organizations acted more like independent teams at functional or geographic levels, it was manageable to have processes that relied on a handful of people who knew how to make things work, who knew how to get you access to the key systems to get your job done. Today everyone is expected to do more with less: the finance administrator previously supporting their local Atlanta sales office might now be asked to help close the books for the Johannesburg team, and the access certification process once completed monthly by Joan on the 3rd floor is now done by a shared pool of resources in Sao Paulo. Fragmented processes that rely on institutional knowledge to get access to systems and get work done quickly break down in these scenarios. Highly robust processes that have automated workflows for connected or disconnected systems give organizations the dynamic flexibility to share work across these lines and cut costs or increase productivity.

    As the IT industry's computing paradigms continue to change with the passing of time, and as mature or proven approaches become clear, it is normal for organizations to adjust accordingly. Businesses must manage identity in an increasingly hybrid world in which legacy on-premises IAM infrastructures are extended or replaced to support more and more interconnected and interdependent services for a wider range of users. The old legacy IAM implementation models we had relied on to manage identities no longer apply. End users expect to self-request access to services from their tablet, get supervisor approval over mobile devices and email, and launch the application even if it is hosted on the cloud, or run by a partner, vendor, or service provider. While user expectations are higher, they are also simpler: logging into custom desktop apps to request approvals, or going through email or paper based processes for certification, is unacceptable. Users expect security to operate within the paradigm of the application, i.e. to feel like the application they are using.

    Citizen and customer facing applications have evolved from everywhere: custom applications, 3rd party tools, and systems merged in from acquired entities or 3rd party OEM products resold to expand your portfolio of services. These all have their own user stores, authentication models, user lifecycles, session management, etc. Often the designers and developers are no longer accessible and the documentation is limited. Bringing the underlying directories together to scale for growth and improve user experience is critical for revenue, but also for operations.

    Job functions are more dynamic. Take the Olympics, for example: countless organizations, from corporations broadcasting, endorsing, or marketing through the event, to non-profit athletic foundations and public/government entities for athletes and public safety, all operate simultaneously on the world stage. Each organization needs to spin up short-term teams, often dealing with proprietary information from hot ads to racing strategies or security plans. IAM is expected to enable teams to spin up, enable new applications, protect privacy, and secure critical infrastructure. Then it needs to be disabled just as quickly as users go back to their previous responsibilities.

    On a more technical level, optimized system directories, tuning guidelines, and tuning parameters are needed by businesses today. Businesses need to make the right choices (such as virtual directories) by choosing the correct architectural patterns (virtual, direct, replicated, and tuned); the challenge is that businesses must assess and choose among the correct architectural patterns (centralized, virtualized, and distributed).

    Today's business organizations have very complex heterogeneous enterprises that contain diverse and multifaceted information. With today's ever changing global landscape, the strategic end goal for business in challenging times is business agility. The business of identity management requires enterprises to be more agile and more responsive than ever before. The continued proliferation of networked devices (PCs, tablets, PDAs, notebooks, etc.) has caused the number of devices, and the number of users granted access to those devices, to grow exponentially. Businesses need to deploy an IAM system that can account for the demands for authentication and authorization across these devices. Increased innovation is forcing businesses and organizations to centralize their identity management services. Access management needs to handle traditional web based access as well as new innovations around mobile, and it must address insufficient governance processes, which can lead to rogue identity accounts that become a source of vulnerabilities within a business's identity platform.

    Risk based decisions present challenges to business: an adaptive risk model must make proper access decisions via standard web single sign-on for internal and external customers. Organizations have to move beyond simple logins and passwords to address trusted relationship questions such as: Is this a trusted customer, client, or citizen? Is this a trusted employee, vendor, or partner? Is this a trusted device? Without a solid technological foundation, organizational performance, collaboration, constituent services, and all other organizational processes will languish.

    A single server location presents not only network concerns for a distributed user base, but also identity challenges. The network risks center on the latency of the long trip the traffic has to take. Other risks concern performance and availability: if the single identity server is lost, all access is lost.

    As you can see, there are many reasons why performance tuning IAM will have a substantial impact on the success of your organization. In our next installment in the series we roll up our sleeves and get into the detailed tuning techniques used every day by thought leaders in the field implementing Oracle Identity & Access Management solutions.

  • Firewall predefined rule property cannot be modified

    - by Sami-L
    Using a stand-alone Windows Server 2012 Standard edition (no Active Directory), I tried to set up simple Remote Desktop access with a custom port number, but I could not modify the port number in the firewall inbound rule. When I open the inbound rule's properties, I get this message: "This is a predefined rule and some of its properties cannot be modified". I have tried to set it up like this: New Rule - predefined drop-down list - Remote Desktop - check-mark the rules - Allow the connection, but I still get "This is a predefined rule and some of its properties cannot be modified". Thank you in advance.
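
    The predefined Remote Desktop rule is locked to the default port, so the usual route is to create a new custom rule for the custom port instead of editing the built-in one. A minimal sketch from an elevated prompt, assuming 3390 as the example custom port:

        netsh advfirewall firewall add rule name="Remote Desktop (custom port)" dir=in action=allow protocol=TCP localport=3390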

  • Execute a remote command after sudo su to another user in Rundeck

    - by Bera
    I'm new to Rundeck and completely amazed by it, and I'm trying to execute a job. My scenario is detailed below: Rundeck is configured with passwordless SSH authentication between node Server (the Rundeck server) and node Target (a remote Solaris host) for user "master". On node Target I want to execute a script /app/acme/stopApp.sh as user appmanager. Normally and manually, when I need to run the script above, I proceed with ssh master@server sudo su - appmanager, or simply ssh -t master@server 'sudo su - appmanager', which works without a password, and finally run (as appmanager) /app/acme/stopApp.sh. But I'm not sure how I can reproduce these steps using Rundeck. I read in some previous messages that for each job line Rundeck uses a new SSH connection, so the workflow always fails for me with the messages: sudo: no tty present and no askpass program specified / Remote command failed with exit status 1. Please, could someone help me with some information to solve this issue? Without this functionality I wouldn't be able to introduce a little DevOps into my department. I read the user guide and admin guide but couldn't find an easy example to follow, neither in this forum.
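
    The "no tty present" message suggests sudo on the target enforces requiretty and/or asks for a password, neither of which works over Rundeck's non-interactive connection. A minimal sketch of sudoers changes that would allow a one-line, non-interactive job step (edit with visudo; the script path matches the question, the policy itself is an assumption about your environment):

        # /etc/sudoers on node Target
        Defaults:master !requiretty
        master ALL = (appmanager) NOPASSWD: /app/acme/stopApp.sh

    The Rundeck command step can then avoid the interactive su entirely:

        sudo -u appmanager /app/acme/stopApp.sh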

  • How to enable remote device manager?

    - by Petoj
    I have a Hyper-V Server 2012 R2 Core server that's not joined to any domain. What configuration must I do on the server and client to connect with MMC's Device Manager? I have managed to connect to the server with Computer Management, and I can access everything in there except Device Manager (and a few more snap-ins that I'm not interested in). When I click Device Manager I get an error (screenshot not included). I have checked, and both services are running. If possible, I would like a step-by-step guide of what to do on a freshly installed Hyper-V Core server to get everything in Computer Management remotely accessible, including Device Manager.

  • Keeping local windows folder in sync with remote ftp folder in real time

    - by bobo
    I know it has been asked before, but I would like it to happen in real time and transparently (without the need to open a separate FTP client such as FileZilla). For example, if I edit a text file in the local folder and then save it, the tool should immediately detect this and push the changes to the remote folder. It can be unidirectional (changes made in the local folder have to be pushed to the remote folder, but the reverse is not necessary). It should also be possible to specify some excluded files/folders which do not need to be in sync. Is there such an application that you know of?
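
    One tool that fits this description is WinSCP's scripting mode: its keepuptodate command watches a local directory and uploads changes as they happen. A minimal sketch (credentials and paths are placeholders; exclusions can reportedly be added with the -filemask switch):

        rem one-way, near-real-time mirroring of a local folder to an FTP folder
        winscp.com /command ^
            "open ftp://user:password@ftp.example.com/" ^
            "keepuptodate C:\local\folder /remote/folder"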

  • Unknown protocol when trying to connect to remote host with stunnel

    - by RaYell
    I'm trying to set up an stunnel for WebDAV on Windows. I want to connect port 80 on my local interface to port 443 on another machine in my network. I can ping the remote machine. However, when I use the tunnel, I'm getting this error all the time:

        SSL state (accept): before/accept initialization
        SSL_accept: 140760FC: error:140760FC:SSL routines:SSL23_GET_CLIENT_HELLO:unknown protocol

    There is nothing in the logs on the other machine, and here's my stunnel connection config:

        [https]
        accept = 127.0.0.2:80
        connect = 10.0.0.60:443
        verify = 0

    I've set it up to accept all certificates, so this shouldn't be a problem with the self-signed certificate the remote host uses. Does anyone know what might prevent this connection from being established?
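
    For what it's worth, that OpenSSL error normally means stunnel received plain, non-SSL data on its accept side, and that is exactly what this setup does: stunnel defaults to server mode, while this tunnel accepts plain HTTP and should wrap it in SSL toward port 443. A sketch of the same config with client mode enabled:

        ; accept plain HTTP locally, emit SSL toward the remote host
        [https]
        client = yes
        accept = 127.0.0.2:80
        connect = 10.0.0.60:443
        verify = 0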
