Search Results

Search found 11640 results on 466 pages for 'share credentials'.

Page 132 of 466

  • Juju bootstrap, install

    - by Robert G.
    I would like to test MAAS + Juju + OpenStack (I followed the documentation on maas.ubuntu.org). I have already made a test environment: 1 MAAS server, which will also run Juju, and 10 KVM servers for OpenStack. The KVM servers are already in the "ready" state in MAAS. I would like to set up Juju, but I could not, which is driving me crazy. My environments.yaml: environments: maassrv: type: maas maas-server: 'http://${192.168.1.116}/MAAS/' maas-oauth: 'my-key-from-maas' authorized-keys-path: /root/.ssh/id_rsa.pub admin-secret: 1234 default-series: trusty. When I run "juju status -e maassrv": ERROR Unable to connect to environment "maassrv". Please check your credentials or use 'juju bootstrap' to create a new environment. Error details: environment "maassrv" not found. OK, fair enough, so I should run "juju bootstrap -e maassrv": ERROR environment "maassrv" not found. When I run the command without the -e switch: error: no environment specified. So I am stuck here; I have already added the required SSH keys to MAAS too. I have run out of ideas why it isn't working.
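    A correctly indented version of that environments.yaml, using only the asker's own placeholder values, would look roughly like the sketch below. This is just an illustration: juju reads ~/.juju/environments.yaml of the user actually running the command, and it only finds "maassrv" if the nesting under environments: is exact, so a mis-indented file, or running juju as a different user than the one who owns the file, are both plausible explanations for the "environment not found" errors. The literal ${...} wrapper around the IP address is also worth double-checking.

        environments:
          maassrv:
            type: maas
            # plain address; the ${...} wrapper shown in the question is probably unintended
            maas-server: 'http://192.168.1.116/MAAS/'
            maas-oauth: 'my-key-from-maas'
            authorized-keys-path: /root/.ssh/id_rsa.pub
            admin-secret: 1234
            default-series: trusty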

    Read the article

  • Why is permadeath essential to a roguelike design?

    - by Gregory Weir
    Roguelikes and roguelike-likes (Spelunky, The Binding of Isaac) tend to share a number of game design elements: procedurally generated worlds, character growth by way of new abilities and powers, and permanent death. I can understand why starting with permadeath as a premise would lead you to the other ideas: if you're going to be starting over a lot, you'll want variety in your experiences. But why do the first two elements imply a permadeath approach?

    Read the article

  • What’s coming up

    - by GavinPayneUK
    In the last couple of months my community activities list has had things leave it and new things join it, so I thought I would share, and promote, my future plans. Microsoft Certified Architect: SQL Server – Giving back. Preparing for my MCA Board was the hardest, yet in hindsight the most rewarding and interesting, thing I’ve ever done.  The subjects it covers still interest me to the extent that I’m now contributing to the MCA programme itself, allowing the next people through the certification’s...(read more)

    Read the article

  • Quick Poll: Certification Information Preferences

    - by Paul Sorensen
    We're starting a new "quick poll" series so that we can better learn about you - our technical professionals who are either already Oracle certified or working on earning an Oracle credential. We aim to keep them short (~1 minute to answer) so that you'll share your opinion. This week we want to know how you prefer to get your information about Oracle Certification: TAKE THE QUICK POLL. NOTE: You can only take the survey once per machine (if you try a second time it may redirect you to an external website).

    Read the article

  • Unable to list windows shares from terminal.

    - by karthick87
    I am unable to list Windows shares from the terminal. I am getting the following error:

        root@ITSTA2:~# smbclient -L 172.XX.XX.XX -U john
        params.c:Parameter() - Ignoring badly formed line in configuration file: # Samba config file
        WARNING: The "Share modes" option is deprecated
        Unknown parameter encountered: "read Size"
        Ignoring unknown parameter "read Size"
        Enter john's password:
        Unknown socket option SO_KEEPLIVE
        session setup failed: NT_STATUS_LOGON_FAILURE

    Can someone help sort out the problem, please?
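    For context, the warnings come from the client parsing the local smb.conf, while the final NT_STATUS_LOGON_FAILURE is a separate credentials problem (wrong username, password or domain for the remote host). The question does not show the smb.conf itself, so the fragment below is only a hedged sketch of the kind of clean-up those warnings usually point to: remove the line Samba cannot parse, drop the obsolete "share modes" and "read size" parameters, and fix the SO_KEEPLIVE typo in the socket options.

        [global]
           workgroup = WORKGROUP            ; illustrative value, not taken from the question
           ; delete obsolete lines such as "share modes = ..." and "read size = ..."
           ; SO_KEEPLIVE is a typo; the real socket option is SO_KEEPALIVE
           socket options = TCP_NODELAY SO_KEEPALIVE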

    Read the article

  • links for 2011-01-06

    - by Bob Rhubart
    Coming to your town: Oracle Enterprise Cloud Summit – During these full-day events, cloud experts will share real-world best practices, reference architectures, detailed customer case studies, and more. Events scheduled in cities around the world. (tags: oracle otn cloud event)
    Webcast: Security and Compliance for Private Cloud Consolidation – Roxana Bradescu, Senior Director for Oracle Database Security Products, discusses Oracle Database Security Solutions to securely consolidate data and meet compliance requirements within private cloud computing environments. Thursday, January 13, 2011. 10am PST | 1pm EST (tags: oracle cloud security)
    Answering Questions about Mobile Devices | The AppsLab – "How do the numbers of Android and iOS users compare? How often are people switching? Where are all these BlackBerry and Nokia users? Do they plan to jump to Android or iOS? What about webOS? Is it relevant?" Some answers in this AppsLab survey. (tags: oracle otn enterprise2.0 mobilecomputing iphone blackberry android)
    Webcast: Achieve 24/7 Cloud Availability Without Expensive Redundancy – Ashish Ray and Matthew Baier discuss Oracle’s Maximum Availability Architecture and Oracle Database 11g. (tags: oracle cloud highavailability webcast)
    Converting a PV vm back into an HVM vm (Wim Coekaerts Blog) – "I wanted to convert one of my VMs that was based on a paravirt kernel into a vm that just boots as a regular hardware virt VM with a standard x86-64 kernel...It took me a little while to figure out the fastest way so now that I have it pretty much down I wanted to share the steps." - Wim Coekaerts (tags: oracle otn virtualization oraclevm)
    @OTN_Garage: Resources for VirtualBox 4.0 – Rick "@OTN_Garage" Ramsey shares links to several resources for those with a VirtualBox jones. (tags: oracle otn virtualization virtualbox)
    'Federal Service Bus' Helps Belgian Government Speak a Common Language - SOA in Action Blog – "The first SOA-enabled application was developed in less than two months and was fully operational in approximately 10 weeks. In addition, new FSB modules are reusable for other Belgian e-government applications, saving both time and taxpayer dollars." - Joe McKendrick (tags: soa oracle)
    Show Notes: Architects in the Cloud (ArchBeat Podcast) – The complete 4-part interview with Stephen G. Bennett and Archie Reed, the authors of "Silver Clouds, Dark Linings: A Concise Guide to Cloud Computing," is now available. (tags: oracle otn cloud podcast archbeat)

    Read the article

  • Ati Radeon HD 3200 Graphics driver - Installation Problem

    - by samufi
    I have a fresh installation of Ubuntu 12.04 x86 and I am trying to install the proprietary driver for my "Radeon HD 3200 Graphics" video card. I know that there are already many threads about this topic, but I did not find a solution for my problem. For the installation I followed exactly these instructions: What is the correct way to install ATI Catalyst Video Drivers in 12.04 LTS? During the process I faced these problems. I executed

        ~$ debconf libstdc++6 dkms libqtgui4 wget execstack libelfg0 dh-modaliases

    and got:

        debconf: DbDriver "passwords" warning: could not open /var/cache/debconf/passwords.dat: Keine Berechtigung
        Can't exec "libstdc++6": Datei oder Verzeichnis nicht gefunden at /usr/share/perl/5.14/IPC/Open3.pm line 186.
        open2: exec of libstdc++6 dkms libqtgui4 wget execstack libelfg0 dh-modaliases failed at /usr/share/perl5/Debconf/ConfModule.pm line 59

    (Translation of the German parts: "Keine Berechtigung" means "no permission"; "Datei oder Verzeichnis nicht gefunden" means "file or directory not found".) Because I had no idea whether it was a big issue, I just continued:

        ~$ sudo apt-get install ia32-libs

    There I got:

        Paketlisten werden gelesen... Fertig
        Abhängigkeitsbaum wird aufgebaut
        Statusinformationen werden eingelesen... Fertig
        Paket ia32-libs ist nicht verfügbar, wird aber von einem anderen Paket referenziert.
        Das kann heißen, dass das Paket fehlt, dass es abgelöst wurde oder nur aus einer anderen Quelle verfügbar ist.
        E: Paket »ia32-libs« hat keinen Installationskandidaten

    (Translation: reading package lists... done; building dependency tree; reading state information... done; the package ia32-libs is not available but is referenced by another package; this can mean the package is missing, has been obsoleted, or is only available from another source; E: package »ia32-libs« has no installation candidate.) Once more I went on. The next steps worked quite fine. But when I came to

        ~$ sudo dpkg -i *.deb

    I got a popup message, something like there was a problem with a system application, but in the terminal no errors were reported and the packages seemed to be installed. So now the ATI Catalyst Control Center (amdcccle) works, but fglrxinfo gave me:

        X Error of failed request: BadRequest (invalid request code or no such operation)
        Major opcode of failed request: 139 (ATIFGLEXTENSION)
        Minor opcode of failed request: 66 ()
        Serial number of failed request: 13
        Current serial number in output stream: 13

    So there is something wrong. (Also, there is no possibility to enable those nice graphical features, which is the reason why I installed the proprietary driver.) Because I am working with a completely fresh installation, I don't know how to fix the problem. If anybody could help I would be very thankful! =)
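    As an aside, the first failing command looks like a package list that was meant to be passed to apt-get install (the how-to it comes from installs those packages as prerequisites) rather than run on its own, which would explain the "Can't exec" error. A hedged sketch of that step, using only the package names from the question, would be:

        sudo apt-get install debconf libstdc++6 dkms libqtgui4 wget execstack libelfg0 dh-modaliases
        # the later ia32-libs message simply means that package has no installation
        # candidate in the sources enabled on this 12.04 install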

    Read the article

  • What is safe to exclude for a full system backup?

    - by seb
    Hi, I'm looking for a list of which paths/files are safe to exclude from a full system/home backup, considering that I have a list of installed packages.

        /home/*/.thumbnails
        /home/*/.cache
        /home/*/.mozilla/firefox/*.default/Cache
        /home/*/.mozilla/firefox/*.default/OfflineCache
        /home/*/.local/share/Trash
        /home/*/.gvfs/
        /tmp/
        /var/tmp/

    Not real folders, but they can cause severe problems when 'restoring':

        /dev
        /proc
        /sys

    What about /var/ in general? /var/backups/ can get quite large; /var/log/ does not require much space and can help for later comparison; and /lost+found/?
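    For illustration, here is a minimal sketch of how such an exclude list is typically applied with rsync. The destination /mnt/backup is a placeholder, and the exclude set is just the list from the question; whether more of /var should be skipped is exactly the open question above.

        # archive-mode copy of / to a placeholder destination, excluding the paths listed above
        sudo rsync -aAXv \
          --exclude="/dev/*" --exclude="/proc/*" --exclude="/sys/*" \
          --exclude="/tmp/*" --exclude="/var/tmp/*" \
          --exclude="/home/*/.thumbnails" --exclude="/home/*/.cache" \
          --exclude="/home/*/.mozilla/firefox/*.default/Cache" \
          --exclude="/home/*/.mozilla/firefox/*.default/OfflineCache" \
          --exclude="/home/*/.local/share/Trash" --exclude="/home/*/.gvfs" \
          / /mnt/backup/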

    Read the article

  • SQLAuthority News – 5th Anniversary Giveaways

    - by pinaldave
    Please read my 5th Anniversary post and my quick note on the history of the Database. I am sure that we all have friends, and we value friendship more than anything. In fact, the complete model of Facebook is built on friends. If you have lots of friends, you must be a lucky person. Having a lot of friends is indeed a good thing. I consider all you blog readers as my friends, so now I want to do something for you. What is it? Well, send me details about how many of your friends like my page and you will have a chance to win lots of learning materials for yourself and your friends. Here are the exciting prizes awaiting the lucky winner: Combo set of 5 Joes 2 Pros Books – 1 for YOU and 1 for a Friend. This gift is worth USD 444 (each set is USD 222). It contains all five Joes 2 Pros books (Vol1, Vol2, Vol3, Vol4, Vol5) + 1 Learning DVD. [Amazon] | [Flipkart] If you submitted an entry but didn’t win the combo set of 5 Joes 2 Pros books, you could still win my SQL Server Wait Stats book as a consolation prize! I will pick the next 5 participants who have the highest number of friends who “liked” the Facebook page, http://facebook.com/SQLAuth. Instead of sending one copy, I will send you 2 copies so you can share one copy with a friend of yours. Well, it is important to share our learning and love with friends, isn’t it? Note: Just take a screenshot of http://facebook.com/SQLAuth using the Print Screen function and send it by Nov 7th to pinal ‘at’ sqlauthority.com. There are no special freebies for early birds, so take your time and see if you can increase your friends’ like count by Nov 7th. Guess – What is in it? It is quite possible you are not a Facebook or Twitter user. In that case you can still win a surprise from me. You have 2 days to guess what is in this box. If you guess correctly and you are one of the first 5 people to have the correct answer, you will get what is in this box for free. Please note that you have only 48 hours to guess. Please give me your guess by commenting on this blog post. Reference:  Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Pinal Dave, PostADay, Readers Contribution, Readers Question, SQL, SQL Authority, SQL Milestone, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • Legal responsibility for embedding code

    - by Tom Gullen
    On our website we have an HTML5 arcade. For each game there is an "embed this game on your website" copy-and-paste code box. We've done the approval process for games as strictly and safely as possible, and we don't actually think it is possible to have any malicious code in the games. However, we are aware that there's a bunch of people out there smarter than us who might be able to exploit it. For webmasters wanting to copy and paste our games onto their websites, we want to warn them that they are doing it at their own risk - but could we be held responsible if, say, a malicious game hosted on an important website stole their users' credentials and caused them damage? I'm wondering if having an HTML comment in the copy-and-paste code saying "Use at your own risk" is sufficient.
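    Purely to illustrate the mechanism being discussed (the URL, game id and dimensions below are invented, not taken from the site), the copy-and-paste box might hand out something along these lines, with the warning carried as an HTML comment:

        <!-- Use at your own risk: embedded games are provided as-is by the arcade, without warranty. -->
        <iframe src="https://example.com/arcade/embed/12345"
                width="640" height="480" frameborder="0"></iframe>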

    Read the article

  • WIF, ADFS 2 and WCF – Part 6: Chaining multiple Token Services

    - by Your DisplayName here!
    See the previous posts first. So far we looked at the (simpler) scenario where a client acquires a token from an identity provider and uses that for authentication against a relying party WCF service. Another common scenario is that the client first requests a token from an identity provider, and then uses this token to request a new token from a Resource STS or a partner’s federation gateway. This sounds complicated, but is actually very easy to achieve using WIF’s WS-Trust client support. The sequence is like this:

    1. Request a token from an identity provider. You use some “bootstrap” credential for that, like Windows integrated, UserName or a client certificate. The realm used for this request is the identifier of the Resource STS/federation gateway.
    2. Use the resulting token to request a new token from the Resource STS/federation gateway. The realm for this request would be the ultimate service you want to talk to.
    3. Use this resulting token to authenticate against the ultimate service.

    Step 1 is very much the same as the code I have shown in the last post. In the following snippet, I use a client certificate to get a token from my STS:

        private static SecurityToken GetIdPToken()
        {
            var factory = new WSTrustChannelFactory(
                new CertificateWSTrustBinding(SecurityMode.TransportWithMessageCredential),
                idpEndpoint);
            factory.TrustVersion = TrustVersion.WSTrust13;

            factory.Credentials.ClientCertificate.SetCertificate(
                StoreLocation.CurrentUser,
                StoreName.My,
                X509FindType.FindBySubjectDistinguishedName,
                "CN=Client");

            var rst = new RequestSecurityToken
            {
                RequestType = RequestTypes.Issue,
                AppliesTo = new EndpointAddress(rstsRealm),
                KeyType = KeyTypes.Symmetric
            };

            var channel = factory.CreateChannel();
            return channel.Issue(rst);
        }

    To use a token to request another token is slightly different. First the IssuedTokenWSTrustBinding is used, and second the channel factory extension methods are used to send the identity provider token to the Resource STS:

        private static SecurityToken GetRSTSToken(SecurityToken idpToken)
        {
            var binding = new IssuedTokenWSTrustBinding();
            binding.SecurityMode = SecurityMode.TransportWithMessageCredential;

            var factory = new WSTrustChannelFactory(
                binding,
                rstsEndpoint);
            factory.TrustVersion = TrustVersion.WSTrust13;
            factory.Credentials.SupportInteractive = false;

            var rst = new RequestSecurityToken
            {
                RequestType = RequestTypes.Issue,
                AppliesTo = new EndpointAddress(svcRealm),
                KeyType = KeyTypes.Symmetric
            };

            factory.ConfigureChannelFactory();
            var channel = factory.CreateChannelWithIssuedToken(idpToken);
            return channel.Issue(rst);
        }

    For this particular case I chose an ADFS endpoint for issued token authentication (see part 1 for more background). Calling the service now works exactly like I described in my last post. You may now wonder if the same thing can also be achieved using configuration only – absolutely. But there are some gotchas. First of all, the configuration file becomes quite complex. As we discussed in part 4, the bindings must be nested for WCF to unwind the token call-stack. But in this case svcutil cannot resolve the first hop since it cannot use metadata to inspect the identity provider. This binding must be supplied manually. The other issue is around the value for the realm/appliesTo when requesting a token for the R-STS.
    Using the manual approach you have full control over that parameter and you can simply use the R-STS issuer URI. Using the configuration approach, the exact address of the R-STS endpoint will be used. This means that you may have to register multiple R-STS endpoints in the identity provider. Another issue you will run into is that ADFS only accepts its configured issuer URI as a known realm by default. You’d have to manually add more audience URIs for the specific endpoints using the ADFS PowerShell commandlets. I prefer the “manual” approach. That’s it. Hope this is useful information.
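    To round out step 3 of the sequence: the post defers the final call to the previous part, but a minimal sketch of calling the relying-party service with the R-STS token could look roughly like the following. The service contract IService, its GetClaims operation and the svcEndpoint address are placeholders for illustration, not names from the post; only the WIF channel factory extension methods mirror the snippets above.

        // chain the two token requests, then call the service with the resulting token
        var idpToken  = GetIdPToken();
        var rstsToken = GetRSTSToken(idpToken);

        var factory = new ChannelFactory<IService>(
            new WS2007FederationHttpBinding(WSFederationHttpSecurityMode.TransportWithMessageCredential),
            new EndpointAddress(svcEndpoint));

        factory.Credentials.SupportInteractive = false;
        factory.ConfigureChannelFactory();                            // WIF extension method
        var proxy = factory.CreateChannelWithIssuedToken(rstsToken);  // attach the issued token
        Console.WriteLine(proxy.GetClaims());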

    Read the article

  • Customizing Flowcharts in Oracle Tutor

    - by [email protected]
    Today we're going to look at how you can customize the flowcharts within Oracle Tutor procedures, and how you can share those changes with other authors within your company. Here is an image of a flowchart within a Tutor procedure with the default size and color scheme. You may want to change the size of your flowcharts as your end-users might have larger screens or need larger fonts. To change the size and number of columns, navigate to Tutor Author Author Options Flowcharts. The default is to have 4 columns appear in each flowchart, but, if I change it to six, my end-users will see a denser flowchart. This might be too dense for my end-users, so I will change it to 5 columns, and I will also deselect the option to have separate task boxes. Now let's look at how to customize the colors. Within the Flowchart options dialog, there is a button labeled "Colors." This brings up a dialog box of every object on a Tutor flowchart, and I can modify the color of each object, as well as the text within the object. If I click on the background, the "page" object appears in the Item field, and now I can customize the color and the title text by selecting Select Fill Color and/or Select Text Color. A dialog box with color choices appears. If I select Define Custom Colors, I can make my selections even more precise. Each time I change the color of an object, it appears in the selection screen. When the flowchart customization is finished, I can save my changes by naming the scheme. Although the color scheme I have chosen is rather silly looking, perhaps I want others to give me their feedback and make changes as they wish. I can share the color scheme with them by copying the FCP.INI file in the Tutor\Author directory into the same directory on their systems. If the other users have color schemes that they do not want to lose, they can copy the relevant lines from the FCP.INI file into their file. If I flowchart my document with the new scheme, I can see how it looks within the document. Sometimes just one or two changes to the default scheme are enough to customize the flowchart to your company's color palette. I have seen customers who have only changed the Start object to green and the End object to red, and I've seen another customer who changed every object to some variant of black and orange. Experiment! And let us know how you have customized your flowcharts. Mary R. Keane Senior Development Director, Oracle Tutor

    Read the article

  • Which defaults.list should I modify for default applications, and what are the differences between the two?

    - by damien
    I would like to add Miro to the default applications GUI in System Settings / Default Applications. I added ";miro.desktop" next to all rhythmbox.desktop entries, eventually discovering that if it was not added to the audio/x-vorbis+ogg=rhythmbox.desktop entry (as audio/x-vorbis+ogg=rhythmbox.desktop;miro.desktop) it would not appear in the System Settings / Default Applications drop-down list for audio. I can find defaults.list in either /etc/gnome/defaults.list or /usr/share/applications/defaults.list, and modifying either gives me the same results. What is the difference, and which is the correct list to modify?
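    For reference, entries of the kind described above sit under the [Default Applications] group of a defaults.list file; a minimal sketch of the single change the asker mentions (only this mapping, nothing else added) looks like:

        [Default Applications]
        audio/x-vorbis+ogg=rhythmbox.desktop;miro.desktop;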

    Read the article

  • How do your busiest people transfer their knowledge?

    - by Wikis Commit At Area 51
    We have recently polled our company wide wiki users and found out that there are two large groups of users: people with lots of knowledge but (who claim they have) no time to document people with time but (who claim they have) not enough knowledge worth documenting Each group covered almost 50% of the users! How do your companies handle this? That is, how do you encourage your busiest / most knowledgeable people to share their knowledge?

    Read the article

  • Virtual Box - How to open a .VDI Virtual Machine

    - by [email protected]
    How to open a .VDI virtual machine. Sometimes someone shares a virtual machine with us with the extension .VDI, and we may wonder how to open it and what to do with it. Well, the answer is: it is a VirtualBox virtual machine. If you have not downloaded VirtualBox yet, you can do so easily; just follow this post: http://listeningoracle.blogspot.com/2010/04/que-es-virtualbox.html or http://oracleoforacle.wordpress.com/2010/04/14/ques-es-virtualbox/ Now, with VirtualBox installed, open it and proceed with the following:
    1. Open the Virtual Media Manager.
    2. Click on Actions > Add and select the .VDI file. Click "Ok".
    3. Now we can register the new virtual machine. Click New, and click Next.
    4. Write down a name for the virtual machine and proceed to select an operating system and version (in this case it is Linux: Oracle Enterprise Linux or RedHat). Click Next.
    5. Select the base memory amount for the virtual machine (minimum 1280 for our case). Click Next.
    6. Select the disk 11GR2_OEL5_32GB.vdi that was added in the Virtual Media Manager in step 2. Don't forget to leave "Boot Hard Disk (Primary Master)" selected, given that it is the only disk assigned to the virtual machine. Click Next.
    7. Click Finish.
    8. This step is important: once the machine is created, click on the Settings button.
    9. On the General option, click the Advanced settings. Here you must change the default directory for saving your snapshots; my recommendation is to set it to the same directory where the .VDI file is. Otherwise you can end up with the same virtual machine and its snapshots in different paths.
    10. Now click on System and proceed to assign the correct memory (if you did not before). Note: enable "Enable IO APIC" if you are planning to assign more than one CPU to the virtual machine. Define the processors for the virtual machine; if your processor is dual core, choose 2.
    11. Select the video memory amount you want to assign to the virtual machine.
    12. Associate more storage disks to the virtual machine if you have more VDI files (not our case). The disk must be selected as IDE Primary Master.
    13. You can verify the other options, but with these changes you will be able to start the VM. Note: sometimes the VM owner may share some instructions; if so, follow those instructions.
    14. Finally, start the virtual machine (click Start).
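    For completeness, roughly the same registration can be scripted with VBoxManage instead of the GUI. This is only a sketch: the VM name "OEL5" is invented, the .VDI is assumed to sit in the current directory, and the exact storage flags vary a little between VirtualBox versions.

        # create and register a VM, give it memory, IO-APIC and 2 CPUs
        VBoxManage createvm --name "OEL5" --ostype RedHat --register
        VBoxManage modifyvm "OEL5" --memory 1280 --ioapic on --cpus 2
        # attach the shared .VDI as the primary master disk on an IDE controller
        VBoxManage storagectl "OEL5" --name "IDE Controller" --add ide
        VBoxManage storageattach "OEL5" --storagectl "IDE Controller" \
          --port 0 --device 0 --type hdd --medium 11GR2_OEL5_32GB.vdi
        # boot it
        VBoxManage startvm "OEL5"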

    Read the article

  • SQLAuthority News – SQL Server Wait Stats – eBook to Download on Kindle – Answer to FREE PDF Download Request

    - by pinaldave
    Being a book author is a completely new experience for me. I am yet to come across the issues faced by expert book authors; I assume that these interesting issues can be routine ones for them. One of the biggest requests I am getting is about my SQL Server Wait Stats [Amazon] | [Flipkart] | [Kindle] book, which is my humble attempt at writing a book. This is our very first experiment, and the book is the beginning of the subject of SQL Server Wait Stats; we will come up with a new version of the book later next year when we have enough information for the SQL Server 2012 version. Following are the top 2 requests that I keep on receiving in emails, on blogs, Twitter, and Facebook: “Please send us a FREE PDF of your book so we do not have to purchase it.” “If you can share with us the eBook (free and downloadable) format of your book, we will share it with everybody we know and you will get additional exposure.” Here is my response to the abovementioned requests: If you really need my book and cannot purchase it due to financial trouble, then feel free to let me know and I will purchase it myself and ship it to you. If you are in a country where the print book is not available, then you can buy the Kindle book, which is available online in any country, and you can just read it on your computer and mobile devices. You DO NOT have to own a Kindle to read a Kindle-format book. You can freely download the Kindle software for your desired platform and purchase the book online. For the next 5 days, the Kindle book is available at 3.99 in the USA, and in other countries the price is anywhere between 3.99 and 5.99. The price will go up by USD 2 everywhere across the world after 1st November, 2011. Here is the link to download the Kindle software for free for PC, WP7, and in the marketplace for various other mobile devices. I thank you for giving a warm response to the SQL Server Wait Stats book. I am motivated to write the next expanded version of this book. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Announcing: Improvements to the Windows Azure Portal

    - by ScottGu
    Earlier today we released a number of enhancements to the new Windows Azure Management Portal. These new capabilities include:

    Service Bus Management and Monitoring
    Support for Managing Co-administrators
    Import/Export support for SQL Databases
    Virtual Machine Experience Enhancements
    Improved Cloud Service Status Notifications
    Media Services Monitoring Support
    Storage Container Creation and Access Control Support

    All of these improvements are now live in production and available to start using immediately. Below are more details on them.

    Service Bus Management and Monitoring

    The new Windows Azure Management Portal now supports Service Bus management and monitoring. Service Bus provides rich messaging infrastructure that can sit between applications (or between cloud and on-premise environments) and allow them to communicate in a loosely coupled way for improved scale and resiliency. With the new Service Bus experience, you can now create and manage Service Bus Namespaces, Queues, Topics, Relays and Subscriptions. You can also get rich monitoring for Service Bus Queues, Topics and Subscriptions. To create a Service Bus namespace, you can now select the “Service Bus” tab in the Windows Azure portal and then simply select the CREATE command. Doing so will bring up a new “Create a Namespace” dialog that allows you to name and create a new Service Bus Namespace. Once created, you can obtain security credentials associated with the Namespace via the ACCESS KEY command. This gives you the ability to obtain the connection string associated with the service namespace. You can copy and paste these values into any application that requires these credentials (a brief code sketch using such a connection string appears after the next section). It is also now easy to create Service Bus Queues and Topics via the NEW experience in the portal drawer. Simply click the NEW command and navigate to the “App Services” category to create a new Service Bus entity. Once you provision a new Queue or Topic it can be managed in the portal. Clicking on a namespace will display all queues and topics within it. Clicking on an item in the list will allow you to drill down into a dashboard view that allows you to monitor the activity and traffic within it, as well as perform operations on it. For example, for an “orders” queue the dashboard now surfaces both the incoming and outgoing message flow rate, as well as the total queue length and queue size. To monitor pub/sub subscriptions you can use the ADD METRICS command within a topic and select a specific subscription to monitor.

    Support for Managing Co-Administrators

    You can now add co-administrators for your Windows Azure subscription using the new Windows Azure Portal. This allows you to share management of your Windows Azure services with other users. Subscription co-administrators share the same administrative rights and permissions that service administrators have, except a co-administrator cannot change or view billing details about the account, nor remove the service administrator from a subscription. In the SETTINGS section, click on the ADMINISTRATORS tab, and select the ADD button to add a co-administrator to your subscription. To add a co-administrator, you specify the email address for a Microsoft account (formerly Windows Live ID) or an organizational account, and choose the subscription you want to add them to. You can later update the subscriptions that the co-administrator has access to by clicking on the EDIT button, and then selecting or deselecting the subscriptions to which they belong.
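    To make the Service Bus section above a little more concrete, here is a minimal, hypothetical C# sketch of using a connection string copied from the ACCESS KEY dialog to send and receive a message on the “orders” queue from the dashboard example. The namespace name and key are placeholders, and the client API shown is the Windows Azure Service Bus SDK of this period (Microsoft.ServiceBus.Messaging), which is an assumption rather than something the post spells out.

        using System;
        using Microsoft.ServiceBus.Messaging;

        class OrdersQueueSketch
        {
            static void Main()
            {
                // placeholder connection string, as copied from the portal's ACCESS KEY dialog
                var connectionString =
                    "Endpoint=sb://mynamespace.servicebus.windows.net/;" +
                    "SharedSecretIssuer=owner;SharedSecretValue=<key>";

                var client = QueueClient.CreateFromConnectionString(connectionString, "orders");

                client.Send(new BrokeredMessage("new order"));   // outgoing message
                var msg = client.Receive();                       // incoming message
                Console.WriteLine(msg.GetBody<string>());
                msg.Complete();                                   // remove it from the queue
                client.Close();
            }
        }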
    Import/Export Support for SQL Databases

    The Windows Azure administration portal now supports importing and exporting SQL Databases to/from Blob Storage. Databases can be imported/exported to blob storage using the same BACPAC file format that is supported with SQL Server 2012. Among other benefits, this makes it easy to copy and migrate databases between on-premise and cloud environments. SQL Databases now have an EXPORT command in the bottom drawer that, when pressed, will prompt you to save your database to a Windows Azure storage container. The UI allows you to choose an existing storage account or create a new one, as well as the name of the BACPAC file to persist in blob storage. You can also now import and create a new SQL Database by using the NEW command. This will prompt you to select the storage container and file to import the database from. The Windows Azure Portal enables you to monitor the progress of import and export operations. If you choose to log out of the portal, you can come back later and check on the status of all of the operations in the new history tab of the SQL Database server – this shows your entire import and export history and the status (success/fail) of each.

    Enhancements to the Virtual Machine Experience

    One of the common pain-points we have heard from customers using the preview of our new Virtual Machine support has been the inability to delete the associated VHDs when a VM instance (or VM drive) gets deleted. Prior to today’s release the VHDs would continue to be in your storage account and accumulate storage charges. You can now navigate to the Disks tab within the Virtual Machine extension, select a VM disk to delete, and click the DELETE DISK command. When you click the DELETE DISK button you have the option to delete the disk + associated .VHD file (completely clearing it from storage). Alternatively you can delete the disk but still retain a .VHD copy of it in storage.

    Improved Cloud Service Status Notifications

    The Windows Azure portal now exposes more information about the health status of role instances. If any of the instances are in a non-running state, the status at the top of the dashboard will summarize the status (and update automatically as the role health changes). Clicking the instance hyperlink within this status summary view will navigate you to a detailed role instance view, and allow you to get more detailed health status of each of the instances. The portal has been updated to provide more specific status information within this detailed view, giving you better visibility into the health of your app.

    Monitoring Support for Media Services

    Windows Azure Media Services allows you to create media processing jobs (for example: encoding media files) in your Windows Azure Media Services account. In the Windows Azure Portal, you can now monitor the number of encoding jobs that are queued up for processing as well as active, failed and queued tasks for encoding jobs. On your media services account dashboard, you can visualize the monitoring data for the last 6 hours, 24 hours or 7 days.

    Storage Container Creation and Access Control Support

    You can now create Windows Azure Storage containers from within the Windows Azure Portal.
    After selecting a storage account, you can navigate to the CONTAINERS tab and click the ADD CONTAINER command. This will display a dialog that lets you name the new container and control access to it. You can also update the access setting as well as container metadata of existing containers by selecting one and then using the new EDIT CONTAINER command. This will then bring up the edit container dialog that allows you to change and save its settings. In addition to creating and editing containers, you can click on them within the portal to drill in and view blobs within them.

    Summary

    The above features are all now live in production and available to use immediately. If you don’t already have a Windows Azure account, you can sign up for a free trial and start using them today. Visit the Windows Azure Developer Center to learn more about how to build apps with it. We’ll have even more new features and enhancements coming later this month, including support for the recent Windows Server 2012 and .NET 4.5 releases (we will enable new web and worker role images with Windows Server 2012 and .NET 4.5, and support .NET 4.5 with Websites). Keep an eye out on my blog for details as these new features become available. Hope this helps, Scott. P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Sign on Experience with Office 365

    - by Sahil Malik
    SharePoint 2010 Training: more information. Office 365 offers two types of identities:
    · Microsoft Online Services cloud IDs (Cloud Identity): This is the default identity Microsoft provides you; it requires no additional setup. You sign up for Office 365 and you are provided a credential. You can sign in using forms-based authentication, and the password policy for it is stored in the cloud with the Office 365 service. The advantage, obviously, is no additional setup headache. The disadvantage? Yet another password to remember, and no hope of authenticated single sign-on integration using this cloud identity with other services, at least in the current version.
    · Federated IDs (Federated Identity): In companies with on-premises Active Directory, users can sign into Office 365 services using their Active Directory credentials. The corporate Active Directory authenticates the users, and stores and controls the password policy. The advantage here is plenty of single sign-on possibilities and a better user experience. The downside: more... Read full article ....

    Read the article

  • Docky has stopped working since update

    - by Fraekkert
    I'm running Ubuntu 10.10 64-bit. I'm using the Docky PPA, and since the latest update it won't start. If I run it from the terminal, this is what I get:

        [Info 09:21:19.005] Docky version: 2.1.0 bzr docky r1761 ppa
        [Info 09:21:19.024] Kernel version: 2.6.35.24
        [Info 09:21:19.026] CLR version: 2.0.50727.1433
        [Debug 09:21:19.493] [UserArgs] BufferTime = 0
        [Debug 09:21:19.494] [UserArgs] MaxSize = 2147483647
        [Debug 09:21:19.494] [UserArgs] NetbookMode = False
        [Debug 09:21:19.494] [UserArgs] NoPollCursor = False
        [Debug 09:21:19.528] [SystemService] Using org.freedesktop.UPower for battery information
        [Info 09:21:19.564] [ThemeService] Setting theme: Transparent
        [Debug 09:21:19.587] [DesktopItemService] Loading remap file '/usr/share/docky/remaps.ini'.
        [Debug 09:21:19.599] [DesktopItemService] Remapping 'Picasa3.exe' to 'picasa'.
        [Debug 09:21:19.599] [DesktopItemService] Remapping 'nbexec' to 'netbeans'.
        [Debug 09:21:19.599] [DesktopItemService] Remapping 'deja-dup-preferences' to 'deja-dup'.
        [Debug 09:21:19.599] [DesktopItemService] Remapping 'VirtualBox' to 'virtualbox'.
        [Warn 09:21:19.600] [DesktopItemService] Could not find remap file '/home/lasse/.local/share/docky/remaps.ini'!
        [Debug 09:21:19.602] [DesktopItemService] Loading desktop item cache '/home/lasse/.cache/docky/docky.desktop.en_DK.utf8.cache'.
        [Info 09:21:20.101] [DockServices] Dock services initialized.
        [Debug 09:21:20.134] [DBusManager] DBus Registered: org.gnome.Docky
        [Debug 09:21:20.142] [DBusManager] DBus Registered: net.launchpad.DockManager
        Stacktrace:
        at (wrapper managed-to-native) System.IO.MonoIO.Read (intptr,byte[],int,int,System.IO.MonoIOError&) <IL 0x00012, 0x00062>
        at (wrapper managed-to-native) System.IO.MonoIO.Read (intptr,byte[],int,int,System.IO.MonoIOError&) <IL 0x00012, 0x00062>
        at System.IO.FileStream.ReadData (intptr,byte[],int,int) <IL 0x00009, 0x00047>
        at System.IO.FileStream.RefillBuffer () <IL 0x0001c, 0x0002b>
        at System.IO.FileStream.ReadByte () <IL 0x00079, 0x000c7>
        at Mono.Addins.Serialization.BinaryXmlReader.ReadNext () <IL 0x0000b, 0x00031>
        at Mono.Addins.Serialization.BinaryXmlReader.Skip () <IL 0x0003c, 0x00053>
        at Mono.Addins.Serialization.BinaryXmlReader.Skip () <IL 0x00047, 0x0005f>
        at Mono.Addins.Serialization.BinaryXmlReader.Skip () <IL 0x00047, 0x0005f>

    And this .Skip () continues infinitely, and very fast. I've tried cleaning the cache and reinstalling Docky, but without luck.

    Read the article

  • Social Analytics and the Customer

    - by David Dorf
    Many successful retailers put the customer at the center of everything they do, so it's important that the customer is modeled correctly across all their systems.  The path to omni-channel starts and ends with the customer, so at ARTS our next big project is focused on ensuring a consistent representation of customers across our transactional data model, data warehouse model, and XML schemas.  Further, we've started a new whitepaper that describes how Big Data and Social Media Analytics should be leveraged by retailers to add an additional level of customer insight. Let's start by taking a closer look at the meaning of social analytics.  Here's my definition: Social Analytics, in the retail context, describes the analysis of data obtained from social media sources in an effort to better comprehend and interact with the community of consumers.  This discipline seeks to understand what’s being said by the community about brands and products (“monitoring”), as well as understand the behaviors of those in the community (“profiling”).  The results are used to enforce the brand image, improve product decisions, and better focus marketing, all of which lead to increased sales. To help illustrate the facets of social analytics, I drew the diagram below, which was originally published by Retail Touchpoints. There are lots of tools on the market that allow retailers to monitor social media for brand and product mentions.  These include analysis of sentiment, reach, share of voice, engagement, etc.  When your brand is mentioned, good or bad, it's an opportunity to engage with the customer and possibly lead to a sale.  Because products are not always unique, it's much more difficult to monitor product mentions, but detecting product trends early can help a retailer make better merchandising decisions, especially in fashion. Once a retailer understands what's being said, the next step is to learn more about who's saying it.  That involves profiling customers beyond simple demographics to understand their motivations.  Much can be learned from patterns, and even more when customers voluntarily share their data.  Knowing that a customer is passionate about, for example, mountain biking allows the retailer to make relevant offers on helmets, ask for opinions on hydration, and help spread marketing messages. Social analytics has many facets that benefit retailers, some of which are easy but many of which are hard.  It's important for the CMO and CIO to work closely together to plan for these capabilities and monitor the maturity of tools on the market.  This is an area that will separate winners from losers.

    Read the article
