Search Results

Search found 4711 results on 189 pages for 'documents'.


  • Preview and Purchase Ebooks with Kindle for PC

    - by Matthew Guay
    Want to look over a new book, or buy it immediately in ebook format? Here's how you can preview and purchase most new books from your PC the easy way.

    Most new books, including almost all New York Times Bestsellers, are available in ebook format from Amazon's Kindle store. The Kindle store also includes numerous free ebooks, including out-of-print classics and a surprising number of recent books. With the free Kindle for PC reader, you can read any of these ebooks without having to purchase a Kindle device.

    Preview Ebooks Before You Purchase

    Sometimes it can be hard to know whether you want to purchase a new book without reading some of it first. With Kindle for PC, however, you can download a sample of any ebook for free. The sample usually includes the table of contents, the foreword or introduction, and often part or all of the first chapter.

    To get an ebook sample, find the book you want in the Kindle store (link below). Under the Try it free box, select the correct computer or device to send the sample to, and click Send Sample now. Amazon will thank you for your order, even though this is only a free preview. Click the Go to Kindle for PC button to open Kindle and read your ebook preview. Or, if Kindle is already running, press the Refresh button in the top right corner to check for new ebooks and previews. Kindle will synchronize and download the previews you selected.

    The most recently downloaded items show up on the top left. All sample books have a red "Sample" bar at the bottom of their cover, and each cover also includes links to buy the book or view more info about it. Double-click your sample to start reading it. Your ebook sample will usually open at the introduction or the beginning of the first chapter, but you can also view the index, cover, and more.

    When you reach the end of the sample book, you can click a link to buy the book or view more details about it. Strangely, both of these links currently take you to the ebook's page on Amazon.com, but perhaps in the future the Buy link will let you purchase the book directly. You can also click Buy Now on a sample book directly from your Kindle library.

    If you clicked one of these links, you will be returned to the ebook's page on Amazon. Choose the PC or Kindle you want the book delivered to, and this time select Buy Now with 1-Click. Add your payment info if you're not already set up for 1-Click Shopping, and you'll be shown the same Thank You page as before. Refresh Kindle for PC, and your new ebook will automatically download. Strangely, the sample ebook is not automatically removed, so you can right-click on the sample and select Delete this Book. Additionally, your last-read page in the sample is not synced to the purchased book, so you may have to find your place again. Now, enjoy your full ebook!

    Download Free Books for Kindle

    The Kindle Store has an amazing number of free ebooks. Some books may only be free for a limited time as a promotion, while others, such as old classics, may always be free. Either way, once you download one, you can keep it forever. When you find a free ebook you want, select the Kindle or PC you want to download it to and click "Buy now with 1-Click". Notice that this book shows its price as $0.00, but the button still says Buy now. Rest assured, if the book's price shows up as $0.00, you will not be charged anything for downloading it. Your ebook will download as usual after your next refresh. Note that you can still download the sample first if you want, but since the book is free, just download the whole thing and delete it if you don't want it.

    Redownload Your Purchased or Free Books

    If you install Kindle on a new PC or delete a book from your library, you can always re-download it from your Amazon account. Browse to the Manage Your Kindle page on Amazon (link below), sign in with your Amazon account, and scroll down to the list of your purchased content. Select the book you wish to download, then choose the Kindle or PC you want to download it to and press Go.

    Note: There is a "Delete this title" button right below this. If you press the Delete button, you will never be able to re-download that book.

    Or, you can download the book directly from the Archived Items tab in Kindle on your other PC. And if you have your Kindle content on multiple computers, your reading position will be synced via Whispersync. You can start reading on your desktop, and then resume where you left off from your laptop.

    Conclusion

    With these tips and tricks, it is much easier to preview and purchase new books, find and download free ebooks, and re-download any you've deleted from your PC. Have fun filling up your digital library!

    Links: Manage your Kindle account

    Read the article

  • GitHub Integration in Windows Azure Web Site

    - by Shaun
    Microsoft has just announced an update for Windows Azure Web Site (a.k.a. WAWS). There are four major features added to WAWS: free scaling mode, GitHub integration, custom domains and multiple branches. Since I'm working in Node.js and would like to have my code in GitHub and deployed automatically to my Windows Azure Web Site once I sync my code, this feature is great news to me.

    It's very simple to establish the GitHub integration in WAWS. First we need a clean WAWS. In its dashboard page click "Set up Git publishing". Currently WAWS doesn't support changing the publish setting, so if you have an existing WAWS that is published by TFS or local Git, you have to create a new WAWS and set up the Git publishing. Then in the deployment page we can see that WAWS now supports three Git publishing modes:

    - Push my local files to Windows Azure: In this mode we create a new Git repository on the local machine and commit and publish our code to Windows Azure through Git commands or a GUI.
    - Deploy from my GitHub project: In this mode we have a Git repository created on GitHub. Once we publish our code to GitHub, Windows Azure will download the code and trigger a new deployment.
    - Deploy from my CodePlex project: Similar to the previous one, but our code lives in a CodePlex repository.

    Now let's go back to GitHub and create a new publish repository. Currently the WAWS GitHub integration only supports public repositories; support for private repositories will be available in several weeks. We can manage our repositories on the GitHub website, but as a Windows geek I prefer the GUI tool. So I opened GitHub for Windows, logged in with my GitHub account, selected the "github" category and clicked the "add" button to create a new repository on GitHub. You can download GitHub for Windows here. I specified the repository name, description and local repository, and did not check "Keep this code private". After a few seconds it created a new repository on GitHub and associated it with that folder on my local machine. We can find this new repository on the GitHub website, and in GitHub for Windows we can also find the local repository by selecting the "local" category.

    Next, we need to associate this repository with our WAWS. Back in the Windows Azure developer portal, open "Deploy from my GitHub project" in the deployment page and click the "Authorize Windows Azure" link. It brings up a new window on GitHub asking me to allow the Windows Azure application to access my repositories. After we click "Allow", Windows Azure retrieves all my public GitHub repositories and lets me select which one I want to integrate with this WAWS. I selected the one I had just created in GitHub for Windows.

    So that's all; we have completed the GitHub integration configuration. Now let's try it out. In GitHub for Windows, right-click on this local repository and click "open in explorer". Then I added a simple HTML file.

    ```html
    <html>
    <head>
    </head>
    <body>
      <h1>
        I came from GitHub, WOW!
      </h1>
    </body>
    </html>
    ```

    Save it, go back to GitHub for Windows, commit this change and publish. This uploads our changes to GitHub, and Windows Azure will detect the update and trigger a new deployment. If we go back to the Azure developer portal we can find the new deployment, and our commit message is shown as the deployment description as well. And here is the page deployed to WAWS.

    Hope this helps,
    Shaun

    All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • How to Sync Any Folder With SkyDrive on Windows 8.1

    - by Chris Hoffman
    Before Windows 8.1, it was possible to sync any folder on your computer with SkyDrive by using symbolic links. This method no longer works now that SkyDrive is baked into Windows 8.1, but there are other tricks you can use.

    Creating a symbolic link or directory junction inside your SkyDrive folder will give you an empty folder in your SkyDrive cloud storage. Confusingly, the files will appear inside the SkyDrive Modern app as if they were being synced, but they aren't.

    The Solution

    With SkyDrive refusing to understand and accept symbolic links in its own folder, the best option is probably to use symbolic links anyway, but in reverse. For example, let's say you have a program that automatically saves important data to a folder anywhere on your hard drive, whether it's C:\Users\USER\Documents\, C:\Program\Data, or anywhere else. Rather than trying to trick SkyDrive into understanding a symbolic link, we can instead move the actual folder itself into SkyDrive and then use a symbolic link at the folder's original location to trick the original program.

    This may not work for every single program out there, but it will likely work for most programs, which use standard Windows API calls to access folders and save files. We're just flipping the old solution here: we can't trick SkyDrive anymore, so let's try to trick other programs instead.

    Moving a Folder and Creating a Symbolic Link

    First, ensure no program is using the external folder. For example, if it's a program data or settings folder, close the program that's using the folder. Next, simply move the folder to your SkyDrive folder: right-click the external folder, select Cut, go to the SkyDrive folder, right-click and select Paste. The folder will now be located in the SkyDrive folder itself, so it will sync normally.

    Next, open a Command Prompt window as Administrator. Right-click the Start button on the taskbar or press Windows Key + X and select Command Prompt (Administrator) to open it. Run the following command to create a symbolic link at the original location of the folder:

    ```
    mklink /d "C:\Original\Folder\Location" "C:\Users\NAME\SkyDrive\FOLDERNAME\"
    ```

    Enter the correct paths for the exact location of the original folder and the current location of the folder in your SkyDrive. Windows will then create a symbolic link at the folder's original location. Most programs should hopefully be tricked by this symbolic link, saving their files directly to SkyDrive.

    You can test this yourself. Put a file into the folder at its original location. It will be saved to SkyDrive and sync normally, appearing in your SkyDrive storage online. (A small code sketch at the end of this excerpt shows one way to automate such a check.)

    One downside here is that you won't be able to save a file onto SkyDrive without it taking up space on the same hard drive SkyDrive is on. You won't be able to scatter folders across multiple hard drives and sync them all. However, you can always change the location of the SkyDrive folder on Windows 8.1 and put it on a drive with a larger amount of free space. To do this, right-click the SkyDrive folder in File Explorer, select Properties, and use the options on the Location tab. You could even use Storage Spaces to combine the drives into one larger drive.

    Automatically Copy the Original Files to SkyDrive

    Another option would be to run a program that automatically copies files from another folder on your computer to your SkyDrive folder. For example, let's say you want to sync copies of important log files that a program creates in a specific folder. You could use a program that allows you to schedule automatic folder-mirroring, configuring it to regularly copy the contents of your log folder to your SkyDrive folder.

    This may be a useful alternative for some use cases, although it isn't the same as standard syncing. You'll end up with two copies of the files taking up space on your system, which won't be ideal for large files. The files also won't be instantly uploaded to your SkyDrive storage after they're created, but only after the scheduled task runs. There are many options for this, including Microsoft's own SyncToy, which continues to work on Windows 8. If you were using the symbolic link trick to automatically sync copies of PC game save files with SkyDrive, you could just install GameSave Manager. It can be configured to automatically create backup copies of your computer's PC game save files on a schedule, saving them to SkyDrive where they'll be synced and backed up online.

    SkyDrive support was completely rewritten for Windows 8.1, so it's not surprising that this trick no longer works. The ability to use symbolic links in previous versions of SkyDrive was never officially supported, so it's not surprising to see it break after a rewrite. None of the methods above are as convenient and quick as the old symbolic link method, but they're the best we can do with the SkyDrive integration Microsoft has given us in Windows 8.1. It's still possible to use symbolic links to easily sync other folders with competing cloud storage services like Dropbox and Google Drive, so you may want to consider switching away from SkyDrive if this feature is critical to you.
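    As referenced above, here is a minimal C# sketch of such a check, a hedged illustration rather than anything from the original article; the two paths are placeholders for your own symlink and SkyDrive locations:

    ```csharp
    using System;
    using System.IO;

    class SymlinkCheck
    {
        static void Main()
        {
            // Placeholder paths: the symbolic link and the real SkyDrive folder.
            string linkedFile   = @"C:\Original\Folder\Location\test.txt";
            string skyDriveFile = @"C:\Users\NAME\SkyDrive\FOLDERNAME\test.txt";

            // Standard Windows file APIs follow the symbolic link transparently.
            File.WriteAllText(linkedFile, "hello from the linked folder");

            Console.WriteLine(File.Exists(skyDriveFile)
                ? "The file landed in the SkyDrive folder and will sync."
                : "The file did not land in the SkyDrive folder.");
        }
    }
    ```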

    Read the article

  • Html.RenderAction Failed when Validation Failed

    - by Shaun
    The RenderAction method was introduced in the MvcFutures assembly when ASP.NET MVC 1.0 was released, and it was finally announced along with ASP.NET MVC 2.0. Similar to RenderPartial, RenderAction can display HTML markup defined in a partial view inside any parent view. But RenderAction gives us the ability to populate the data from an action which may differ from the action that populates the main view.

    For example, in Home/Index.aspx we can invoke Html.RenderPartial("MyPartialView"), but the data of MyPartialView must be populated by the Index action of the Home controller. If we need MyPartialView to be shown in Product/Create.aspx, we have to copy (or invoke) the relevant code from the Index action in the Home controller into the Create action in the Product controller, which is painful. But if we use Html.RenderAction we can tell ASP.NET MVC which action/controller the data should be populated from. That way, in the Home/Index.aspx and Product/Create.aspx views we just need to call Html.RenderAction("CreateMyPartialView", "MyPartialView") and it will invoke the CreateMyPartialView action in the MyPartialView controller regardless of the main view.

    But in my current project we found a bug: I implemented a RenderAction method in the master page, to show something that needs to connect to the backend data center, and it failed whenever the validation logic failed on some page. I created a sample application below.

    Demo application

    I created an ASP.NET MVC 2 application, and here I need to display the current date and time on the master page. I created an action in the Home controller named TimeSlot and stored the current date into ViewData. This method was marked as HttpGet, as it just retrieves some data instead of changing anything.

    ```csharp
    [HttpGet]
    public ActionResult TimeSlot()
    {
        ViewData["timeslot"] = DateTime.Now;
        return View("TimeSlot");
    }
    ```

    Next, I created a partial view under the Shared folder to display the date and time string.

    ```aspx
    <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<dynamic>" %>

    <span>Now: <%: ViewData["timeslot"].ToString() %></span>
    ```

    Then on the master page I used Html.RenderAction to display it in front of the logon link.

    ```aspx
    <div id="logindisplay">
        <% Html.RenderAction("TimeSlot", "Home"); %>
        <% Html.RenderPartial("LogOnUserControl"); %>
    </div>
    ```

    It's fairly simple and worked well when I navigated to any page. But when I moved to the logon page and clicked the LogOn button without entering anything in username and password, the validation failed and my website crashed with the beautiful yellow page. (I really like its color style and fonts...)

    How ASP.NET MVC executes Html.RenderAction

    In this example all other pages rendered successfully, which means ASP.NET MVC found the TimeSlot action under the Home controller in every situation except this one. The only difference is that when I clicked the LogOn button, the browser sent an HttpPost request to the server. Is that the reason for this bug? I created another action in the Home controller with the same action name, but for HttpPost.

    ```csharp
    [HttpPost]
    [ActionName("TimeSlot")]
    public ActionResult TimeSlot(object dummy)
    {
        return TimeSlot();
    }
    ```

    Or, I can use the AcceptVerbsAttribute on the TimeSlot action to let it allow both HttpGet and HttpPost.

    ```csharp
    [AcceptVerbs("GET", "POST")]
    public ActionResult TimeSlot()
    {
        ViewData["timeslot"] = DateTime.Now;
        return View("TimeSlot");
    }
    ```

    Then I repeated what I did before, and this time it worked well.

    Why do we need the action for HttpPost here, since it's just retrieving data? That is because of how ASP.NET MVC executes the RenderAction method. In the source code of ASP.NET MVC we can see that when performing RenderAction, ASP.NET MVC creates a RequestContext instance from the current RequestContext, and creates a ChildActionMvcHandler instance, which inherits from the MvcHandler class. Then ASP.NET MVC processes the handler through the HttpContext.Server.Execute method. That means it performs the action as a stand-alone request and flushes the result into the TextWriter being used to render the current page. Since the request was an HttpPost when I clicked LogOn, when ASP.NET MVC processed the ChildActionMvcHandler it looked for an action that allows the current request method, which is HttpPost. Our TimeSlot method, restricted to HttpGet, therefore did not match.

    Summary

    In this post I introduced a bug in my current project regarding the new Html.RenderAction method provided with ASP.NET MVC 2 when processing an HttpPost request. In the ASP.NET MVC world the underlying HTTP information is more important than in the ASP.NET WebForms world. We need to pay more attention to what kind of request is currently being created and how ASP.NET MVC processes it.

    Hope this helps,
    Shaun

    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • International Radio Operators Alphabet in F# & Silverlight – Part 2

    - by MarkPearl
    So the brunt of my very complex F# code has been done; now it's just putting the Silverlight stuff in.

    The first thing I did was add a new project to my solution. I gave it a name and VS2010 did the rest of the magic in creating the .Web project etc. In this instance, because I want to take the MVVM approach and make use of commanding, I decided to make the frontend a Silverlight 4 project. I now need to move my F# code into a proper Silverlight Library.

    Warning: when you create the Silverlight Library, VS2010 will ask you whether you want it to be based on Silverlight 3 or Silverlight 4. I originally went for Silverlight 4, only to discover when I tried to compile my solution that I was given an error...

    Error 12 F# runtime for Silverlight version v4.0 is not installed. Please go to http://go.microsoft.com/fwlink/?LinkId=177463 to download and install matching..

    After asking around I discovered that the Silverlight 4 F# runtime is not available yet. No problem; the suggestion was to change the F# Silverlight Library to a Silverlight 3 project. However, when going to the properties of the project file, even though I changed it to Silverlight 3, VS2010 did not like it and kept reverting it to a Silverlight 4 project. After a few minutes of scratching my head I simply deleted the Silverlight 4 F# Library project, created a new F# Silverlight Library project in Silverlight 3, and VS2010 was happy.

    Now that the project structure is set up, the rest is fairly simple. You need to add the Silverlight Library as a reference to the C# Silverlight front end. Then set up your views; since I was following the MVVM pattern, I made Views & ViewModels folders and set up the relevant view and view models. The MainPageViewModel file looks as follows:

    ```csharp
    using System;
    using System.Net;
    using System.Windows;
    using System.Windows.Controls;
    using System.Windows.Documents;
    using System.Windows.Ink;
    using System.Windows.Input;
    using System.Windows.Media;
    using System.Windows.Media.Animation;
    using System.Windows.Shapes;
    using System.Collections.ObjectModel;

    namespace IROAFrontEnd.ViewModels
    {
        public class MainPageViewModel : ViewModelBase
        {
            private string _iroaString;
            private string _inputCharacters;

            public string InputCharacters
            {
                get { return _inputCharacters; }
                set
                {
                    if (_inputCharacters != value)
                    {
                        _inputCharacters = value;
                        OnPropertyChanged("InputCharacters");
                    }
                }
            }

            public string IROAString
            {
                get { return _iroaString; }
                set
                {
                    if (_iroaString != value)
                    {
                        _iroaString = value;
                        OnPropertyChanged("IROAString");
                    }
                }
            }

            public ICommand MySpecialCommand
            {
                get { return new MyCommand(this); }
            }

            public class MyCommand : ICommand
            {
                readonly MainPageViewModel _myViewModel;

                public MyCommand(MainPageViewModel myViewModel)
                {
                    _myViewModel = myViewModel;
                }

                public event EventHandler CanExecuteChanged;

                public bool CanExecute(object parameter)
                {
                    return true;
                }

                public void Execute(object parameter)
                {
                    var result = ModuleMain.ConvertCharsToStrings(_myViewModel.InputCharacters);
                    var newString = "";
                    foreach (var item in result)
                    {
                        newString += item + " ";
                    }
                    _myViewModel.IROAString = newString.Trim();
                }
            }
        }
    }
    ```

    One of the features I like in Silverlight 4 is the new commanding. You will notice I have put the code that references my F# module under the command's Execute method. At the moment this could be cleaned up even more, but it will suffice for now.

    ```csharp
    public void Execute(object parameter)
    {
        var result = ModuleMain.ConvertCharsToStrings(_myViewModel.InputCharacters);
        var newString = "";
        foreach (var item in result)
        {
            newString += item + " ";
        }
        _myViewModel.IROAString = newString.Trim();
    }
    ```

    I then needed to set the view up. If we have a look at MainPageView.xaml, the XAML code looks like the following. Nothing too fancy, battleship grey for now... Take careful note of the binding of the command in the button to MySpecialCommand, which was created in the ViewModel.

    ```xml
    <UserControl x:Class="IROAFrontEnd.Views.MainPageView"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
        xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
        mc:Ignorable="d"
        d:DesignHeight="300" d:DesignWidth="400">
        <Grid x:Name="LayoutRoot" Background="White">
            <Grid.RowDefinitions>
                <RowDefinition/>
                <RowDefinition/>
                <RowDefinition/>
            </Grid.RowDefinitions>
            <TextBox Grid.Row="0" Text="{Binding InputCharacters, Mode=TwoWay}"/>
            <Button Grid.Row="1" Command="{Binding MySpecialCommand}">
                <TextBlock Text="Generate"/>
            </Button>
            <TextBlock Grid.Row="2" Text="{Binding IROAString}"/>
        </Grid>
    </UserControl>
    ```

    Finally, in the App.xaml.cs file we need to set the view up and link it to the ViewModel.

    ```csharp
    private void Application_Startup(object sender, StartupEventArgs e)
    {
        var myView = new MainPageView();
        var myViewModel = new MainPageViewModel();
        myView.DataContext = myViewModel;
        this.RootVisual = myView;
    }
    ```

    Once this was done, hey presto, it worked. I typed in some "Test Input" and clicked the generate button, and the correct Radio Operators Alphabet was generated. And that's the end of my first very basic F# Silverlight application.
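    The F# converter itself was covered in Part 1 and isn't shown in this excerpt. Purely for illustration, a rough C# equivalent of what ModuleMain.ConvertCharsToStrings is assumed to do (map each character to its radio-alphabet word) might look like this; the actual F# implementation may well differ:

    ```csharp
    using System.Collections.Generic;
    using System.Linq;

    static class RadioAlphabet
    {
        // Standard international radio operators (NATO) alphabet.
        static readonly Dictionary<char, string> Map = new Dictionary<char, string>
        {
            {'A',"Alpha"},   {'B',"Bravo"},    {'C',"Charlie"}, {'D',"Delta"},
            {'E',"Echo"},    {'F',"Foxtrot"},  {'G',"Golf"},    {'H',"Hotel"},
            {'I',"India"},   {'J',"Juliet"},   {'K',"Kilo"},    {'L',"Lima"},
            {'M',"Mike"},    {'N',"November"}, {'O',"Oscar"},   {'P',"Papa"},
            {'Q',"Quebec"},  {'R',"Romeo"},    {'S',"Sierra"},  {'T',"Tango"},
            {'U',"Uniform"}, {'V',"Victor"},   {'W',"Whiskey"}, {'X',"X-ray"},
            {'Y',"Yankee"},  {'Z',"Zulu"}
        };

        // Convert each recognized character to its alphabet word,
        // silently skipping anything that isn't a letter.
        public static IEnumerable<string> ConvertCharsToStrings(string input)
        {
            return (input ?? "")
                .ToUpperInvariant()
                .Where(Map.ContainsKey)
                .Select(c => Map[c]);
        }
    }
    ```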

    Read the article

  • Ranking - an Introduction

    - by PointsToShare
    © 2011 By: Dov Trietsch. All rights reserved

    Ranking

    Ranking is quite common on the internet. Readers are asked to rank their latest reading by clicking on one of 5 (sometimes 10) stars. The number of stars is then converted to a number, and the average number of stars as selected by all the readers is proudly (or shamefully) displayed for future readers. SharePoint 2007 lacked this feature altogether. SharePoint 2010 allows users to rank items in a list or documents in a library (the two are actually the same, because a library is actually a list). But in SP2010 the computation of the average is done later on a timer, rather than on the spot as it should be. I suspect that the reason for this shortcoming is that they did not involve a mathematician! Let me explain.

    Rankings are kept in a related list. When a user rates a document, an item is added to the rank list with the item id, the user name, and his number of stars. The fact that a user has already ranked an item prevents him from ranking it again. This prevents the creator of the item from asking his mother to rank it a 5 and do it 753 times, thus stuffing the ballot box. Some systems will allow a user to change his rating, and this is done by updating the rank-list item. Now, when the timer kicks off, the list is spanned, and for each item the rank-list items containing its id are summed up and divided by the number of votes, thus yielding the new average. This is obviously very time consuming and very server intensive.

    In the 18th century an early actuary named James Dodson used what the great Augustus De Morgan (of De Morgan's laws) later named commutation tables. The labor involved in computing a life insurance premium was staggering and also very error prone. Clerks with pencil and paper would multiply and add mountains of numbers to do the task. The more steps, the greater the probability of error and the more expensive the process. Commutation tables created a "summary" of many steps and reduced the work 100-fold. Had Microsoft taken a lesson from the history of computation, they would have developed a way of rating that works in real time and is 100 times faster and less CPU intensive.

    How do we do this? We use a form of commutation: we always keep the number of votes and the total of stars, and one simple division gives us the average. So we write an event receiver. When a vote is added, we just add the stars to the total stars, add 1 to the number of votes, and recompute the average. When a vote is updated, we reduce the total by the old vote, increase it by the new vote, and leave the number of votes the same; again we do the division to get the new average. When a vote is deleted (highly unlikely, and maybe even prohibited), we reduce the total by that vote and reduce the number of votes by 1. Gone are the days of spanning lists, counting items, and tallying votes, and we have no need for a timer process to run it all. (A small sketch of this running-total approach appears at the end of this excerpt.)

    This is the first of a few treatises on ranking. Even though I discussed the math and the history thereof, here I am only going to solve the presentation issue. I wanted to create the CSS and JScript needed to display the stars, create the various effects like hovering and clicking (onmouseover, onmouseout, onclick, etc.), and I wanted to create a general solution for any number of stars. When I had it all done, I created the ranking game so that I could test it. The game is interesting in and of itself, so here it is (or go to the games page and select "rank the stars"). BTW, when you play it, look at the source code and see how it was all done. Next: how the 5 stars are displayed in the New and Update forms. When the whole set of articles is done, you'll be able to create the complete solution.

    That's all folks!
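    To make the arithmetic concrete, here is a minimal sketch of the running-total bookkeeping described above. It is plain C#, not actual SharePoint event receiver code; the method names only indicate which list event would call them:

    ```csharp
    public class RatingAggregate
    {
        public int VoteCount { get; private set; }
        public int TotalStars { get; private set; }

        // One division yields the average; no list spanning, no timer job.
        public double Average
        {
            get { return VoteCount == 0 ? 0 : (double)TotalStars / VoteCount; }
        }

        public void AddVote(int stars)                       // on ItemAdded
        {
            TotalStars += stars;
            VoteCount++;
        }

        public void UpdateVote(int oldStars, int newStars)   // on ItemUpdated
        {
            TotalStars += newStars - oldStars;               // vote count unchanged
        }

        public void DeleteVote(int stars)                    // on ItemDeleted
        {
            TotalStars -= stars;
            VoteCount--;
        }
    }
    ```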

    Read the article

  • How to Share Files Between User Accounts on Windows, Linux, or OS X

    - by Chris Hoffman
    Your operating system provides each user account with its own folders when you set up several different user accounts on the same computer. Shared folders allow you to share files between user accounts. This process works similarly on Windows, Linux, and Mac OS X. These are all powerful multi-user operating systems with similar folder and file permission systems.

    Windows

    On Windows, the "Public" user's folders are accessible to all users. You'll find this folder under C:\Users\Public by default. Files you place in any of these folders will be accessible to other users, so it's a good way to share music, videos, and other types of files between users on the same computer.

    Windows even adds these folders to each user's libraries by default. For example, a user's Music library contains the user's music folder under C:\Users\NAME\ as well as the public music folder under C:\Users\Public\. This makes it easy for each user to find the shared, public files. It also makes it easy to make a file public: just drag and drop a file from the user-specific folder to the public folder in the library. Libraries are hidden by default on Windows 8.1, so you'll have to unhide them to do this.

    These Public folders can also be used to share folders publicly on the local network. You'll find the Public folder sharing option under Advanced sharing settings in the Network and Sharing Control Panel.

    You could also choose to make any folder shared between users, but this will require messing with folder permissions in Windows. To do this, right-click a folder anywhere in the file system and select Properties. Use the options on the Security tab to change the folder's permissions and make it accessible to different user accounts. You'll need administrator access to do this.

    Linux

    This is a bit more complicated on Linux, as typical Linux distributions don't come with a special user folder all users have read-write access to. The Public folder on Ubuntu is for sharing files between computers on a network.

    You can use Linux's permissions system to give other user accounts read or read-write access to specific folders. The process below is for Ubuntu 14.04, but it should be identical on any other Linux distribution using GNOME with the Nautilus file manager. It should be similar for other desktop environments, too.

    Locate the folder you want to make accessible to other users, right-click it, and select Properties. On the Permissions tab, give "Others" the "Create and delete files" permission. Click the Change Permissions for Enclosed Files button and give "Others" the "Read and write" and "Create and Delete Files" permissions.

    Other users on the same computer will then have read and write access to your folder. They'll find it under /home/YOURNAME/folder under Computer. To speed things up, they can create a link or bookmark to the folder so they always have easy access to it.

    Mac OS X

    Mac OS X creates a special Shared folder that all user accounts have access to. This folder is intended for sharing files between different user accounts. It's located at /Users/Shared. To access it, open the Finder and click Go > Computer. Navigate to Macintosh HD > Users > Shared. Files you place in this folder can be accessed by any user account on your Mac.

    These tricks are useful if you're sharing a computer with other people and you all have your own user accounts; maybe your kids have their own limited accounts. You can share a music library, downloads folder, picture archive, videos, documents, or anything else you like without keeping duplicate copies.

    Read the article

  • Goodbye XML… Hello YAML (part 2)

    - by Brian Genisio's House Of Bilz
    Part 1

    After I explained my motivation for using YAML instead of XML for my data, I got a lot of people asking me what type of tooling is available in the .Net space for consuming YAML. In this post, I will discuss a nice tooling option as well as describe some small modifications to leverage the extremely powerful dynamic capabilities of C# 4.0. I will be referring to the following YAML file throughout this post:

    ```yaml
    Recipe:
      Title: Macaroni and Cheese
      Description: My favorite comfort food.
      Author: Brian Genisio
      TimeToPrepare: 30 Minutes
      Ingredients:
        - Name: Cheese
          Quantity: 3
          Units: cups
        - Name: Macaroni
          Quantity: 16
          Units: oz
      Steps:
        - Number: 1
          Description: Cook the macaroni
        - Number: 2
          Description: Melt the cheese
        - Number: 3
          Description: Mix the cooked macaroni with the melted cheese
    ```

    Tooling

    It turns out that there are several implementations of YAML tools out there. The neatest one, in my opinion, is YAML for .NET, Visual Studio and Powershell. It includes a great editor plug-in for Visual Studio as well as YamlCore, which is a parsing engine for .Net. It is still in active development, but it is certainly enough to get you going with YAML in .Net. Start by referencing YamlCore.dll, load your document, and you are on your way. Here is an example of using the parser to get the title of the recipe:

    ```csharp
    var yaml = YamlLanguage.FileTo("Data.yaml") as Hashtable;
    var recipe = yaml["Recipe"] as Hashtable;
    var title = recipe["Title"] as string;
    ```

    In a similar way, you can access data in the Ingredients set:

    ```csharp
    var yaml = YamlLanguage.FileTo("Data.yaml") as Hashtable;
    var recipe = yaml["Recipe"] as Hashtable;
    var ingredients = recipe["Ingredients"] as ArrayList;

    foreach (Hashtable ingredient in ingredients)
    {
        var name = ingredient["Name"] as string;
    }
    ```

    You may have noticed that YamlCore uses non-generic Hashtables and ArrayLists. This is because YamlCore was designed to work in all .Net versions, including 1.0. Everything in the parsed tree is one of three things: a Hashtable, an ArrayList or a value (usually a String). This translates well to the YAML structure, where everything is either a Map, a Set or a Value.

    Taking it further

    Personally, I really dislike writing code like this. Years ago, I promised myself never to write the words Hashtable or ArrayList in my .Net code again. They are ugly, mostly deprecated collections that existed before we got generics in C# 2.0. Now, especially since we have dynamic capabilities in C# 4.0, we can do a lot better than this. With a relatively small amount of code, you can wrap the Hashtables and ArrayLists with a dynamic wrapper (wrapper code at the bottom of this post). The same code can be re-written to look like this:

    ```csharp
    dynamic doc = YamlDoc.Load("Data.yaml");
    var title = doc.Recipe.Title;
    ```

    And:

    ```csharp
    dynamic doc = YamlDoc.Load("Data.yaml");

    foreach (dynamic ingredient in doc.Recipe.Ingredients)
    {
        var name = ingredient.Name;
    }
    ```

    I significantly prefer this code over the previous. That's not all... the magic really happens when we take this concept into WPF. With a single line of code, you can bind to the data dynamically in the view:

    ```csharp
    DataContext = YamlDoc.Load("Data.yaml");
    ```

    Then, your XAML is extremely straightforward (nothing else: no static types, no adapter code, nothing):

    ```xml
    <StackPanel>
        <TextBlock Text="{Binding Recipe.Title}" />
        <TextBlock Text="{Binding Recipe.Description}" />
        <TextBlock Text="{Binding Recipe.Author}" />
        <TextBlock Text="{Binding Recipe.TimeToPrepare}" />

        <TextBlock Text="Ingredients:" FontWeight="Bold" />
        <ItemsControl ItemsSource="{Binding Recipe.Ingredients}" Margin="10,0,0,0">
            <ItemsControl.ItemTemplate>
                <DataTemplate>
                    <StackPanel Orientation="Horizontal">
                        <TextBlock Text="{Binding Quantity}" />
                        <TextBlock Text=" " />
                        <TextBlock Text="{Binding Units}" />
                        <TextBlock Text=" of " />
                        <TextBlock Text="{Binding Name}" />
                    </StackPanel>
                </DataTemplate>
            </ItemsControl.ItemTemplate>
        </ItemsControl>

        <TextBlock Text="Steps:" FontWeight="Bold" />
        <ItemsControl ItemsSource="{Binding Recipe.Steps}" Margin="10,0,0,0">
            <ItemsControl.ItemTemplate>
                <DataTemplate>
                    <StackPanel Orientation="Horizontal">
                        <TextBlock Text="{Binding Number}" />
                        <TextBlock Text=": " />
                        <TextBlock Text="{Binding Description}" />
                    </StackPanel>
                </DataTemplate>
            </ItemsControl.ItemTemplate>
        </ItemsControl>
    </StackPanel>
    ```

    This nifty XAML binding trick only works in WPF, unfortunately. Silverlight handles binding differently, so it doesn't support binding to dynamic objects yet (as of March 2010). This, in my opinion, is a major missing feature in Silverlight, and I really hope we will see it in the Silverlight 4 release. (I am not very optimistic about Silverlight 4, but I can hope for the feature in Silverlight 5, can't I?)

    Conclusion

    I still have a few things I want to say about using YAML in the .Net space, including de-serialization and using IronRuby for your YAML parser, but this post is hopefully enough to see how easy it is to incorporate YAML documents in your code.

    Codeplex Site for YAML tools
    Dynamic wrapper for YamlCore
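    The dynamic wrapper itself is only linked above, not inlined in this excerpt. Purely as a hedged illustration of the idea, a minimal wrapper over YamlCore's Hashtable/ArrayList tree might look something like this; YamlDoc here is a hypothetical stand-in, not the author's actual class:

    ```csharp
    using System.Collections;
    using System.Dynamic;

    // Hypothetical sketch: resolve dynamic member access against the
    // Hashtable / ArrayList tree returned by YamlCore. Assumes a reference
    // to YamlCore and its namespace imported for YamlLanguage.
    public class YamlDoc : DynamicObject
    {
        private readonly Hashtable _map;

        private YamlDoc(Hashtable map) { _map = map; }

        public static dynamic Load(string path)
        {
            return Wrap(YamlLanguage.FileTo(path));
        }

        private static object Wrap(object value)
        {
            var map = value as Hashtable;
            if (map != null) return new YamlDoc(map);

            var list = value as ArrayList;
            if (list != null)
            {
                var wrapped = new ArrayList();
                foreach (var item in list) wrapped.Add(Wrap(item));
                return wrapped;
            }

            return value; // leaf value, usually a string
        }

        public override bool TryGetMember(GetMemberBinder binder, out object result)
        {
            // doc.Recipe.Title resolves here, one member at a time.
            result = Wrap(_map[binder.Name]);
            return _map.ContainsKey(binder.Name);
        }
    }
    ```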

    Read the article

  • Remote Debug Windows Azure Cloud Service

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/11/02/remote-debug-windows-azure-cloud-service.aspx

    On the 22nd of October Microsoft announced the new Windows Azure SDK 2.2. It introduced a lot of cool features, but the one that impressed people most is remote debug support for Windows Azure Cloud Service (a.k.a. WACS).

    Live Debugging is a Nightmare for Cloud Applications

    When we develop against the public cloud, debugging might be the most difficult task, especially after the application has been deployed. To minimize the debugging effort, Microsoft provided a local emulator for cloud service and storage when the Windows Azure platform was announced. By using the local emulator, developers can run their application on the local machine with almost the same behavior as on Windows Azure, and debug it easily and quickly. But once we deploy our application to Azure, we have to debug using logs and the diagnostic monitor, which is very inefficient.

    Visual Studio 2012 introduced a new feature named "anonymous remote debug" which allows any workstation under any user to attach to a remote process. This is less secure compared to authenticated remote debugging, but much easier and simpler to use. Now, with Windows Azure SDK 2.2, we can attach to our application on Windows Azure from our local machine, and it's very easy.

    How to Use the Remote Debugger

    First, let's create a new Windows Azure Cloud project in Visual Studio with an ASP.NET Web Role, then create an ASP.NET WebForm application. Right-click on the cloud project and select "Publish". In the publish dialog we need to make sure the application will be built in debug mode, since a .NET assembly cannot be debugged in release mode. I enabled Remote Desktop because I will log into the virtual machine later in this post; it is NOT necessary for remote debugging. Then select the "advanced settings" tab and make sure "Enable Remote Debugger for all roles" is checked. In WACS, a cloud service can have one or more roles, and each role can have one or more instances. The remote debugger will be enabled for all roles and all instances if we check this box; currently there is no way to specify which role(s) and which instance(s) to enable. Finally, click the "Publish" button. In the Windows Azure activity window in Visual Studio we can find some information about the remote debugger.

    Attaching to the remote process is easy. Open the "Server Explorer" window in Visual Studio and expand the "Cloud Services" node, find the cloud service, role and instance we have just published and want to debug, right-click on the instance and select "Attach Debugger". After a while (depending on how fast our Internet connection to the Windows Azure data center is), Visual Studio switches to debug mode. Let's add a breakpoint in the default web page's form load function and refresh the page in the browser to see what happens. We can see that our application stopped at the breakpoint; the call stack and watch features are all available to use. Now let's hit F5 to continue, and back in the browser we find the page rendered successfully.

    What's Under the Hood

    The remote debugger is a WACS plugin. When we check "Enable Remote Debugger" in the publish dialog, Visual Studio adds two cloud configuration settings to the CSCFG file. Since they are appended at deployment time, we cannot find them in our project's CSCFG file, but if we open the publish package we can find them, as below.

    At the same time, Visual Studio generates a certificate and includes it in the package for the remote debugger. If we go to the Azure management portal we will find a certificate under our application which was created and uploaded by the remote debugger plugin. Since I enabled Remote Desktop, there are two certificates in the screenshot below; the other one is for the remote debugger. When our application is deployed, the Windows Azure system opens the related ports for the remote debugger; as below, you can see there are two new ports opened on my application. Finally, in our WACS virtual machine, the Windows Azure system copies the remote debug component based on which version of Visual Studio we are using, and starts it. Our application can then be debugged remotely through the Visual Studio remote debugger. Below is the task manager on the virtual machine of my WACS application.

    Summary

    In this post I demonstrated one of the features introduced in Windows Azure SDK 2.2: the remote debugger. It allows us to attach to our application on a Windows Azure virtual machine from the local machine once it has been deployed. The remote debugger is powerful and easy to use, but it brings more security risk. And since it's only available for debug builds, performance will be worse than with a release build. Hence we should only use this feature for staging tests and bug fixes (publish a beta version to the Azure staging slot), rather than for production.

    Hope this helps,
    Shaun

    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • A new SQL, a new Analysis Services, a new Workshop! #ssas #sql2012

    - by Marco Russo (SQLBI)
    One week ago Microsoft SQL Server 2012 finally debuted with a virtual launch event, and you can find many intro sessions there (20 minutes each). There is a lot of new content available if you want to learn more about SQL 2012, and in this blog post I'd like to provide a few links to sessions, documents, bits and courses that are available now or very soon.

    First of all, the release of Analysis Services 2012 has finally brought us PowerPivot 2012 (many of us called it PowerPivot v2 before this official name) and also the new Data Mining Add-in for Microsoft Office 2010, now available also for Excel 64-bit! And, of course, don't miss the Microsoft SQL Server 2012 Feature Pack; there are a lot of upgrades for both DBAs and developers. I just discovered there is a new LocalDB version of SQL Express that can run in user mode without any setup. Is this the end of SQL CE?

    But now, back to Analysis Services: if you want some tutorials on Tabular, the Microsoft Virtual Academy has a whole track dedicated to Analysis Services 2012, but you will probably be interested also in the one about Reporting Services 2012.

    If you think that virtual is good but not enough, there are plenty of conferences in the coming months. These are just those where Alberto and I will deliver some SSAS Tabular presentations:

    - SQLBits X, London, March 29-31, 2012: if you are in London or want a good reason to go, this is the most important SQL Server event in Europe this year, no doubt about it. And not only because of the high number of attendees, but also because there is an impressive number of speakers (excluding me, of course) coming from all over the world. This event is second only to the PASS Summit in Seattle, so there is no good reason not to attend it.
    - Microsoft SQL Server & Business Intelligence Conference 2012, Milan, March 28-29, 2012: this is an Italian conference, so the language might be a barrier, but many of us also speak English and the food is good! Just a few seats still available.
    - TechEd North America, Orlando, June 11-14, 2012: you know, this is a big event and it contains everything. If you want to spend a whole day learning the SSAS Tabular model with me and Alberto, don't miss our pre-conference day "Using BISM Tabular in Microsoft SQL Server Analysis Services 2012" (be careful, it is on June 10, a nice study-Sunday!).
    - TechEd Europe, Amsterdam, June 26-29, 2012: the European version of TechEd provides almost the same content, and you don't have to go overseas. We also run the same pre-conference day "Using BISM Tabular in Microsoft SQL Server Analysis Services 2012" (in this case it is on June 25, a regular Monday).

    Alberto and I will also speak at some user group meetings around Europe during... well, we're going to travel a lot in the next months. In fact, if you want complete training on SSAS Tabular, you should spend two days with us in one of our SSAS Tabular Workshops! We prepared a two-day seminar, a very intense one, that starts from simple tabular modeling and covers architecture, DAX, queries, advanced modeling, security, deployment, optimization, monitoring, and relationships with PowerPivot and Multidimensional. Really, there is a lot of stuff here! We announced the first dates in Europe and also an online edition optimized for the Americas' time zones:

    - Apr 16-17, 2012 – Amsterdam, Netherlands
    - Apr 26-27, 2012 – Copenhagen, Denmark
    - May 7-8, 2012 – Online for the Americas' time zones
    - May 14-15, 2012 – Brussels, Belgium
    - May 21-22, 2012 – Oslo, Norway
    - May 24-25, 2012 – Stockholm, Sweden
    - May 28-29, 2012 – London, United Kingdom
    - May 31-Jun 1, 2012 – Milan, Italy (Italian language)

    Also, Chris Webb will join us in this workshop, and for every date you can find who the speaker is on the web site. The course is based on our upcoming book, almost 600 pages (!) about SSAS Tabular, an incredible effort that will be available very soon in preview (Rough Cuts from O'Reilly) and will be on the shelf in May. I will provide a link to order it as soon as we have one!

    And if you think that this is not enough... you're right! Do you know the only thing you can do to optimize your Tabular model? Optimize your DAX code. Learning DAX is easy; mastering DAX requires some knowledge... and our DAX Advanced Workshop will provide exactly the required content. Public classes will be available later this year; for now we just deliver it on demand.

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #006

    - by pinaldave
    Here is the list of selected articles from SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my most favorite articles and listed them here with additional notes below each. Let me know which of the following is your favorite article from memory lane.

    2006

    This was my very first year of blogging, so I was learning something new every day. As I have said many times, blogging was never the intention. I had not really understood what exactly I was working on or beginning when I started blogging in 2006. I never knew that my life was going to change forever once I started blogging. When I look back at all of these years, I am happy that we are here together.

    2007

    IT Outsourcing to India – Top 10 Reasons Companies Outsource
    Outsourcing is about trust, collaboration and success. Helping other countries in need has always been the course of mankind; outsourcing is nothing different from that. With information technology and process improvements increasing the complexity, costs and skills required to accomplish routine tasks as well as challenging complex tasks, companies are outsourcing such tasks to providers who have the expertise to perform them at lower costs, with greater value and quality outcomes.

    UDF – Remove Duplicate Chars From String
    This was a very interesting function I wrote early in my career. I am still using this function when I have to remove duplicate chars from strings. I have yet to come across a scenario where it does not work, so I keep on using it to this day. Please leave a comment if there is any better solution to this problem.

    FIX : Error : 3702 Cannot drop database because it is currently in use
    This is a very generic error when the DROP DATABASE command is executed and the database is not dropped. The common mistake is that the user has kept a connection to the database open while trying to drop it. The database cannot be dropped if there is any other connection open to it. It is always a good idea to take the database into single user mode before dropping it. Here is the quick tutorial on how to bring the database into single user mode: Using T-SQL | Using SSMS.

    2008

    Install SQL Server 2008 – How to Upgrade to SQL Server 2008 – Installation Tutorial
    This was indeed one of the most popular articles on SQL Server 2008. Lots of people wanted to learn how to install SQL Server 2008 but were facing various issues during installation. I built this tutorial, which became a reference point for many.

    Default Collation of SQL Server 2008
    What is the collation of SQL Server 2008 default installations? I often see this question confuse even experienced developers. Well, the answer is in the following image.

    Ahmedabad SQL Server User Group Meeting – November 2008
    User group meetings are fun. Nowadays I am going to user group meetings every week, but there was a time when I was just a beginner on this subject. The community bug caught me years ago when I started to present at the Ahmedabad and Gandhinagar SQL Server User Groups.

    2009

    Validate an XML document in TSQL using XSD
    My friend Jacob Sebastian wrote an excellent article on the subject of XML and XSD. Because of the 'eXtensible' nature of XML (eXtensible Markup Language), there is often a requirement to restrict and validate the content of an XML document against a pre-defined structure and values. XSD (XML Schema Definition Language) is the W3C-recommended language for describing and validating XML documents. SQL Server implements XSD as XML Schema Collections.

    Star Join Query Optimization
    At present, when queries are sent to very large databases, millions of rows are returned, and users have to endure extended query response times when such queries join multiple tables. 'Star Join Query Optimization' is a feature of SQL Server 2008 Enterprise Edition. This mechanism uses bitmap filtering to improve the performance of some types of queries by effectively retrieving rows from fact tables.

    2010

    These puzzles are very interesting and intriguing; there was lots of interest in this subject. If you have free time this weekend, you may want to try them out.

    SQL SERVER – Challenge – Puzzle – Usage of FAST Hint (Solution)
    SQL SERVER – Puzzle – Challenge – Error While Converting Money to Decimal (Solution)
    SQL SERVER – Challenge – Puzzle – Why does RIGHT JOIN Exists (Open)

    Additionally, I had great fun presenting a SQL Server Performance Tuning seminar at fantastic locations in Hyderabad.

    Installing AdventureWorks Database
    This has been the most popular request I have received on my blog. Here is a quick video about how one can install AdventureWorks.

    2011

    Effect of SET NOCOUNT on @@ROWCOUNT
    There was an interesting incident once while I was presenting a session. I wrote some code and suddenly 10 hands went up in the air. This was a bit of a surprise to me, as I did not know why they had all been alerted. I assumed that there was something wrong with the projector, the screen or my display. However, the real reason was very interesting; I suggest you read the complete blog post to understand this interesting scenario.

    Error: Deleting Offline Database and Creating the Same Name
    This is very interesting, because once a user deletes an offline database, the MDF and LDF files still exist, and if the user attempts to create a new database with the same name it will give an error. I found this very interesting, and the blog explains the concept very quickly. Have you ever faced a similar situation?

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • dpkg unsatisfied dependencies, now apt-get wants to remove whole system

    - by Bruno Finger
    Firstly, my terminal output was in Portuguese; it is translated to English below. I am using Ubuntu GNOME 14.04 and I tried to update the GNOME Online Accounts packages by downloading the following .deb files from packages.ubuntu.com for Ubuntu 14.10:

    - libgoa-backend-1.0-dev_3.12.4-1_amd64.deb
    - libgoa-backend-1.0-1_3.12.4-1_amd64.deb
    - libgoa-1.0-dev_3.12.4-1_amd64.deb
    - libgoa-1.0-0b_3.12.4-1_amd64.deb
    - gnome-online-accounts_3.12.4-1_amd64.deb
    - gir1.2-goa-1.0_3.12.4-1_amd64.deb

    After downloading them into the same folder, I ran sudo dpkg -i *.deb, but it did not install the packages; instead it showed errors because the packages they depend on do not meet the required versions (and Ubuntu has no way to install those, since they are not in this version's repositories). So now, every time I want to install anything through apt-get, Ubuntu tells me to run apt-get -f install to fix the errors. This is the list of packages it needs to install/uninstall/update (output translated from Portuguese):

    ```
    $ sudo apt-get -f install
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    Correcting dependencies... Done
    The following packages were automatically installed and are no longer required:
      account-plugin-windows-live gir1.2-gweather-3.0 libatk-bridge2.0-dev
      libatk1.0-dev libcairo-script-interpreter2 libcairo2-dev libexpat1-dev
      libfontconfig1-dev libfreetype6-dev libgdk-pixbuf2.0-dev libglib2.0-dev
      libgtk-3-dev libharfbuzz-dev libharfbuzz-gobject0 libice-dev
      libpango1.0-dev libpcre3-dev libpcrecpp0 libpixman-1-dev libpng12-dev
      libpthread-stubs0-dev librest-dev libsm-dev libsoup2.4-dev libwayland-dev
      libx11-dev libx11-doc libxau-dev libxcb-render0-dev libxcb-shm0-dev
      libxcb1-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev
      libxext-dev libxfixes-dev libxft-dev libxi-dev libxinerama-dev
      libxkbcommon-dev libxml2-dev libxrandr-dev libxrender-dev pkg-config
      signon-plugin-password x11proto-composite-dev x11proto-core-dev
      x11proto-damage-dev x11proto-fixes-dev x11proto-input-dev x11proto-kb-dev
      x11proto-randr-dev x11proto-render-dev x11proto-xext-dev
      x11proto-xinerama-dev xorg-sgml-doctools xtrans-dev zlib1g-dev
    Use 'apt-get autoremove' to remove them.
    The following extra packages will be installed:
      debhelper dh-apparmor libatk-bridge2.0-dev libatk1.0-dev
      libcairo-script-interpreter2 libcairo2-dev libept1.4.12 libexpat1-dev
      libfontconfig1-dev libfreetype6-dev libgdk-pixbuf2.0-dev libglib2.0-dev
      libgtk-3-dev libharfbuzz-dev libharfbuzz-gobject0 libice-dev
      libmail-sendmail-perl libpango1.0-dev libpcre3-dev libpcrecpp0
      libpixman-1-dev libpng12-dev libpthread-stubs0-dev librest-dev libsm-dev
      libsoup2.4-dev libwayland-dev libx11-dev libx11-doc libxau-dev
      libxcb-render0-dev libxcb-shm0-dev libxcb1-dev libxcomposite-dev
      libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev
      libxft-dev libxi-dev libxinerama-dev libxkbcommon-dev libxml2-dev
      libxrandr-dev libxrender-dev pkg-config po-debconf x11proto-composite-dev
      x11proto-core-dev x11proto-damage-dev x11proto-fixes-dev
      x11proto-input-dev x11proto-kb-dev x11proto-randr-dev x11proto-render-dev
      x11proto-xext-dev x11proto-xinerama-dev xorg-sgml-doctools xtrans-dev
      zlib1g-dev
    Suggested packages:
      dh-make apparmor-easyprof libcairo2-doc libglib2.0-doc libgtk-3-doc
      libice-doc libpango1.0-doc imagemagick libsm-doc libsoup2.4-doc
      libxcb-doc libxext-doc libmail-box-perl
    The following packages will be REMOVED:
      account-plugin-aim account-plugin-jabber account-plugin-salut
      account-plugin-yahoo empathy evolution evolution-data-server
      evolution-data-server-online-accounts evolution-indicator
      evolution-plugins gdm gir1.2-gdata-0.0 gir1.2-goa-1.0 gir1.2-zpj-0.0
      gnome-contacts gnome-control-center gnome-documents gnome-online-accounts
      gnome-online-miners gnome-shell gnome-shell-extension-weather
      gnome-shell-extensions grilo-plugins-0.2 gvfs-backends-goa libevolution
      libfolks-eds25 libgdata13 libgoa-1.0-0b libgoa-1.0-dev
      libgoa-backend-1.0-1 libgoa-backend-1.0-dev libzapojit-0.0-0
      mcp-account-manager-uoa nautilus-sendto-empathy ubuntu-gnome-desktop
    The following NEW packages will be installed:
      debhelper dh-apparmor libatk-bridge2.0-dev libatk1.0-dev
      libcairo-script-interpreter2 libcairo2-dev libept1.4.12 libexpat1-dev
      libfontconfig1-dev libfreetype6-dev libgdk-pixbuf2.0-dev libglib2.0-dev
      libgtk-3-dev libharfbuzz-dev libharfbuzz-gobject0 libice-dev
      libmail-sendmail-perl libpango1.0-dev libpcre3-dev libpcrecpp0
      libpixman-1-dev libpng12-dev libpthread-stubs0-dev librest-dev libsm-dev
      libsoup2.4-dev libwayland-dev libx11-dev libx11-doc libxau-dev
      libxcb-render0-dev libxcb-shm0-dev libxcb1-dev libxcomposite-dev
      libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev
      libxft-dev libxi-dev libxinerama-dev libxkbcommon-dev libxml2-dev
      libxrandr-dev libxrender-dev pkg-config po-debconf x11proto-composite-dev
      x11proto-core-dev x11proto-damage-dev x11proto-fixes-dev
      x11proto-input-dev x11proto-kb-dev x11proto-randr-dev x11proto-render-dev
      x11proto-xext-dev x11proto-xinerama-dev xorg-sgml-doctools xtrans-dev
      zlib1g-dev
    0 upgraded, 61 newly installed, 35 to remove and 22 not upgraded.
    7 not fully installed or removed.
    Need to get 12.0 MB of archives.
    After this operation, 25.0 MB of additional disk space will be used.
    Do you want to continue? [Y/n]
    ```

    Among the packages to be removed is even gdm. This is 100% sure to make the system useless. What can I do to fix this issue? I don't care if I can't install the new version of goa anymore.

    Read the article

  • So you want a French Site?

    - by juanlarios
    I thought I would write a quick write-up of how to create a French site in SharePoint 2007. I'm not talking about a Variation, but a plain French site from the ground up. There were some gotchas that I felt were worth blogging about. First: go to the Microsoft TechNet article and follow the install instructions. Make sure that when you get to the download page you select "French" in the drop-down, and that you download and install the right language pack. I noticed that if you do not click the "Change" button, the page reverts to the English language pack even though "French" is selected. Second: You will notice a couple of things. When you go to Central Admin you will see the following: Now you can pick between a French and an English site. You will get this for any other language packs you install, and they will be listed in the drop-down. You will notice that you now have French headings and French listings of sites. You see "Publishing" as a heading because I have a custom site definition that I deployed as a French site. Third: As you start navigating around and trying to create document libraries or sites, you will start getting errors like the following: "Cannot make a cache safe URL for "SelectorControls.js", file not found. Please verify that the file exists under the layouts directory." Once you resolve the issue with this js file, you will find that other js files are missing too. The only problem is that if you are not fluent in French, or whatever language you are trying to deploy, you will have a tough time understanding the error messages, as they will all be in the new language. So let's talk about what happened when you installed the language pack. In the 12 hive, under 12/Template, you will now see a 1033 folder and a 1036 folder. The 1036 folder was created and added as part of the language pack. What the above error is saying is that now that SharePoint is looking at the 1036 folder, some files are missing. The nice thing is that these files are included in the 1033 folder (the English language pack). Simply copy the controls from the one folder to the other (see the copy sketch at the end of this post). There will be more than one conflict, so you will have to move several controls over. I can't remember how many; simply add them as the error messages come up. I had to add some navigation controls and some content selectors. Now that's all you need to install the French language pack and create site collections that are entirely in another language. Do not confuse this with Variations, where you can have multiple language sites. For those of you doing a little bit extra with this, let me share what I was doing and what I needed to get it working. I had a custom site definition which was, unsurprisingly, not showing up in my selection of French sites. I was under the impression that all English sites would show up in French and simply be routed to a new resource file for French content. That is the case, but a little extra needs to be done if you have a custom site definition deployed. First: Under 12/Template/1033/XML there is a listing of the site definition files deployed on the English side of things.
If you navigate to 12/Template/1036/XML and open one of the site definitions, you will see that they are similar and reference the existing site definitions installed on the server, except that they have some French added to the descriptions and names. Simply copy the XML file of your custom template to the 1036 folder to have it show up as a selection when you choose French in the drop-down while creating a site collection. You can go ahead and change the description and name to suit the language it's under. Second: As part of my site definition, I packaged up several list templates saved as STP files. When you navigate to the list template listing, the templates are for English sites, not French, so I could not create document libraries based on them. What now? Here comes KWizCom to the rescue! They have put out an "STP language converter" that takes a site template or list template and converts it to any target language you are after. It's a free download; use it and you're good to go. One thing I will mention is that when I converted the English templates, I first converted them to French-Canadian. And it didn't work! I finally figured out that the French version the French site expected was "French-France". I don't know why that is; it's just what needs to be done to get it working. When I did that, I was able to use the list templates I had created in the English site for the French site. Hope it helps, good luck!
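As a rough sketch of the copy step described above (the hive path is the standard SharePoint 2007 location; check where the missing .js actually lives under the 1033 folder first, since the exact subfolder can differ):

rem Copy a control missing from the French (1036) folder over from the English (1033) folder
set HIVE=C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE
xcopy "%HIVE%\LAYOUTS\1033\SelectorControls.js" "%HIVE%\LAYOUTS\1036\"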

    Read the article

  • Justifiable Perks.

    - by Phil Factor
        I was once the director of a start-up IT Company, and had the task of recruiting a proportion of the management team. As my background was in IT management, I was rather more familiar with recruiting Geeks for technology jobs, but here, one of my early tasks was interviewing a Marketing Director.  The small group of financiers had suggested a rather strange Irishman called  Halleran.  From my background in City of London dealing-rooms, I was slightly unprepared for the experience of interviewing anyone wearing a pink suit. Many of my older City colleagues would have required resuscitation after seeing his white leather shoes. However, nobody will accuse me of prejudging an interviewee. After all, many Linux experts who I’ve come to rely on have appeared for interview dressed as hobbits. In fact, the interview went well, and we had even settled his salary.  I was somewhat unprepared for the coda.    ‘And I will need to be provided with a Ferrari  by the company.’    ‘Hmm. That seems reasonable.’    Initially, he looked startled, and then a slow smile of victory spread across his face.    ‘What colour would you like?’ I asked genially.    ‘It has to be red.’ He looked very earnest on this point.    ‘Fine. I have to go past Hamleys on the way home this evening, so I’ll pick one up then for you.’    ‘Er.. Hamley’s is a toyshop, not a Ferrari Dealership.’    I stared at him in bafflement for a few seconds. ‘You’re not seriously asking for a real Ferrari are you?’     ‘Well, yes. Not for my own sake, you understand. I’d much prefer a simple run-about, but my position demands it. How could I maintain the necessary status in the office without one? How could I do my job in marketing when my grey Datsun was all too visible in the car Park? It is a tool of the job.’    ‘Excuse me a moment, but I must confer with the MD’    I popped out to see Chris, the MD. ‘Chris, I’m interviewing a lunatic in a pink suit who is trying to demand that a Ferrari is a precondition of his employment. I tried the ‘misunderstanding trick’ but it didn’t faze him.’     ‘Sorry, Phil, but we’ve got to hire him. The VCs insist on it. You’ve got to think of something that doesn’t involve committing to the purchase of a Ferrari. Current funding barely covers the rent for the building.’    ‘OK boss. Leave it to me.’    On return, I slapped O’Halleran’s file on the table with a genial, paternalistic smile. ‘Of course you should have a Ferrari. The only trouble is that it will require a justification document that can be presented to the board. I’m sure you’ll have no problem in preparing this document in the required format.’ The initial look of despair was quickly followed by a bland look of acquiescence. He had, earlier in the interview, argued with great eloquence his skill in preparing the tiresome documents that underpin the essential corporate and government deals that were vital to the success of this new enterprise. The justification of a Ferrari should be a doddle.     After the interview, Chris nervously asked how I’d fared.     ‘I think it is all solved.’    ‘… without promising a Ferrari, I hope.’    ‘Well, I did actually; on condition he justified it in writing.’    Chris issued a stream of invective. The strain of juggling the resources in an underfunded startup was beginning to show.    ‘Don’t worry. In the unlikely event of him coming back with the required document, I’ll give him mine.’    ‘Yours?’ He strode over to the window to stare down at the car park.    
He needn’t have worried: I knew that his breed of marketing man could more easily lay an ostrich egg than prepare a decent justification document. My Ferrari is still there at the back of my garage. Few know of the Ferrari cultivator, a simple, inexpensive motorized device designed for the subsistence farmers of southern Italy. It is the very devil to start, but it creates a perfect tilth for the seedbed.

    Read the article

  • A new SQL, a new Analysis Services, a new Workshop! #ssas #sql2012

    - by Marco Russo (SQLBI)
    One week ago Microsoft SQL Server 2012 finally debuted with a virtual launch event, and you can find many intro sessions there (20 minutes each). There is a lot of new content available if you want to learn more about SQL 2012, and in this blog post I’d like to provide a few links to sessions, documents, bits, and courses that are available now or very soon. First of all, the release of Analysis Services 2012 also brings the final release of PowerPivot 2012 (many of us called it PowerPivot v2 before this official name) and the new Data Mining Add-in for Microsoft Office 2010, now available for 64-bit Excel too! And, of course, don’t miss the Microsoft SQL Server 2012 Feature Pack; there are a lot of upgrades for both DBAs and developers. I just discovered there is a new LocalDB version of SQL Express that can run in user mode without any setup. Is this the end of SQL CE? But now, back to Analysis Services: if you want a tutorial on Tabular, the Microsoft Virtual Academy has a whole track dedicated to Analysis Services 2012, but you will probably be interested in the one about Reporting Services 2012 as well. If you think that virtual is good but not enough, there are plenty of conferences in the coming months – these are just the ones where Alberto and I will deliver some SSAS Tabular presentations:
SQLBits X, London, March 29-31, 2012: if you are in London or want a good reason to go, this is the most important SQL Server event in Europe this year, no doubt about it. And not only because of the high number of attendees, but also because there is an impressive number of speakers (excluding me, of course) coming from all over the world. This is an event second only to the PASS Summit in Seattle, so there is no good reason not to attend.
Microsoft SQL Server & Business Intelligence Conference 2012, Milan, March 28-29, 2012: this is an Italian conference so the language might be a barrier, but many of us also speak English and the food is good! Just a few seats still available.
TechEd North America, Orlando, June 11-14, 2012: you know, this is a big event and it contains everything – if you want to spend a whole day learning the SSAS Tabular model with me and Alberto, don’t miss our pre-conference day “Using BISM Tabular in Microsoft SQL Server Analysis Services 2012” (be careful, it is on June 10, a nice study-Sunday!).
TechEd Europe, Amsterdam, June 26-29, 2012: the European version of TechEd provides almost the same content and you don’t have to go overseas. We also run the same pre-conference day “Using BISM Tabular in Microsoft SQL Server Analysis Services 2012” (in this case on June 25, a regular Monday).
Alberto and I will also speak at some user group meetings around Europe during… well, we’re going to travel a lot in the coming months. In fact, if you want complete training on SSAS Tabular, you should spend two days with us in one of our SSAS Tabular Workshops! We prepared a two-day seminar, a very intense one, that starts from simple tabular modeling and covers architecture, DAX, queries, advanced modeling, security, deployment, optimization, monitoring, and relationships with PowerPivot and Multidimensional… Really, there is a lot of material here!
We announced the first dates in Europe, plus an online edition optimized for America’s time zone:
Apr 16-17, 2012 – Amsterdam, Netherlands
Apr 26-27, 2012 – Copenhagen, Denmark
May 7-8, 2012 – Online for America’s time zone
May 14-15, 2012 – Brussels, Belgium
May 21-22, 2012 – Oslo, Norway
May 24-25, 2012 – Stockholm, Sweden
May 28-29, 2012 – London, United Kingdom
May 31-Jun 1, 2012 – Milan, Italy (Italian language)
Chris Webb will also join us in these workshops; for every date you can find the speaker on the web site. The course is based on our upcoming book, almost 600 pages (!) about SSAS Tabular, an incredible effort that will be available very soon in preview (Rough Cuts from O’Reilly) and will be on the shelf in May. I will provide a link to order it as soon as we have one! And if you think that this is not enough… you’re right! Do you know the only thing you can do to optimize your Tabular model? Optimize your DAX code. Learning DAX is easy; mastering DAX requires some knowledge… and our DAX Advanced Workshop will provide exactly the required content. Public classes will be available later this year; for now we deliver it only on demand.

    Read the article

  • Part 1 - 12c Database and WLS - Overview

    - by Steve Felts
    The download of the Oracle 12c database became available on June 25, 2013.  There are some big new features in the 12c database, and WebLogic Server will take advantage of them.  Immediately, we will support using the 12c database and drivers with WLS 10.3.6 and 12.1.1.  When the next version of WLS ships, additional functionality will be supported (the rows in the table below with all "No" values will get a "Yes").  The following table maps the Oracle 12c database features supported with various combinations of currently available WLS releases, 11g and 12c drivers, and 11g and 12c databases. All columns assume WebLogic Server 10.3.6/12.1.1; only the driver and database versions vary:
Feature | 11g drivers, 11gR2 DB | 11g drivers, 12c DB | 12c drivers, 11gR2 DB | 12c drivers, 12c DB
JDBC replay | No | No | No | Yes (Active GridLink only in 10.3.6, add generic in 12.1.1)
Multi Tenant Database | No | Yes (except set container) | No | Yes (except set container)
Dynamic switching between Tenants | No | No | No | No
Database Resident Connection Pooling (DRCP) | No | No | No | No
Oracle Notification Service (ONS) auto configuration | No | No | No | No
Global Database Services (GDS) | No | Yes (Active GridLink only) | No | Yes (Active GridLink only)
JDBC 4.1 (using ojdbc7.jar files & JDK 7) | No | No | Yes | Yes
The My Oracle Support (MOS) document covering this is "WebLogic Server 12.1.1 and 10.3.6 Support for Oracle 12c Database [ID 1564509.1]" at the link https://support.oracle.com/epmos/faces/DocumentDisplay?id=1564509.1. The following documents are also key references: the 12c Oracle Database Developer Guide http://docs.oracle.com/cd/E16655_01/appdev.121/e17620/toc.htm and the 12c Oracle Database Administrator's Guide http://docs.oracle.com/cd/E16655_01/server.121/e17636/toc.htm. I plan to write some related blog articles, not to duplicate existing product documentation, but to introduce the features, provide some examples, and tie together some information to make it easier to understand. How do you get started with 12c?  The easiest way is to point your data source at a 12c database.  The only change on the WLS side is to update the URL in your data source (assuming that you are not just upgrading your database).  You can continue to use the 11.2.0.3 driver jar files that shipped with WLS 10.3.6 or 12.1.1.  You shouldn't see any changes in your application.  You can take advantage of enhancements on the database side that don't affect the mid-tier.  On the WLS side, you can take advantage of using Global Data Services or of connecting to a tenant in a multi-tenant database transparently. If you want to use the 12c client jar files, it's a bit of work, because they aren't shipped with WLS and you can't just drop in ojdbc6.jar as in the old days.  You need to use a matched set of jar files, and they need to come before the existing jar files in the CLASSPATH.  The MOS article is written from the standpoint that you need to get the jar files directly - download almost 1 GB and install an over-600 MB footprint to get 15 jar files.  Assuming that you have the database installed and can get access to the installation (or can ask the DBA), you need to copy the 15 jar files to each machine with a WLS installation and get them into your CLASSPATH.  You can play with setting the PRE_CLASSPATH, but the more practical approach may be to just update WL_HOME/common/bin/commEnv.sh directly.
There's a change in the transaction completion behavior (read the MOS note), so if you think you might run into that, you will want to set -Doracle.jdbc.autoCommitSpecCompliant=false.  Also, if you are running with Active GridLink, you must set -Doracle.ucp.PreWLS1212Compatible=true (how's that for telling you that this is fixed in WLS 12.1.2).  Once you get the configuration out of the way, you can start using the new ojdbc7.jar in place of ojdbc6.jar to get the new JDBC 4.1 APIs.  You can also start using Application Continuity.  This feature is also known as JDBC Replay, because when a connection fails you get a new one with all JDBC operations up to the failure point automatically replayed.  As you might expect, there are some limitations, but it's an interesting feature.  Obviously I'm going to focus on the 12c database features that we can leverage in WLS data sources.  You will need to read other sources or the product documentation to get all of the new features.
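As an illustration of the configuration described above (the PRE_CLASSPATH jars plus the two system properties), the environment changes might look like this in commEnv.sh or setDomainEnv.sh. This is only a sketch: the jar directory is a made-up example, and the real requirement is the full matched 15-jar set from the MOS note, not just these two:

# Put the matched 12c client jars ahead of the stock 11.2.0.3 jars
PRE_CLASSPATH=/u01/jdbc12c/ojdbc7.jar:/u01/jdbc12c/ucp.jar:${PRE_CLASSPATH}
export PRE_CLASSPATH
# Flags discussed above: transaction completion behavior and Active GridLink compatibility
JAVA_OPTIONS="${JAVA_OPTIONS} -Doracle.jdbc.autoCommitSpecCompliant=false -Doracle.ucp.PreWLS1212Compatible=true"
export JAVA_OPTIONS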

    Read the article

  • strange sqares like hints in Silverlight application?

    - by lina
    Good day! Strange square appears on mouse hover on text boxes, buttons, etc (something like hint) in a silverlight navigation application - how can I remove it? a scrin shot an example .xaml page: <Code:BasePage x:Class="CAP.Views.Main" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" mc:Ignorable="d" xmlns:navigation="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Navigation" xmlns:Code="clr-namespace:CAP.Code" d:DesignWidth="640" d:DesignHeight="480" Title="?????? ??????? ???????? ??? ?????"> <Grid x:Name="LayoutRoot"> <Grid.RowDefinitions> <RowDefinition Height="103*" /> <RowDefinition Height="377*" /> </Grid.RowDefinitions> <Grid.ColumnDefinitions> <ColumnDefinition Width="120*" /> <ColumnDefinition Width="520*" /> </Grid.ColumnDefinitions> <Image Height="85" HorizontalAlignment="Left" Name="image1" Stretch="Fill" VerticalAlignment="Top" Width="84" Margin="12,0,0,0" ImageFailed="image1_ImageFailed" Source="/CAP;component/Images/My-Computer.png" /> <TextBlock Grid.Column="1" Height="Auto" TextWrapping="Wrap" HorizontalAlignment="Left" Margin="0,12,0,0" Name="textBlock1" Text="Good day!" VerticalAlignment="Top" FontFamily="Verdana" FontSize="16" Width="345" FontWeight="Bold" /> <TextBlock Grid.Column="1" Grid.Row="1" TextWrapping="Wrap" Height="299" HorizontalAlignment="Left" Name="textBlock2" VerticalAlignment="Top" FontFamily="Verdana" FontSize="14" Width="441" > <Run Text="Some text "/><LineBreak/><LineBreak/><Run Text="and so on"/> <LineBreak/> </TextBlock> </Grid> xaml.cs: using System; using System.Collections.Generic; using System.Linq; using System.Net; using System.Windows; using System.Windows.Controls; using System.Windows.Documents; using System.Windows.Input; using System.Windows.Media; using System.Windows.Media.Animation; using System.Windows.Shapes; using System.Windows.Navigation; using CAP.Code; namespace CAP.Views { public partial class Main : BasePage { public Main() : base() { InitializeComponent(); MapBuilder.AddToMap(new SiteMapUnit() { Caption = "???????", RelativeUrl = "Main" },true); ((App)Application.Current).Mainpage.tvMainMenu.SelectedItems.Clear(); } // Executes when the user navigates to this page. 
protected override void OnNavigatedTo(NavigationEventArgs e) { } private void image1_ImageFailed(object sender, ExceptionRoutedEventArgs e) { } protected override string[] NeededPermission() { return new string[0]; } } } MainPage.xaml <UserControl x:Class="CAP.MainPage" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:Code="clr-namespace:CAP.Code" xmlns:navigation="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Navigation" xmlns:uriMapper="clr-namespace:System.Windows.Navigation;assembly=System.Windows.Controls.Navigation" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" xmlns:telerik="clr-namespace:Telerik.Windows.Controls;assembly=Telerik.Windows.Controls" xmlns:telerikNavigation="clr-namespace:Telerik.Windows.Controls;assembly=Telerik.Windows.Controls.Navigation" mc:Ignorable="d" Margin="0,0,0,0" Width="auto" Height="auto" xmlns:dataInput="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Data.Input"> <ScrollViewer Width="auto" Height="auto" BorderBrush="White" BorderThickness="0" Margin="0,0,0,0" x:Name="sV" HorizontalScrollBarVisibility="Auto" VerticalScrollBarVisibility="Auto" > <ScrollViewer.Content> <Grid Width="auto" Height="auto" x:Name="LayoutRoot" Style="{StaticResource LayoutRootGridStyle}" Margin="0,0,0,0"> <StackPanel Width="auto" Height="auto" Orientation="Vertical" Margin="250,0,0,50"> <Border x:Name="ContentBorder2" Margin="0,0,0,0" > <!--<navigation:Frame Margin="0,0,0,0" Width="auto" Height="auto" x:Name="AnotherFrame" VerticalAlignment="Top" Style="{StaticResource ContentFrameStyle}" Source="/Views/Menu.xaml" NavigationFailed="ContentFrame_NavigationFailed" JournalOwnership="OwnsJournal" Loaded="AnotherFrame_Loaded"> </navigation:Frame>--> <StackPanel Orientation="Vertical" Height="82" Width="Auto" HorizontalAlignment="Right" Margin="0,0,0,0" DataContext="{Binding}"> <TextBlock HorizontalAlignment="Right" Foreground="White" x:Name="ApplicationNameTextBlock4" Style="{StaticResource ApplicationNameStyle}" FontSize="20" Text="?????? ???????" Margin="20,16,20,0"/> <StackPanel Orientation="Horizontal" HorizontalAlignment="Right"> <Image x:Name="imDoor" Visibility="Collapsed" MouseEnter="imDoor_MouseEnter" MouseLeave="imDoor_MouseLeave" Height="24" Stretch="Fill" Width="25" Margin="10,0,10,0" Source="/CAP;component/Images/sm_white_doors.png" MouseLeftButtonDown="bTest_Click" /> <TextBlock x:Name="bLogout" MouseEnter="bLogout_MouseEnter" MouseLeave="bLogout_MouseLeave" TextDecorations="Underline" Margin="0,6,20,4" Height="23" Text="?????" 
HorizontalAlignment="Right" Visibility="Collapsed" MouseLeftButtonDown="bTest_Click" FontFamily="Verdana" FontSize="13" FontWeight="Normal" Foreground="#FF1C1C92" /> </StackPanel> </StackPanel> </Border> <Border x:Name="bSiteMap" Margin="0,0,0,0" > <StackPanel x:Name="spSiteMap" Orientation="Horizontal" Height="20" Width="Auto" HorizontalAlignment="Left" Margin="0,0,0,0" DataContext="{Binding}"> <!-- <TextBlock Visibility="Visible" TextDecorations="Underline" Height="23" HorizontalAlignment="Left" x:Name="ar" Text="1" VerticalAlignment="Top" Foreground="Blue" FontFamily="Verdana" FontSize="13" /> <TextBlock Visibility="Visible" Height="23" HorizontalAlignment="Left" x:Name="Map" Text="->" VerticalAlignment="Top" Foreground="Blue" FontFamily="Verdana" FontSize="13" /> <TextBlock Visibility="Visible" TextDecorations="Underline" Height="23" HorizontalAlignment="Left" x:Name="ar1" Text="2" VerticalAlignment="Top" Foreground="Blue" FontFamily="Verdana" FontSize="13" /> <TextBlock Visibility="Visible" Height="23" HorizontalAlignment="Left" x:Name="Map1" Text="->" VerticalAlignment="Top" Foreground="Blue" FontFamily="Verdana" FontSize="13" /> <TextBlock Visibility="Visible" TextDecorations="Underline" Height="23" HorizontalAlignment="Left" x:Name="ar2" Text="3" VerticalAlignment="Top" Foreground="Blue" FontFamily="Verdana" FontSize="13" />--> </StackPanel> </Border> <Border Width="auto" Height="auto" x:Name="ContentBorder" Margin="0,0,0,0" > <navigation:Frame x:Name="ContentFrame" Style="{StaticResource ContentFrameStyle}" Source="Main" Navigated="ContentFrame_Navigated" NavigationFailed="ContentFrame_NavigationFailed" ToolTipService.ToolTip=" " Margin="0,0,0,0"> <navigation:Frame.UriMapper> <uriMapper:UriMapper> <!--Client--> <uriMapper:UriMapping Uri="RegistrateClient" MappedUri="/Views/Client/RegistrateClient.xaml"/> <!--So on--> </uriMapper:UriMapper> </navigation:Frame.UriMapper> </navigation:Frame> </Border> </StackPanel> <Grid x:Name="NavigationGrid" Style="{StaticResource NavigationGridStyle}" Margin="0,0,0,0" Background="{x:Null}" > <StackPanel Orientation="Vertical" Height="Auto" Width="250" HorizontalAlignment="Center" Margin="0,0,0,50" DataContext="{Binding}"> <Image Width="150" Height="90" HorizontalAlignment="Center" VerticalAlignment="Top" Source="/CAP;component/Images/logo__au.png" Margin="0,20,0,70"/> <Border x:Name="BrandingBorder" MinHeight="222" Width="250" Style="{StaticResource BrandingBorderStyle3}" HorizontalAlignment="Center" Opacity="60" Margin="0,0,0,0"> <Border.Background> <ImageBrush ImageSource="/CAP;component/Images/papka.png"/> </Border.Background> <Grid Width="250" x:Name="LichniyCabinet" Margin="0,10,0,0" HorizontalAlignment="Center" Height="211"> <Grid.ColumnDefinitions> <ColumnDefinition Width="19*" /> <ColumnDefinition Width="62*" /> <ColumnDefinition Width="151*" /> <ColumnDefinition Width="18*" /> </Grid.ColumnDefinitions> <Grid.RowDefinitions> <RowDefinition Height="13" /> <RowDefinition Height="24" /> <RowDefinition Height="35" /> <RowDefinition Height="35" /> <RowDefinition Height="43" /> <RowDefinition Height="28" /> <RowDefinition Height="32*" /> </Grid.RowDefinitions> <TextBlock Visibility="Visible" Grid.Row="2" Height="23" HorizontalAlignment="Left" x:Name="tLogin" Text="?????" VerticalAlignment="Top" FontFamily="Verdana" FontSize="13" Foreground="White" Margin="1,0,0,0" Grid.Column="1" /> <TextBlock Visibility="Visible" FontFamily="Verdana" FontSize="13" Foreground="White" Height="23" HorizontalAlignment="Left" x:Name="tPassw" Text="??????" 
VerticalAlignment="Top" Grid.Row="3" Grid.Column="1" /> <TextBox Visibility="Visible" Grid.Column="2" Grid.Row="2" Height="24" HorizontalAlignment="Left" x:Name="logLogin" VerticalAlignment="Top" Width="150" /> <PasswordBox Visibility="Visible" Code:DefaultButtonService.DefaultButton="{Binding ElementName=bLogin}" PasswordChar="*" Height="24" HorizontalAlignment="Left" x:Name="logPassword" VerticalAlignment="Top" Width="150" Grid.Column="2" Grid.Row="3" /> <Button x:Name="bLogin" MouseEnter="bLogin_MouseEnter" MouseLeave="bLogin_MouseLeave" Visibility="Visible" Content="?????" Grid.Column="2" Grid.Row="4" Click="Button_Click" Height="23" HorizontalAlignment="Left" Margin="81,0,0,0" VerticalAlignment="Top" Width="70" /> <TextBlock MouseLeftButtonDown="ForgotPassword_MouseLeftButtonDown" MouseEnter="ForgotPassword_MouseEnter" MouseLeave="ForgotPassword_MouseLeave" Visibility="Visible" TextDecorations="Underline" Grid.ColumnSpan="2" Grid.Row="4" Height="23" HorizontalAlignment="Left" x:Name="ForgotPassword" Text="?????? ???????" VerticalAlignment="Top" Foreground="White" FontFamily="Verdana" FontSize="13" Grid.Column="1" /> <TextBlock MouseEnter="tbRegistration_MouseEnter" MouseLeave="tbRegistration_MouseLeave" MouseLeftButtonDown="tbRegistration_MouseLeftButtonDown" Grid.Column="2" Grid.Row="6" Height="23" x:Name="tbRegistration" TextDecorations="Underline" Text="???????????" VerticalAlignment="Top" FontFamily="Verdana" FontSize="13" TextAlignment="Center" HorizontalAlignment="Center" Foreground="#FF1C1C92" FontWeight="Normal" Margin="0,0,57,0" /> <TextBlock Cursor="Arrow" Height="23" HorizontalAlignment="Left" Margin="11,-3,0,0" Text="?????? ???????" VerticalAlignment="Top" Grid.ColumnSpan="3" Grid.RowSpan="2" FontFamily="Verdana" FontSize="13" FontWeight="Bold" Foreground="White" /> <Image Visibility="Collapsed" Height="70" x:Name="imUser" Stretch="Fill" Width="70" Grid.ColumnSpan="2" Margin="11,0,0,0" Grid.Row="2" Grid.RowSpan="2" Source="/CAP;component/Images/user2.png" /> <TextBlock x:Name="tbHello" Grid.Column="2" Visibility="Collapsed" Grid.Row="2" Height="auto" TextWrapping="Wrap" HorizontalAlignment="Left" Margin="6,0,0,0" Text="" VerticalAlignment="Top" FontFamily="Verdana" FontSize="13" Foreground="White" Width="145" /> </Grid> </Border> <Border x:Name="MenuBorder" Margin="0,0,0,50" Width="250" Visibility="Collapsed"> <StackPanel x:Name="spMenu" Width="240" HorizontalAlignment="Left"> <telerikNavigation:RadTreeView x:Name="tvMainMenu" Width="240" Selected="TreeView1_Selected" SelectedValuePath="Text" telerik:Theming.Theme="Windows7" FontFamily="Verdana" FontSize="12"/> </StackPanel> </Border> </StackPanel> </Grid> <Border x:Name="FooterBorder" VerticalAlignment="Bottom" Width="auto" Height="76"> <Border.Background> <ImageBrush ImageSource="/CAP;component/Images/footer2.png" /> </Border.Background> <TextBlock x:Name="tbFooter" Height="24" Width="auto" Margin="0,20,0,0" TextAlignment="Center" HorizontalAlignment="Stretch" VerticalAlignment="Center" Foreground="White" FontFamily="Verdana" FontSize="11"> </TextBlock> </Border> </Grid> </ScrollViewer.Content> </ScrollViewer> </UserControl> MainPage.xaml.cs using System; using System.Collections.Generic; using System.Linq; using System.Windows; using System.Windows.Controls; using System.Windows.Documents; using System.Windows.Navigation; using CAP.Code; using CAP.Registrator; using System.Windows.Input; using System.ComponentModel.DataAnnotations; using System.Windows.Browser; using Telerik.Windows.Controls; using System.Net; using 
System.Windows.Media; using System.Windows.Media.Animation; using System.Windows.Navigation; using System.Windows.Shapes; namespace CAP { public partial class MainPage { public App Appvars = Application.Current as App; private readonly RegistratorClient registrator; public SiteMapBuilder builder; public MainPage() { InitializeComponent(); sV.SetIsMouseWheelScrollingEnabled(true); builder = new SiteMapBuilder(spSiteMap); try { //working with service } catch { this.ContentFrame.Navigate(new Uri(String.Format("ErrorPage"), UriKind.RelativeOrAbsolute)); } } /// Recursive method to update the correct scrollviewer (if exists) private ScrollViewer CheckParent(FrameworkElement element) { ScrollViewer _result = element as ScrollViewer; if (element != null && _result == null) { FrameworkElement _temp = element.Parent as FrameworkElement; _result = CheckParent(_temp); } return _result; } // If an error occurs during navigation, show an error window private void ContentFrame_NavigationFailed(object sender, NavigationFailedEventArgs e) { e.Handled = true; ChildWindow errorWin = new ErrorWindow(e.Uri); errorWin.Show(); } } }
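One thing worth checking, judging purely from the markup above: the navigation Frame in MainPage.xaml carries ToolTipService.ToolTip=" " (a single space), and a whitespace-only tooltip renders as exactly the kind of small empty square on hover described here. Removing that attribute from the Frame should make the square disappear:

<navigation:Frame x:Name="ContentFrame" Style="{StaticResource ContentFrameStyle}"
                  Source="Main" Navigated="ContentFrame_Navigated"
                  NavigationFailed="ContentFrame_NavigationFailed" Margin="0,0,0,0">
    <!-- ToolTipService.ToolTip=" " removed; the UriMapper contents stay unchanged -->
    ...
</navigation:Frame>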

    Read the article

  • Delving into design patterns, and what that means for the Oracle user experience

    - by Kathy.Miedema
    By Kathy Miedema, Oracle Applications User Experience George Hackman, Senior Director, Applications User Experiences The Oracle Applications User Experience team has some exciting things happening around Fusion Applications design patterns. Because we’re hoping to have some new offerings soon (stay tuned with VoX to see what’s in the pipeline around Fusion Applications design patterns), now is a good time to talk more about what design patterns can do for the individual user as well as the entire company. George Hackman, Senior Director of Operations User Experience, says the first thing to note is that user experience is not just about the user interface. It’s about understanding how people do things, observing them, and then finding the patterns that emerge. The Applications UX team develops those patterns and then builds them into Oracle applications. What emerges, Hackman says, is a consistent, efficient user experience that promotes a productive workplace. Creating design patterns What is a design pattern in the context of enterprise software? “Every day, people use technology to get things done,” Hackman says. “They navigate a virtual world that reaches from enterprise to consumer apps, and from desktop to mobile. This virtual world is constantly under construction. New areas are being developed and old areas are being redone. As this world is being built and remodeled, efficient pathways and practices emerge. “Oracle's user experience team watches users navigate this world. We measure their productivity and ask them about their satisfaction. We take the most efficient, most productive pathways from the enterprise and consumer world and turn them into Oracle's user experience patterns.” Hackman describes the process as combining all of the best practices from every part of a user’s world. Members of the user experience team observe, analyze, design, prototype, and measure each work task to find the best possible pattern for a particular work flow. As the team builds the patterns, “we make sure they are fully buildable using Oracle technology,” Hackman said. “So customers know they can use these patterns. There’s no need to make something up from scratch, not knowing whether you can even build it.” Hackman says that creating something on a computer is a good example of a user experience pattern. “People are creating things all the time,” he says. “On the consumer side, they are creating documents. On the enterprise side, they are creating expense reports. On a mobile phone, they are creating contacts. They are using different apps like iPhone or Facebook or Gmail or Oracle software, all doing this creation process.” The Applications UX team starts their process by observing how people might create something. “We observe people creating things. We see the patterns, we analyze and document, then we apply them to our products. It might be different from phone to web browser, but we have these design patterns that create a consistent experience across platforms, and across products, too. The result for customers Oracle constantly improves its part of the virtual world, Hackman said. New products are created and existing products are upgraded. Because Oracle builds user experience design patterns, Oracle's virtual world becomes both more powerful and more familiar at the same time. Because of design patterns, users can navigate with ease as they embrace the latest technology – because it behaves the way they expect it to. 
This means less training and faster adoption for individual users, and more productivity for the business as a whole. Hackman said Oracle gives customers and partners access to design patterns so that they can build in the virtual world using the same best practices. Customers and partners can extend applications with a user experience that is comfortable and familiar to their users. For businesses that are integrating different Oracle applications, design patterns are key. The user experience created in E-Business Suite should be similar to the user experience in Fusion Applications, Hackman said. If a user is transitioning from one application to the other, it shouldn’t be difficult for them to do their work. With design patterns, it isn’t. “Oracle user experience patterns are the building blocks for the virtual world that ensure productivity, consistency and user satisfaction,” Hackman said. “They are built for the enterprise, but incorporate the best practices from across the virtual world. They empower productivity and facilitate social interaction. When you build with patterns, you get all the end-user benefits of less training / retraining from the finished product. You also get faster / cheaper development.” What’s coming? You can already access design patterns to help you build Dashboards with OBIEE here. And we promised you at the beginning that we had something in the pipeline on Fusion Applications design patterns. Look for the announcement about when they are available here on VoX.

    Read the article

  • Open World Day 3

    - by Antony Reynolds
    A Day in the Life of an Oracle OpenWorld Attendee, Part IV
My third day was exhibition day for me!  I took the opportunity to wander around the JavaOne and OpenWorld exhibitions to see what might be useful for me when selling WebLogic, Coherence & SOA Suite.  I found a number of interesting vendors and thought I would share what I found here.  These are not necessarily endorsements, but observations on companies that I thought had interesting-looking products that fill a need I have seen at customers.
Highly Available EBS Upgrades: A few years ago I worked with a customer that was a port authority.  They wanted to tie E-Business Suite into their operations to provide faster processing of cargo and passengers.  However, they only had a 2-hour downtime window to perform upgrades.  This was not a problem for the core database and middleware technology, which could accommodate those upgrade timescales easily.  It was a problem for EBS, however, so I was intrigued to find Rapid E-Suite Inc offering an 11i to 12i upgrade service that claims to require no outage.  This could be a real boon to EBS customers like my port friends that need to upgrade without disruption to their business.
Mobile on WebLogic: I have come across a number of customers who want a comprehensive mobile solution, with connected and disconnected operation and so forth.  ADF only addresses part of these requirements currently, so I was excited to discover mFrontiers Inc offering an apparently comprehensive solution that should integrate easily with Oracle SOA Suite to mobile-enable a SOA infrastructure.  The ability to operate without a network is important for many applications, particularly in industries that require their engineers to enter buildings to perform maintenance or repairs, because network access is not always available – many of my colleagues don’t have mobile access from their homes because they live in the middle of nowhere – and disconnected support is crucial in these situations.
Sharepoint Connector for WebCenter Content: Obviously Sharepoint is an evil, pernicious intrusion into a company’s IT estate, but it is widely deployed, many people like it, and they would also like to take advantage of Oracle products such as WebCenter Content.  So I was encouraged to see that Fishbowl Solutions have created a connector for Sharepoint that allows it to bring in content from WebCenter.  It looks like a valuable way to keep the Sharepoint interface end users are used to while extending the range of content by pulling stuff (technical term for content) from WebCenter.
Load Balancing: The Enterprise Deployment Guides are Oracle’s bible on building highly available FMW environments, and each of them requires a front-end load balancer.  I have been asked to help configure F5 load balancers on a number of occasions over my time at Oracle, and each time I come back to them I find more useful features have been added to the BigIP line of load balancers that F5 sells; many of their documents are tailored to FMW.  I like F5: they provide (relatively) easy-to-use products that do what they say on the side of the box.  They may not have all the bells and whistles of some of their more expensive competitors, but they do the job and do it well!  Besides which, I like their logo!
Other Stuff: I saw lots of other interesting products and services, such as a lightweight monitoring tool for Coherence, Forms migration services, JCAPS migration services, and lots of cool freebies to take home to the children!
A Quiet Night: Wednesday night was the partner appreciation event, and I had decided to go back to the hotel and have an early night.  I decided to attend the last session of the day – a Maven/Hudson/WebLogic tutorial.  I went to the wrong hotel for the session, snuck in 20 minutes late at the back, and started working on the hands-on workshop.  One of my co-attendees raised his hand for help, and as the presenter came over to help he suddenly stopped and yelled, “Is that Antony?!”  It was my old friend Steve Button, who used to be based in Redwood Shores but is now a WebLogic guru PM in Australia.  It was good to catch up with him.  As he yelled out, a guy with really bad posture turned around to see who he was talking to; this turned out to be my friend Simon Haslan, an Oracle ACE from the UK.  After the tutorial, Simon and I retired to the coffee shop to catch up and share stories.  Two and a half hours later we decided it was time to retire; so much for an early night, but it was great to renew old friendships and find out what real customers are worrying about.

    Read the article

  • The standards that fail us and the intellectual bubble

    - by Jeff
    There has been a great deal of noise in the techie community about standards, and a sudden and unexplainable hate for Flash. This noise isn't coming from consumers... the countless soccer moms, teens and your weird uncle Bob, it's coming from the people who build (or at least claim to build) the stuff those consumers consume. If you could survey the position of consumers on the topic, they'd likely tell you that they just want stuff on the Web to work.The noise goes something like this: Web standards are the correct and right thing to use across the Intertubes, and anything not a part of those standards (Flash) is bad. Furthermore, the more recent noise is centered around the idea that HTML 5, along with Javascript, is the right thing to use. The arguments against Flash are, well, the truth is I haven't seen a good argument. I see anecdotal nonsense about high CPU usage and things I'd never think to check when I'm watching Piano Cat on YouTube, but these aren't arguments to me. Sure, I've seen it crash a browser a few times, but it's totally rare.But let's go back to standards. Yes, standards have played an important role in establishing the ubiquity of the Web. The protocols themselves, TCP/IP and HTTP, have been critical. HTML, which has served us well for a very long time, established an incredible foundation. Javascript did an OK job, and thanks to clever programmers writing great frameworks like JQuery, is becoming more and more useful. CSS is awful (there, I said it, I feel SO much better), and I'll never understand why it's so disconnected and different from anything else. It doesn't help that it's so widely misinterpreted by different browsers. Still, there's no question that standards are a good thing, and they've been good for the Web, consumers and publishers alike.HTML 4 has been with us for more than a decade. In Web years, that might as well be 80. HTML 5, contrary to popular belief, is not a standard, and likely won't be for many years to come. In fact, the Web hasn't really evolved at all in terms of its standards. The tools that generate the standard markup and script have, but at the end of the day, we're still living with standards that are more than ten years old. The "official" standards process has failed us.The Web evolved anyway, and did not wait for standards bodies to decide what to do next. It evolved in part because Macromedia, then Adobe, kept evolving Flash. In the earlier days, it mostly just did obnoxious splash pages, but then it started doing animation, and then rich apps as they added form input. Eventually it found its killer app: video. Now more than 95% of browsers have Flash installed. Consumers are better for it.But I'll do it one better... I'll go out on a limb and say that Flash is a standard. If it's that pervasive, I don't care what you tell me, it's a standard. Just because a company owns it doesn't mean that it's evil or not a standard. And hey, it pains me to say that as a developer, because I think the dev tools are the suck (more on that in a minute). But again, consumers don't care. They don't even pay for Flash. The bottom line is that if I put something Flash based on the Internet, it's likely that my audience will see it.And what about the speed of standards owned by a company? Look no further than Silverlight. Silverlight 2 (which I consider the "real" start to the story) came out about a year and a half ago. Now version 4 is out, and it has come a very long way in its capabilities. 
If you believe Riastats.com, more than half of browsers have it now. It didn't have to wait for standards bodies and nerds drafting documents, it's out today. At this rate, Silverlight will be on version 6 or 7 by the time HTML 5 is a ratified standard.Back to the noise, one of the things that has continually disappointed me about this profession is the number of people who get stuck in an intellectual bubble, color it with dogmatic principles, and completely ignore the actual marketplace where this stuff all has to live. We aren't machines; Binary thinking that forces us to choose between "open standards" and "proprietary lock-in" (the most loaded b.s. FUD term evar) isn't smart at all. The truth is that the <object> tag has allowed us to build incredible stuff on top of the old standards, and consumers have benefitted greatly. Consumer desire, capitalism, and yes, standards ratified by nerds who think about this stuff for years have all played a role in the broad adoption of the Interwebs.We could all do without the noise. At the end of the day, I'm going to build stuff for the Web that's good for my users, and I'm not going to base my decisions on a techie bubble religion. Imagine what the brilliant minds behind the noise could do for the Web if they joined me in that pursuit.

    Read the article

  • Oracle WebCenter at the Enterprise 2.0 Conference

    - by Brian Dirking
    We had a great week at the E20 Conference, presenting in four sessions – Andy MacMillan gave a session titled Today’s Successful Enterprises are Social Enterprises and was on a panel that Tony Byrne moderated; Christian Finn spoke on a panel on Unified Communications Unified Communications + Social Computing = Best of Both Worlds?, Mark Bennett spoke on a panel on The Evolution of Talent Management. The key areas of focus this year were sentiment analysis, adoption and community building, the benefits of failure, and social’s role in process applications. Sentiment analysis. This was focused not on external audiences but more on employee sentiment. Tim Young showed his internal "NikoNiko" project, where employees use smilies to report their current mood. The result was a dashboard that showed the company mood by department. Since the goal is to improve productivity, people can see which departments are running into issues and try and address them. A company might otherwise wait until the end of the quarter financials to find out that there was a problem and product didn’t ship. This is a way to identify issues immediately. Tim is great – he had the crowd laughing as soon as he hit the stage, with his proposed hastag for his session: by making it 138 characters long, people couldn’t say much behind his back. And as I tweeted during his session, I loved his comment that complexity diffuses energy - it sounds like something Sun Tzu would say. Another example of employee sentiment analysis was CubeVibe. Founder and CEO Aaron Aycock, in his 3 minute pitch or die session talked about how engaged employees perform better. It was too bad he got gonged, he was just picking up speed, but CubeVibe did win the vote – congratulations to them. Internal adoption, community building, and involvement. On this topic I spoke to Terri Griffith, and she said there is some good work going on at University of Indiana regarding this, and hinted that she might be blogging about it in the near future. This area holds lots of interest for me. Amongst our customers, - CPAC stands out as an organization that has successfully built a community. So, I wonder - what are the building blocks? A strong leader? A common or unifying purpose? A certain level of engagement? I imagine someone has created an equation that says “for a community to grow at 30% per month, there must be an engagement level x to the square root of y, where x equals current community size, and y equals the expected growth rate, and the result is how many engagements the average user must contribute to maintain that growth.” Does anyone have a framework like that? The net result of everyone’s experience is that there is nothing to do but start early and fail often. Kevin Jones made this the focus of his keynote. He talked about the types of failure and what they mean. And he showed his famous kids at work video: Kevin’s blog also has this post: Social Business Failure #8: Workflow Integration. This is something that we’ve been working on at Oracle. Since so much of business is based in enterprise applications such as ERP and CRM (and since Oracle offers e-Business Suite, Siebel, PeopleSoft, and JD Edwards, as well as Fusion Applications), it makes sense that the social capabilities of Oracle WebCenter is built right into these applications. There are two types of social collaboration – ad-hoc, and exception handling. 
When you are in a business process and encounter an exception, you immediately look for 1) the document that tells you how to handle it, or 2) the person who can tell you how to handle it. With WebCenter built into these processes, people either search their content management system, or engage in expertise location and conversation. The great thing is, THEY DON’T HAVE TO LEAVE THE APPLICATION TO DO IT. Oracle has built the social capabilities right into the applications and business processes. I don’t think enough folks were able to see that at the event, but I expect that over the next six months folks will become very aware of it. WebCenter also provides the ability to have ad-hoc collaboration, search, and expertise location that folks need when they are innovating or collaborating. We demonstrated Oracle Social Network. It’s built on our Oracle WebCenter product to provide social collaboration inside and outside of your company. When we showed it to people, there were a number of areas that they commented on that were different from the other products being shown at the conference: Screenshots from within the product Many authors working on documents simultaneously Flagging people for follow up Direct ability to call out to people Ability to see presence not just if someone is online, but which conversation they are actively in Great stuff, the conference was full of smart people that that we enjoy spending time with. We’ll keep up in the meantime, but we look forward to seeing you in Boston.

    Read the article

  • Package Manager Console For More Than Managing Packages

    - by Steve Michelotti
    Like most developers, I prefer to not have to pick up the mouse if I don’t have to. I use the Executor launcher for almost everything so it’s extremely rare for me to ever click the “Start” button in Windows. I also use shortcuts keys when I can so I don’t have to pick up the mouse. By now most people know that the Package Manager Console that comes with NuGet is PowerShell embedded inside of Visual Studio. It is based on its PowerConsole predecessor which was the first (that I’m aware of) to embed PowerShell inside of Visual Studio and give access to the Visual Studio automation DTE object. It does this through an inherent $dte variable that is automatically available and ready for use. This variable is also available inside of the NuGet Package Manager console. Adding a new class file to a Visual Studio project is one of those mundane tasks that should be easier. First I have to pick up the mouse. Then I have to right-click where I want it file to go and select “Add –> New Item…” or “Add –> Class…”   If you know the Ctrl+Shift+A shortcut, then you can avoid the mouse for adding a new item but you have to manually assign a shortcut for adding a new class. At this point it pops up a dialog just so I can enter the name of the class I want. Since this is one of the most common tasks developers do, I figure there has to be an easier way and a way that avoids picking up the mouse and popping up dialogs. This is where your embedded PowerShell prompt in Visual Studio comes in. The first thing you should do is to assign a keyboard shortcut so that you can get a PowerShell prompt (i.e., the Package Manager console) quickly without ever picking up the mouse. I assign “Ctrl+P, Ctrl+M” because “P + M” stands for “Package Manager” so it is easy to remember:   At this point I can type this command to add a new class: PM> $dte.ItemOperations.AddNewItem("Code\Class", "Foo.cs") which will result in the class being added: At this point I’ve satisfied my original goal of not having to pick up a mouse and not having the “Add New Item” dialog pop up. However, having to remember that $dte method call is not very user-friendly at all. The best thing to do is to make this a re-usable function that always loads when Visual Studio starts up. There is a $profile variable that you can use to figure out where that location is for your machine: PM> $profile C:\Users\steve.michelotti\Documents\WindowsPowerShell\NuGet_profile.ps1 If the NuGet_profile.ps1 file does not already exist, you can just create it yourself and place it in the directory. Now you can put a function inside like this: 1: function addClass($className) 2: { 3: if ($className.EndsWith(".cs") -eq $false) { 4: $className = $className + ".cs" 5: } 6: 7: $dte.ItemOperations.AddNewItem("Code\Class", $className) 8: } Since it’s in the NuGet_profile.ps1 file, this function will automatically always be available for me after starting Visual Studio. Now I can simply do this: PM> addClass Foo At this point, we have a *very* nice developer experience. All I did to add a new class was: “Ctrl-P, Ctrl-M”, then “addClass Foo”. No mouse, no pop up dialogs, no complex commands to remember. In fact, PowerShell gives you auto-completion as well. If I type “addc” followed by [TAB], then intellisense pops up: You can see my custom function appear in intellisense above. Now I can type the next letter “c” and [TAB] to auto-complete the command. 
And if that’s still too many key strokes for you, then you can create your own PowerShell custom alias for your function like this: PM> Set-Alias addc addClass PM> addc Foo While all this is very useful, I did run into some issues which prompted me to make even further customization. This command will add the new class file to the current active directory. Depending on your context, this may not be what you want. For example, by convention all view model objects go in the “Models” folder in an MVC project. So if the current document is in the Controllers folder, it will add your class to that folder which is not what you want. You want it to always add it to the “Models” folder if you are adding a new model in an MVC project. For this situation, I added a new function called “addModel” which looks like this: 1: function addModel($className) 2: { 3: if ($className.EndsWith(".cs") -eq $false) { 4: $className = $className + ".cs" 5: } 6: 7: $modelsDir = $dte.ActiveSolutionProjects[0].UniqueName.Replace(".csproj", "") + "\Models" 8: $dte.Windows.Item([EnvDTE.Constants]::vsWindowKindSolutionExplorer).Activate() 9: $dte.ActiveWindow.Object.GetItem($modelsDir).Select([EnvDTE.vsUISelectionType]::vsUISelectionTypeSelect) 10: $dte.ItemOperations.AddNewItem("Code\Class", $className) 11: } First I figure out the path to the Models directory on line #7. Then I activate the Solution Explorer window on line #8. Then I make sure the Models directory is selected so that my context is correct when I add the new class and it will be added to the Models directory as desired. These are just a couple of examples for things you can do with the PowerShell prompt that you have available in the Package Manager console. As developers we spend so much time in Visual Studio, why would you not customize it so that you can work in whatever way you want to work?! The next time you’re not happy about the way Visual Studio makes you do a particular task – automate it! The sky is the limit.
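One small follow-up on the alias trick above (a sketch; the alias names are just suggestions): an alias defined at the prompt only lasts for the current session, so to keep it around it can live in the same NuGet_profile.ps1 alongside the functions:

# In NuGet_profile.ps1, after the function definitions
Set-Alias addc addClass
Set-Alias addm addModel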

    Read the article

  • Running a Mongo Replica Set on Azure VM Roles

    - by Elton Stoneman
    Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2013/10/15/running-a-mongo-replica-set-on-azure-vm-roles.aspxSetting up a MongoDB Replica Set with a bunch of Azure VMs is straightforward stuff. Here’s a step-by-step which gets you from 0 to fully-redundant 3-node document database in about 30 minutes (most of which will be spent waiting for VMs to fire up). First, create yourself 3 VM roles, which is the minimum number of nodes you need for high availability. You can use any OS that Mongo supports. This guide uses Windows but the only difference will be the mechanism for starting the Mongo service when the VM starts (Windows Service, daemon etc.) While the VMs are provisioning, download and install Mongo locally, so you can set up the replica set with the Mongo shell. We’ll create our replica set from scratch, doing one machine at a time (if you have a single node you want to upgrade to a replica set, it’s the same from step 3 onwards): 1. Setup Mongo Log into the first node, download mongo and unzip it to C:. Rename the folder to remove the version – so you have c:\MongoDB\bin etc. – and create a new folder for the logs, c:\MongoDB\logs. 2. Setup your data disk When you initialize a node in a replica set, Mongo pre-allocates a whole chunk of storage to use for data replication. It will use up to 5% of your data disk, so if you use a Windows VM image with a defsault 120Gb disk and host your data on C:, then Mongo will allocate 6Gb for replication. And that takes a while. Instead you can create yourself a new partition by shrinking down the C: drive in Computer Management, by say 10Gb, and then creating a new logical disk for your data from that spare 10Gb, which will be allocated as E:. Create a new folder, e:\data. 3. Start Mongo When that’s done, start a command line, point to the mongo binaries folder, install Mongo as a Windows Service, running in replica set mode, and start the service: cd c:\mongodb\bin mongod -logpath c:\mongodb\logs\mongod.log -dbpath e:\data -replSet TheReplicaSet –install net start mongodb 4. Open the ports Mongo uses port 27017 by default, so you need to allow access in the machine and in Azure. In the VM, open Windows Firewall and create a new inbound rule to allow access via port 27017. Then in the Azure Management Console for the VM role, under the Configure tab add a new rule, again to allow port 27017. 5. Initialise the replica set Start up your local mongo shell, connecting to your Azure VM, and initiate the replica set: c:\mongodb\bin\mongo sc-xyz-db1.cloudapp.net rs.initiate() This is the bit where the new node (at this point the only node) allocates its replication files, so if your data disk is large, this can take a long time (if you’re using the default C: drive with 120Gb, it may take so long that rs.initiate() never responds. If you’re sat waiting more than 20 minutes, start another instance of the mongo shell pointing to the same machine to check on it). Run rs.conf() and you should see one node configured. 6. Fix the host name for the primary – *don’t miss this one* For the first node in the replica set, Mongo on Windows doesn’t populate the full machine name. Run rs.conf() and the name of the primary is sc-xyz-db1, which isn’t accessible to the outside world. 
The replica set configuration needs the full DNS name of every node, so you need to manually rename it in your shell, which you can do like this:

cfg = rs.conf()
cfg.members[0].host = 'sc-xyz-db1.cloudapp.net:27017'
rs.reconfig(cfg)

When that returns, rs.conf() will show the full DNS name for the primary, and the other nodes will be able to connect. At this point you have a working database, so you can start adding documents, but there’s no replication yet.

7. Add more nodes

For the next two VMs, follow steps 1 through 4, which will give you a working Mongo database on each node. Add each one to the replica set from the shell with rs.add(), using the full DNS name of the new node and the port you’re using:

rs.add('sc-xyz-db2.cloudapp.net:27017')

Run rs.status() and you’ll see your new node in the STARTUP2 state, which means it’s initializing and replicating from the PRIMARY. Repeat for your third node:

rs.add('sc-xyz-db3.cloudapp.net:27017')

When all nodes have finished initializing, you will have a PRIMARY and two SECONDARY nodes showing in rs.status(). Now you have high availability: you can happily stop db1, and one of the other nodes will become the PRIMARY with no loss of data or service.

Note: the process for AWS EC2 is exactly the same, with one important difference. On the Azure Windows Server 2012 base image, the MongoDB release for 64-bit 2008R2+ works fine, but on the base 2012 AMI that release keeps failing with a UAC permission error. The standard 64-bit release is fine, but it lacks some optimizations that are in the 2008R2+ version.
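To confirm the replica set is doing its job, a quick smoke test (a minimal sketch, reusing the sc-xyz host names from above): write a document on the primary and read it back from a secondary. Secondaries reject reads by default, so run rs.slaveOk() in the secondary’s shell first:

c:\mongodb\bin\mongo sc-xyz-db1.cloudapp.net
db.smoke.insert({ ping: 1 })     // write goes to the PRIMARY

c:\mongodb\bin\mongo sc-xyz-db2.cloudapp.net
rs.slaveOk()                     // allow reads on this SECONDARY for the session
db.smoke.findOne()               // returns { ping: 1 } once replication catches up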

    Read the article

  • WebCenter Content shared folders for clustering

    - by Kyle Hatlestad
When configuring a WebCenter Content (WCC) cluster, one of the things which makes it unique from some other WebLogic Server applications is its requirement for a shared file system.  This is actually no different than 10g and previous versions of UCM when it ran directly on a JVM.  And while it is simple enough to say it needs a shared file system, there are some crucial details in how those directories are configured, and if they aren't followed, you may see some unwanted behavior. This blog post will go into the details of exactly how the file systems should be split and what options are required.

Beyond documents being stored on the file system and/or database, and metadata being stored in the database along with other structured data, there is other information being read and written on the file system.  Information such as user profile preferences, workflow item state information, metadata profiles, and other details are stored in files.  In addition, for certain processes within WCC, each of the nodes needs to know what the other nodes are doing so they don’t step on each other.  WCC keeps track of this through the use of lock files on the file system.  Because of this, each node of the WCC cluster must have access to the same file system, just as they have access to the same database.

WCC uses its own file-based locking mechanism, so it also needs to access those files without file attribute caching and without locking being done by the client (node).  If one of the nodes accesses a certain status file and it happens to be cached, that node might attempt to run a process which another node is already working on.  Or if a particular file is locked by one of the node clients, this could interfere with access by another node.  Unfortunately, disabling file attribute caching on the file share can impact performance.  So it is important to only disable caching and locking on the particular folders which require it.

When configuring WebCenter Content after deploying the domain, it asks for 3 different directories: Content Server Instance Folder, Native File Repository Location, and Weblayout Folder.  And starting in PS5, it now asks for the User Profile Folder. Even if you plan on storing the content in the database, you still need to establish the Native File (Vault) and Weblayout directories; these will be used for handling temporary files, cached files, and files used to deliver the UI.

Of these directories, the only folder which needs to have file attribute caching and locking disabled is the ‘Content Server Instance Folder’.  So when establishing this share through NFS or a clustered file system, be sure to specify those options. For instance, if creating the share through NFS, use the ‘noac’ and ‘nolock’ mount options.  For the other directories, caching and locking should be left enabled to provide the best performance for those locations.
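As an illustration (a sketch only; the server name and export paths are placeholders, while the mount points match the intradoc.cfg example below), the NFS mounts on each node might look like this:

mount -t nfs -o rw,noac,nolock nfshost:/export/wcc/instance /mnt/share_no_cache
mount -t nfs -o rw nfshost:/export/wcc/shared /mnt/share_with_cache

The first mount carries the Content Server Instance Folder with attribute caching and client locking disabled; the second carries the Vault, Weblayout, and User Profile folders with normal caching for better performance.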
These directory path configurations are contained within the <domain dir>\ucm\cs\bin\intradoc.cfg file:

#Server System Properties
IDC_Id=UCM_server1

#Server Directory Variables
IdcHomeDir=/u01/fmw/Oracle_ECM1/ucm/idc/
FmwDomainConfigDir=/u01/fmw/user_projects/domains/base_domain/config/fmwconfig/
AppServerJavaHome=/u01/jdk/jdk1.6.0_22/jre/
AppServerJavaUse64Bit=true
IntradocDir=/mnt/share_no_cache/base_domain/ucm/cs/
VaultDir=/mnt/share_with_cache/ucm/cs/vault/
WeblayoutDir=/mnt/share_with_cache/ucm/cs/weblayout/

#Server Classpath variables

#Additional Variables
#NOTE: UserProfilesDir is only available in PS5 (11.1.1.6.0)
UserProfilesDir=/mnt/share_with_cache/ucm/cs/data/users/profiles/

In addition to these folder configurations, it’s also recommended to move node-specific folders to local disk to avoid unnecessary traffic to the shared directory.  So on each node, go to <domain dir>\ucm\cs\bin\intradoc.cfg and add these additional configuration entries:

VaultTempDir=<domain dir>/ucm/<cs>/vault/~temp/
TraceDirectory=<domain dir>/servers/<UCM_serverN>/logs/
EventDirectory=<domain dir>/servers/<UCM_serverN>/logs/event/

And of course, don’t forget the cluster-specific configuration values to add as well.  These can be added through Admin Server -> General Configuration -> Additional Configuration Variables or directly in the <IntradocDir>/config/config.cfg file:

ArchiverDoLocks=true
DisableSharedCacheChecking=true
ServiceAllowRetry=true    (use only with Oracle RAC Database)
PublishLockTimeout=300000  (time can vary depending on publishing time and number of nodes)

For additional information and details on clustering configuration, I highly recommend reviewing document [1209496.1] on the support site.  In addition, there is a great step-by-step guide on setting up a WebCenter Content cluster [1359930.1].

    Read the article

  • Windows Azure Recipe: Software as a Service (SaaS)

    - by Clint Edmonson
The cloud was tailor-made for aspiring companies to create innovative internet-based applications and solutions. Whether you’re a garage startup with very little capital or a Fortune 1000 company, the ability to quickly set up, deliver, and iterate on new products is key to capturing market and mind share. And if you can capture that share and go viral, having resiliency and infinite scale at your fingertips is great peace of mind.

Drivers

Cost avoidance
Time to market
Scalability

Solution

Here’s a sketch of how a basic Software as a Service solution might be built out:

Ingredients

Web Role – this hosts the core web application. Each web role will host an instance of the software, and as the user base grows, additional roles can be spun up to meet demand (see the scaling sketch at the end of this post).

Access Control – this service is essential to managing user identity. It’s backed by a full-blown implementation of Active Directory and allows the definition and management of users, groups, and roles. A pre-built ASP.NET membership provider is included in the training kit to leverage this capability, but it’s also flexible enough to be combined with external identity providers including Windows LiveID, Google, Yahoo!, and Facebook. The provider model provides extensibility to hook into other industry-specific identity providers as well.

Databases – nearly every modern SaaS application is backed by a relational database for its core operational data. If the solution is sold to organizations, there’s a good chance multi-tenancy will be needed. An emerging best practice for SaaS applications is to stand up separate SQL Azure database instances for each tenant’s proprietary data to ensure isolation from other tenants.

Worker Role – this is the best place to handle autonomous background processing such as data aggregation, billing through external services, and other specialized tasks that can be performed asynchronously. Placing these tasks in a worker role frees the web roles to focus completely on user interaction and data input, and provides finer-grained control over the system’s scalability and throughput.

Caching (optional) – as a web site’s traffic grows, caching can be leveraged to keep frequently used read-only, user-specific, and application resource data in a high-speed distributed in-memory cache for faster response times and ultimately higher scalability, without spinning up more web and worker roles. It includes a token-based security model that works alongside the Access Control service.

Blobs (optional) – depending on the nature of the software, users may be creating or uploading large volumes of heterogeneous data such as documents or rich media. Blob storage provides a scalable, resilient way to store terabytes of user data. The storage facilities can also integrate with the Access Control service to ensure users’ data is delivered securely.

Training & Examples

These links point to online Windows Azure training labs and examples where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)

Windows Azure (16 labs)

Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services which can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds.
New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.

SQL Azure (7 labs)

Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.

Windows Azure Services (9 labs)

As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.

Developing Applications for the Cloud, 2nd Edition (eBook)

This book demonstrates how you can create from scratch a multi-tenant, Software as a Service (SaaS) application to run in the cloud using the latest versions of the Windows Azure Platform and tools. The book is intended for any architect, developer, or information technology (IT) professional who designs, builds, or operates applications and services that run on or interact with the cloud.

Fabrikam Shipping (SaaS reference application)

This is a full end-to-end sample scenario which demonstrates how to use the Windows Azure platform for exposing an application as a service. We developed this demo just as you would: we had an existing on-premises sample, Fabrikam Shipping, and we wanted to see what it would take to transform it into a full subscription-based solution. The demo you find here is the result of that investigation.

See my Windows Azure Resource Guide for more guidance on how to get started, including more links to web portals, training kits, samples, and blogs related to Windows Azure.
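As promised above, here is what spinning up additional web role instances can look like from the command line (a sketch, assuming the classic Azure Service Management PowerShell module; the service and role names are placeholders):

PS> Set-AzureRole -ServiceName "fabrikam-saas" -Slot Production -RoleName "WebRole1" -Count 4

This resizes the web tier of a deployed hosted service to four instances; the same change can also be made by editing the instance count in the service configuration (.cscfg) and updating the deployment.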

    Read the article

< Previous Page | 167 168 169 170 171 172 173 174 175 176 177 178  | Next Page >