Search Results

Search found 4879 results on 196 pages for 'geeks'.

  • Thank you South Florida for a successful SPSouthFLA

    - by Leonard Mwangi
    I wanted to officially thank the organizers, speakers, volunteers and attendees of SharePoint Saturday South Florida. Being the first event in South Florida, the reception was phenomenal, from the keynote by Joel Oleson to sessions from well-renowned speakers like John Holliday, Randy Drisgill, Richard Harbridge, Ameet Phadnis, Fabian Williams, Chris McNulty and Jaime Velez, to organizers like Michael Hinckley, amongst others. With my Business Intelligence (BI) presentation being on the last track of the day, I spent some quality time networking with these great guys and getting the insider scoop on the international SharePoint community from Joel and his son, which was mesmerizing. I had a very active audience, to the point where we couldn't accommodate all the content within the 1hr allocated time, because they were very engaged and wanted a deep-dive session on new features like PowerPivot and enhancements to PerformancePoint and Excel Services, amongst others, in order to understand the business value and how SharePoint 2010 is making self-service BI a reality. These community events allow attendees to experience technology first hand and network with MVPs, authors and experts, providing high-quality educational sessions usually for free, which is reason enough to attend. I have made the slides for my session available for download for those interested: http://goo.gl/VaH5x

    Read the article

  • Challenge Accepted

    - by Chris Gardner
    Originally posted on: http://geekswithblogs.net/freestylecoding/archive/2014/05/20/challenge-accepted.aspx
    It appears my good buddies in The Krewe have created The Krewe Summer Blogging Challenge. The challenge is to write at least two blog posts a week for 12 weeks over the summer. Consider this challenge accepted. So, what can we expect coming up? I still have the Kinect v2 Alpha kit. Some of you may have seen me use it in talks. I need to make some major API changes in The Krewe WP8 App. Plus, I may have Xamarin on board to help with getting the app to the other platforms. I am determined to learn F#, and I'm taking all of you with me. I am teaching a college course this summer. I want to post some commentary on that side of training. I am sure some biometric stuff will come up. Anything else you guys may want. I have created tasks on my schedule to get a new blog post up no later than every Tuesday and Friday. We'll see how that goes. Wish me luck.

    Read the article

  • XenApp 6.5 – How to create and set a Policy using PowerShell

    - by Waclaw Chrabaszcz
    Originally posted on: http://geekswithblogs.net/Wchrabaszcz/archive/2013/06/20/xenapp-6.5--how-to-create-and-set-a-policy.aspx
    Here is my homework:
    Add-PSSnapin -name Citrix.Common.* -ErrorAction SilentlyContinue
    New-Item LocalFarmGpo:\User\MyPolicy
    cd LocalFarmGpo:\User\MyPolicy\Settings\ICA\Security
    Set-ItemProperty .\MinimumEncryptionLevel State Enabled
    Set-ItemProperty .\MinimumEncryptionLevel Value Bits128
    cd LocalFarmGpo:\User\MyPolicy\Filters\WorkerGroup
    New-Item -Name "All Servers" -Value "All Servers"
    Set-ItemProperty LocalFarmGpo:\User\MyPolicy -Name Priority -Value 2
    So cute …

    Read the article

  • VSDB to SSDT Series : Introduction

    - by Etienne Giust
    At the office, we extensively use VS2010 SQL Server 2008 Database Projects and SQL Server 2008 Server Projects in our Visual Studio 2010 solutions. With Visual Studio 2012, those types of projects are replaced by the SQL Server Database Project using the SSDT (SQL Server Data Tools) technology. I started investigating the shift from Visual Studio 2010 to Visual Studio 2012, and specifically what needs to be done concerning those database projects in terms of painless migration, continuous integration and standalone deployment. I will write my findings in a series of 4 short articles:
    Part 1 will be about the database projects migration process and the cleaning up that ensues
    Part 2 will be about creating SQL Server 2008 Server Projects equivalents with the new SSDT project type
    Part 3 will introduce a replacement to the vsdbcmd.exe command used for deployment in our continuous integration process
    Part 4 will explain how to create standalone packages of SSDT projects for deployment on non-accessible servers (such as a production server)

    Read the article

  • Adventures in Windows 8: Placing items in a GridView with a ColumnSpan or RowSpan

    - by Laurent Bugnion
    Currently working on a Windows 8 app for an important client, I will be writing about small issues, tips and tricks, ideas and whatever occurs to me during the development and the integration of this app. When working with a GridView, it is quite common to use a VariableSizedWrapGrid as the ItemsPanel. This creates a nice flowing layout which will auto-adapt for various resolutions. This is ideal when you want to build views like the Windows 8 start menu. However immediately we notice that the Start menu allows to place items on one column (Smaller) or two columns (Larger). This switch happens through the AppBar. So how do we implement that in our app? Using ColumnSpan and RowSpan When you use a VariableSizedWrapGrid directly in your XAML, you can attach the VariableSizedWrapGrid.ColumnSpan and VariableSizedWrapGrid.RowSpan attached properties directly to an item to create the desired effect. For instance this code create this output (shown in Blend but it runs just the same): <VariableSizedWrapGrid ItemHeight="100" ItemWidth="100" Width="200" Orientation="Horizontal"> <Rectangle Fill="Purple" /> <Rectangle Fill="Orange" /> <Rectangle Fill="Yellow" VariableSizedWrapGrid.ColumnSpan="2" /> <Rectangle Fill="Red" VariableSizedWrapGrid.ColumnSpan="2" VariableSizedWrapGrid.RowSpan="2" /> <Rectangle Fill="Green" VariableSizedWrapGrid.RowSpan="2" /> <Rectangle Fill="Blue" /> <Rectangle Fill="LightGray" /> </VariableSizedWrapGrid> Using the VariableSizedWrapGrid as ItemsPanel When you use a GridView however, you typically bind the ItemsSource property to a collection, for example in a viewmodel. In that case, you want to be able to switch the ColumnSpan and RowSpan depending on properties on the item. I tried to find a way to bind the VariableSizedWrapGrid.ColumnSpan attached property on the GridView’s ItemContainerStyle template to an observable property on the item, but it didn’t work. Instead, I decided to use a StyleSelector to switch the GridViewItem’s style. Here’s how: First I added my two GridViews to my XAML as follows: <Page.Resources> <local:MainViewModel x:Key="Main" /> <DataTemplate x:Key="DataTemplate1"> <Grid Background="{Binding Brush}"> <TextBlock Text="{Binding BrushCode}" /> </Grid> </DataTemplate> </Page.Resources> <Page.DataContext> <Binding Source="{StaticResource Main}" /> </Page.DataContext> <Grid Background="{StaticResource ApplicationPageBackgroundThemeBrush}" Margin="20"> <Grid.ColumnDefinitions> <ColumnDefinition Width="Auto" /> <ColumnDefinition Width="*" /> </Grid.ColumnDefinitions> <GridView ItemsSource="{Binding Items}" ItemTemplate="{StaticResource DataTemplate1}" VerticalAlignment="Top"> <GridView.ItemsPanel> <ItemsPanelTemplate> <VariableSizedWrapGrid ItemHeight="150" ItemWidth="150" /> </ItemsPanelTemplate> </GridView.ItemsPanel> </GridView> <GridView Grid.Column="1" ItemsSource="{Binding Items}" ItemTemplate="{StaticResource DataTemplate1}" VerticalAlignment="Top"> <GridView.ItemsPanel> <ItemsPanelTemplate> <VariableSizedWrapGrid ItemHeight="100" ItemWidth="100" /> </ItemsPanelTemplate> </GridView.ItemsPanel> </GridView> </Grid> The MainViewModel looks like this: public class MainViewModel { public IList<Item> Items { get; private set; } public MainViewModel() { Items = new List<Item> { new Item { Brush = new SolidColorBrush(Colors.Red) }, new Item { Brush = new SolidColorBrush(Colors.Blue) }, new Item { Brush = new SolidColorBrush(Colors.Green), }, // And more... 
    }; } } As for the Item class, I am using an MVVM Light ObservableObject but you can use your own simple implementation of INotifyPropertyChanged of course: public class Item : ObservableObject { public const string ColSpanPropertyName = "ColSpan"; private int _colSpan = 1; public int ColSpan { get { return _colSpan; } set { Set(ColSpanPropertyName, ref _colSpan, value); } } public SolidColorBrush Brush { get; set; } public string BrushCode { get { return Brush.Color.ToString(); } } } Then I copied the GridViewItem's style locally. To do this, I use Expression Blend's functionality. It has the disadvantage of copying a large portion of XAML into your application, but the HUGE advantage of allowing you to change the look and feel of your GridViewItem everywhere in the application. For example, you can change the selection chrome, the item's alignments and many other properties. Actually, every time I use a ListBox, ListView or any other data control, I typically copy the item style to a resource dictionary in my application and I tweak it. Note that Blend for Windows 8 apps is automatically installed with every edition of Visual Studio 2012 (including Express) so you have no excuses anymore not to use Blend :) Open MainPage.xaml in Expression Blend by right clicking on the MainPage.xaml file in the Solution Explorer and selecting Open in Blend from the context menu. Note that the items do not look very nice! The reason is that the default ItemContainerStyle sets the content's alignment to "Center", which I never quite understood. Seems to me that you rather want the content to be stretched, but anyway it is easy to change. Right click on the GridView on the left and select Edit Additional Templates, Edit Generated Item Container (ItemContainerStyle), Edit a Copy. In the Create Style Resource dialog, enter the name "DefaultGridViewItemStyle", select "Application" and press OK. Side note 1: You need to save in a global resource dictionary because later we will need to retrieve that Style from a global location. Side note 2: I would rather copy the style to an external resource dictionary that I link into the App.xaml file, but I want to keep things simple here. Blend switches into Template edit mode. The template you are editing now is inside the ItemContainerStyle and will govern the appearance of your items. This is where, for instance, the "checked" chrome is defined, and where you can alter it if you need to. Note that you can reuse this style for all your GridViews even if you use a different DataTemplate for your items. Makes sense? I probably need to think about writing another blog post dedicated to the ItemContainerStyle :) In the breadcrumb bar on top of the page, click on the style icon. The property we want to change now can be changed in the Style instead of the Template, which is a better idea. Blend is now in Style edit mode, as you can see in the Objects and Timeline pane. In the Properties pane, in the Search box, enter the word "content". This will filter all the properties containing that partial string, including the two we are interested in: HorizontalContentAlignment and VerticalContentAlignment. Set these two values to "Stretch" instead of the default "Center". Using the breadcrumb bar again, set the scope back to the Page (by clicking on the first crumb on the left). Notice how the items are now showing as squares in the first GridView. We will now use the same ItemContainerStyle for the second GridView. 
To do this, right click on the second GridView and select Edit Additional Templates, Edit Generate Item Container, Apply Resource, DefaultGridViewItemStyle. The page now looks nicer: And now for the ColumnSpan! So now, let’s change the ColumnSpan property. First, let’s define a new Style that inherits the ItemContainerStyle we created before. Make sure that you save everything in Blend by pressing Ctrl-Shift-S. Open App.xaml in Visual Studio. Below the newly created DefaultGridViewItemStyle resource, add the following style: <Style x:Key="WideGridViewItemStyle" TargetType="GridViewItem" BasedOn="{StaticResource DefaultGridViewItemStyle}"> <Setter Property="VariableSizedWrapGrid.ColumnSpan" Value="2" /> </Style> Add a new class to the project, and name it MainItemStyleSelector. Implement the class as follows: public class MainItemStyleSelector : StyleSelector { protected override Style SelectStyleCore(object item, DependencyObject container) { var i = (Item)item; if (i.ColSpan == 2) { return Application.Current.Resources["WideGridViewItemStyle"] as Style; } return Application.Current.Resources["DefaultGridViewItemStyle"] as Style; } } In MainPage.xaml, add a resource to the Page.Resources section: <local:MainItemStyleSelector x:Key="MainItemStyleSelector" /> In MainPage.xaml, replace the ItemContainerStyle property on the first GridView with the ItemContainerStyleSelector property, pointing to the StaticResource we just defined. <GridView ItemsSource="{Binding Items}" ItemTemplate="{StaticResource DataTemplate1}" VerticalAlignment="Top" ItemContainerStyleSelector="{StaticResource MainItemStyleSelector}"> <GridView.ItemsPanel> <ItemsPanelTemplate> <VariableSizedWrapGrid ItemHeight="150" ItemWidth="150" /> </ItemsPanelTemplate> </GridView.ItemsPanel> </GridView> Do the same for the second GridView as well. Finally, in the MainViewModel, change the ColumnSpan property on the 3rd Item to 2. new Item { Brush = new SolidColorBrush(Colors.Green), ColSpan = 2 }, Running the application now creates the following image, which is what we wanted. Notice how the green item is now a “wide tile”. You can also experiment by creating different Styles, all inheriting the DefaultGridViewItemStyle and using different values of RowSpan for instance. This will allow you to create any layout you want, while leaving the heavy lifting of “flowing the layout” to the GridView control. What about changing these values dynamically? Of course as we can see in the Start menu, it would be nice to be able to change the ColumnSpan and maybe even the RowSpan values at runtime. Unfortunately at this time I have not found a good way to do that. I am investigating however and will make sure to post a follow up when I find what I am looking for!   Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn

    Read the article

  • Backup Azure Tables with the Enzo Backup API

    - by Herve Roggero
    In case you missed it, you can now backup (and restore) Azure Tables and SQL Databases using an API directly. The features available through the API can be found here: http://www.bluesyntax.net/backup20api.aspx and the online help for the API is here: http://www.bluesyntax.net/EnzoCloudBackup20/APIIntro.aspx. Backing up Azure Tables can’t be any easier than with the Enzo Backup API. Here is a sample code that does the trick: // Create the backup helper class. The constructor automatically sets the SourceStorageAccount property StorageBackupHelper backup = new StorageBackupHelper("storageaccountname", "storageaccountkey", "sourceStorageaccountname", "sourceStorageaccountkey", true, "apilicensekey"); // Now set some properties… backup.UseCloudAgent = false;                                       // backup locally backup.DeviceURI = @"c:\TMP\azuretablebackup.bkp";    // to this file backup.Override = true; backup.Location = DeviceLocation.LocalFile; // Set optional performance options backup.PKTableStrategy.Mode = BSC.Backup.API.TableStrategyMode.GUID; // Set GUID strategy by default backup.MaxRESTPerSec = 200; // Attempt to stay below 200 REST calls per second // Start the backup now… string taskId = backup.Backup(); // Use the Environment class to get the final status of the operation EnvironmentHelper env = new EnvironmentHelper("storageaccountname", "storageaccountkey", "apilicensekey"); string status = env.GetOperationStatus(taskId);   As you can see above, the code is straightforward. You provide connection settings in the constructor, set a few options indicating where the backup device will be located, set optional performance parameters and start the backup. The performance options are designed to help you backup your Azure Tables quickly, while attempting to keep under a specific threshold to prevent Storage Account throttling. For example, the MaxRESTPerSec property will attempt to keep the overall backup operation under 200 rest calls per second. Another performance option if the Backup Strategy for Azure Tables. By default, all tables are simply scanned. While this works best for smaller Azure Tables, larger tables can use the GUID strategy, which will issue requests against an Azure Table in parallel assuming the PartitionKey stores GUID values. It doesn’t mean that your PartitionKey must have GUIDs however for this strategy to work; but the backup algorithm is tuned for this condition. Other options are available as well, such as filtering which columns, entities or tables are being backed up. Check out more on the Blue Syntax website at http://www.bluesyntax.net.

    Read the article

  • Why We Need UX Designers

    - by Tim Murphy
    Ok, so maybe this is really why I need UX designers. While I have always had an interest in photography and can appreciate a well designed user interface, putting one together is an entirely different endeavor. Being color blind doesn't help, but coming up with ideas is probably the biggest portion of the issue. I can spot things that just don't look like they work right, but what will? UX design is an area where most companies do not spend much, if any, resources. As they say, you only get one chance to make a first impression, and a poorly designed site or application is a bad first impression. Given that, you would think that companies would invest more in appearance and usability. One of two things needs to be done to rectify this issue. Either we need to start educating our developers on user experience and design, or we need to start finding ways to subsidize putting full-time designers on our project teams. Maybe it should be a time-share type of situation, but something needs to be done. As architects we need to impress on our project stakeholders the importance of User Experience and why it should be part of the budget. If they hear it often enough, eventually they may present it to you as their own idea. del.icio.us Tags: User Experience,UX,Application Design

    Read the article

  • New Enhanced Visual WebGui WINWEB and .NETHTML5 Versions

    - by Webgui
    After a long wait and huge anticipation from the Visual WebGui community, I am happy to announce the release of new versions for the WINWEB and .NETHTML5 branches. The new 6.4.0 Release d and 6.4.0 beta3 versions are available after extensive work on core capabilities of Visual WebGui, including extension of existing controls and the addition of new controls such as Strip Controls, RibbonBar, DataGridView, ComboBox, PropertyGrid and RadioButton, as well as some major enhancements to both versions in terms of cross-browser support and performance. We apologize for the delay in the release of these much-anticipated versions, but we believe that the extra time led to a more mature and complete product. As you can see, the changelog is pretty long and includes a list of enhancements, new features and bug fixes: http://visualwebgui.com/Developers/KB/tabid/654/article/w_changelogs/Default.aspx The new versions are available, with open source, for Visual Studio 2005, 2008 and 2010. You are welcome to download the WINWEB Free Trial and the Free .NETHTML5 beta on the downloads page.

    Read the article

  • What label of tests are BizUnit tests?

    - by charlie.mott
    BizUnit is defined as a "Framework for Automated Testing of Distributed Systems". However, I've never seen a catchy label to describe what sort of tests we create using this framework. They are not really "Unit Tests", that's for sure. "Integration Tests" might be a good definition, but I want a label that clearly separates it from the manual "System Integration Testing" phase of a project where real instances of the integrated systems are used. Among some colleagues, we brainstormed some suggestions:
    Automated Integration Tests
    Stubbed Integration Tests
    Sandbox Integration Tests
    Localised Integration Tests
    All give a good view of the sorts of tests that are being done. I think "Stubbed Integration Tests" is the most catchy and descriptive, so I will use that until someone comes up with a better idea.

    Read the article

  • SQL Server-Determine which query is taking a long time to complete

    - by Neil Smith
    Cool little trick to determine which SQL query is taking a long time to execute. First, while the offending query is running, from another machine do EXEC sp_who2. Locate the SPID responsible via the Login, DBName and ProgramName columns, then do DBCC INPUTBUFFER (<SPID>). The offending query will be in the EventInfo column. This is a great little time saver for me; before I found out about this I used to split my concatenated query script into multiple SQL files until I located the problem query.
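    If you prefer to pull the same sort of information from code rather than running the two commands interactively, a rough equivalent using the SQL Server dynamic management views might look like the sketch below (an illustration only, not part of the original tip; the connection string is a placeholder):
    using System;
    using System.Data.SqlClient;

    class LongRunningQueries
    {
        static void Main()
        {
            // Placeholder connection string - point it at your own server.
            const string connectionString = "Server=.;Database=master;Integrated Security=true";

            // Same idea as sp_who2 + DBCC INPUTBUFFER: list running requests and their statement text.
            const string sql = @"
                SELECT r.session_id, r.start_time, t.text
                FROM sys.dm_exec_requests r
                CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
                WHERE r.session_id <> @@SPID
                ORDER BY r.start_time";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("SPID {0}, running since {1}: {2}",
                            reader["session_id"], reader["start_time"], reader["text"]);
                    }
                }
            }
        }
    }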

    Read the article

  • Custom Session Management using HashTable

    - by kaleidoscope
    ASP.NET session state lets you associate a server-side string or object dictionary containing state data with a particular HTTP client session. A session is defined as a series of requests issued by the same client within a certain period of time, and is managed by associating a session ID with each unique client. The ID is supplied by the client on each request, either in a cookie or as a special fragment of the request URL. The session data is stored on the server side in one of the supported session state stores, which include in-process memory, SQL Server™ database, and the ASP.NET State Server service. The latter two modes enable session state to be shared among multiple Web servers on a Web farm and do not require server affinity. To implement a custom session handler you need to follow this process:
    1. Create a class library with a class that inherits from the SessionStateStoreProviderBase abstract class.
    2. Implement all the abstract methods of the base class in your derived class.
    3. Change the session mode to "Custom" in the web.config file and register your provider; the customProvider attribute refers to the name of the entry in the <providers> collection, whose type points at your namespace and class name:
    <sessionState mode="Custom" customProvider="MyProvider">
      <providers>
        <add name="MyProvider" type="Namespace.ClassName" />
      </providers>
    </sessionState>
    For more details please refer to the following links:
    http://msdn.microsoft.com/en-us/magazine/cc163730.aspx
    http://msdn.microsoft.com/en-us/library/system.web.sessionstate.sessionstatestoreproviderbase.aspx
    - Chandraprakash, S
    Technorati Tags: Chandraprakash, Session State Management
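    As a starting point for step 2, here is a minimal, illustrative skeleton of such a provider (not production code): it keeps all session data in a static synchronized Hashtable, skips locking and expiration entirely, and the class and names are placeholders that should match whatever you register in the type attribute above.
    using System;
    using System.Collections;
    using System.Web;
    using System.Web.SessionState;

    // Hypothetical single-server provider: session data lives in a static Hashtable.
    public class HashtableSessionStateProvider : SessionStateStoreProviderBase
    {
        private static readonly Hashtable Store = Hashtable.Synchronized(new Hashtable());

        public override SessionStateStoreData CreateNewStoreData(HttpContext context, int timeout)
        {
            return new SessionStateStoreData(new SessionStateItemCollection(),
                SessionStateUtility.GetSessionStaticObjects(context), timeout);
        }

        public override void CreateUninitializedItem(HttpContext context, string id, int timeout)
        {
            Store[id] = CreateNewStoreData(context, timeout);
        }

        public override SessionStateStoreData GetItem(HttpContext context, string id,
            out bool locked, out TimeSpan lockAge, out object lockId, out SessionStateActions actions)
        {
            // No locking semantics in this sketch.
            locked = false; lockAge = TimeSpan.Zero; lockId = null; actions = SessionStateActions.None;
            return (SessionStateStoreData)Store[id];
        }

        public override SessionStateStoreData GetItemExclusive(HttpContext context, string id,
            out bool locked, out TimeSpan lockAge, out object lockId, out SessionStateActions actions)
        {
            return GetItem(context, id, out locked, out lockAge, out lockId, out actions);
        }

        public override void SetAndReleaseItemExclusive(HttpContext context, string id,
            SessionStateStoreData item, object lockId, bool newItem)
        {
            Store[id] = item;
        }

        public override void RemoveItem(HttpContext context, string id, object lockId, SessionStateStoreData item)
        {
            Store.Remove(id);
        }

        public override void ReleaseItemExclusive(HttpContext context, string id, object lockId) { }
        public override void ResetItemTimeout(HttpContext context, string id) { }
        public override bool SetItemExpireCallback(SessionStateItemExpireCallback expireCallback) { return false; }
        public override void InitializeRequest(HttpContext context) { }
        public override void EndRequest(HttpContext context) { }
        public override void Dispose() { }
    }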

    Read the article

  • How to: Add an HTTPS Endpoint to a Windows Azure Cloud Service

    - by kaleidoscope
    Technorati Tags: Ritesh, Windows Azure, Endpoints, https
    Adding an HTTPS endpoint to a Windows Azure cloud service is a 3-step process:
    1. Configure the endpoint
    2. Upload the certificate to the cloud
    3. Configure the SSL certificate (and then point the endpoint to that certificate)
    Reference – http://blogs.msdn.com/jnak/archive/2009/12/01/how-to-add-an-https-endpoint-to-a-windows-azure-cloud-service.aspx
    - Ritesh, D

    Read the article

  • Implementing the Reactive Manifesto with Azure and AWS

    - by Elton Stoneman
    Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2013/10/31/implementing-the-reactive-manifesto-with-azure-and-aws.aspx
    My latest Pluralsight course, Implementing the Reactive Manifesto with Azure and AWS, has just been published! I'd planned to do a course on dual-running a messaging-based solution in Azure and AWS for super-high availability and scale, and the Reactive Manifesto encapsulates exactly what I wanted to do. A "reactive" application describes an architecture which is inherently resilient and scalable, being event-driven at the core and using asynchronous communication between components. In the course, I compare that architecture to a classic n-tier approach, and go on to build out an app which exhibits all the reactive traits: responsive, event-driven, scalable and resilient. I use a suite of technologies which are enablers for all those traits:
    ASP.NET SignalR for presentation, with server push notifications to the user
    Messaging in the middle layer for asynchronous communication between presentation and compute: Azure Service Bus Queues and Topics, AWS Simple Queue Service, AWS Simple Notification Service
    MongoDB at the storage layer for easy HA and scale, with minimal locking under load
    Starting with a couple of console apps to demonstrate message sending, I build the solution up over 7 modules, deploying to Azure and AWS and running the app across both clouds concurrently for the whole stack - web servers, messaging infrastructure, message handlers and database servers. I demonstrate failover by killing off bits of infrastructure, and show how a reactive app deployed across two clouds can survive machine failure, data centre failure and even whole cloud failure. The course finishes by configuring auto-scaling in AWS and Azure for the compute and presentation layers, and running a load test with blitz.io. The test pushes masses of load into the app, which is deployed across four data centres in Azure and AWS, and the infrastructure scales up seamlessly to meet the load – the blitz report is pretty impressive: that's a 99.9% success rate for hits to the website, with the potential to serve over 36,000,000 hits per day – all from a few hours' build time and a fairly limited set of auto-scale configurations. When the load stops, the infrastructure scales back down again to a minimal set of servers for high availability, so the app doesn't cost much to host unless it's getting a lot of traffic. This is my third course for Pluralsight, with Nginx and PHP Fundamentals and Caching in the .NET Stack: Inside-Out released earlier this year. Now that it's out, I'm starting on the fourth one, which is focused on C#, and should be out by the end of the year.
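    To give a flavour of the kind of asynchronous messaging the course is built around, here is a minimal, self-contained sketch against the Azure Service Bus brokered messaging API of that era (illustrative only, not code from the course; the connection string and queue name are placeholders):
    using System;
    using Microsoft.ServiceBus.Messaging;

    class MessagingSketch
    {
        static void Main()
        {
            // Placeholder connection string and queue name.
            const string connectionString = "Endpoint=sb://your-namespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=your-key";
            var client = QueueClient.CreateFromConnectionString(connectionString, "events");

            // Producer side: the web tier drops an event on the queue and stays responsive.
            client.Send(new BrokeredMessage("order-placed:1234"));

            // Consumer side: a message pump handles events as they arrive;
            // messages are completed automatically when the callback returns.
            client.OnMessage(message => Console.WriteLine("Handling {0}", message.GetBody<string>()));

            Console.ReadLine();
        }
    }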

    Read the article

  • Send Multiple InMemory Attachments Using FileUpload Controls

    - by bullpit
    I wanted to give users an ability to send multiple attachments from the web application. I did not want anything fancy, just a few FileUpload controls on the page and then send the email. So I dropped five FileUpload controls on the web page and created a function to send email with multiple attachments. Here’s the code: public static void SendMail(string fromAddress, string toAddress, string subject, string body, HttpFileCollection fileCollection)     {         // CREATE THE MailMessage OBJECT         MailMessage mail = new MailMessage();           // SET ADDRESSES         mail.From = new MailAddress(fromAddress);         mail.To.Add(toAddress);           // SET CONTENT         mail.Subject = subject;         mail.Body = body;         mail.IsBodyHtml = false;                        // ATTACH FILES FROM HttpFileCollection         for (int i = 0; i < fileCollection.Count; i++)         {             HttpPostedFile file = fileCollection[i];             if (file.ContentLength > 0)             {                 Attachment attachment = new Attachment(file.InputStream, Path.GetFileName(file.FileName));                 mail.Attachments.Add(attachment);             }         }           // SEND MESSAGE         SmtpClient smtp = new SmtpClient("127.0.0.1");         smtp.Send(mail);     } And here’s how you call the method: protected void uxSendMail_Click(object sender, EventArgs e)     {         HttpFileCollection fileCollection = Request.Files;         string fromAddress = "[email protected]";         string toAddress = "[email protected]";         string subject = "Multiple Mail Attachment Test";         string body = "Mail Attachments Included";         HelperClass.SendMail(fromAddress, toAddress, subject, body, fileCollection);            }

    Read the article

  • 2010 is gone and Welcome 2011

    - by anirudha
    These last days I spent my week at Firozabad. The town is quite small and near to Agra, so I couldn't pass up the chance to see the Taj Mahal and the Red Fort there; it was even my first chance to see them. I made a plan to go to Agra last Saturday. First I went to the Red Fort, and I talked with many foreigners. They loved talking with me, because usually the only person who talks with them is their GUIDE, a person like a book who never really talks with you but tells you everything about the location because you paid for them. There were many people from various countries, such as Germany, Japan, Russia, Italy and many others. There was no problem talking with them; they were happy to talk to me. When I had finished seeing the Red Fort, I saw a girl who looked like a foreigner. I asked where they came from and they told me France. When I was about to go elsewhere, I thought to propose that they be a friend of mine. I had never proposed friendship to any girl, even in school and college, but I proposed that they be a friend of mine. They accepted it, and I put my email ID in their hand before they left, but I still have not got their mail. Secondly I went to the Taj Mahal. The Taj experience was not so good; I spent 3 or 4 hours in the rush. I found there was no real security even though there were many army personnel. They were all too slow at their work: they spent 10 minutes checking one person for security, and their hands worked very slowly, just like a low-configuration computer. I talked with many people there too. I talked to a person who called himself Jacob, from Chicago. He spoke very fast and I did not know what he was saying. Another problem I had was with some Chinese visitors: when I talked with them I found they spoke only Chinese. Wish you a very, very happy new year.

    Read the article

  • Mapping Repeating Sequence Groups in BizTalk

    - by Paul Petrov
    Repeating sequence groups can often be seen in real-life XML documents. It happens when a certain sequence of elements repeats in the instance document. Here's a fairly abstract example of a schema definition that contains a sequence group:
    <xs:schema xmlns:b="http://schemas.microsoft.com/BizTalk/2003" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns="NS-Schema1" targetNamespace="NS-Schema1">
      <xs:element name="RepeatingSequenceGroups">
        <xs:complexType>
          <xs:sequence maxOccurs="1" minOccurs="0">
            <xs:sequence maxOccurs="unbounded">
              <xs:element name="A" type="xs:string" />
              <xs:element name="B" type="xs:string" />
              <xs:element name="C" type="xs:string" minOccurs="0" />
            </xs:sequence>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>
    And here's the corresponding XML instance document:
    <ns0:RepeatingSequenceGroups xmlns:ns0="NS-Schema1">
      <A>A1</A>
      <B>B1</B>
      <C>C1</C>
      <A>A2</A>
      <B>B2</B>
      <A>A3</A>
      <B>B3</B>
      <C>C3</C>
    </ns0:RepeatingSequenceGroups>
    As you can see, elements A, B, and C are children of an anonymous xs:sequence element which in turn can be repeated N times. Let's say we need to do a simple mapping to a schema with a similar structure but with different element names:
    <ns0:Destination xmlns:ns0="NS-Schema2">
      <Alpha>A1</Alpha>
      <Beta>B1</Beta>
      <Gamma>C1</Gamma>
      <Alpha>A2</Alpha>
      <Beta>B2</Beta>
      <Gamma>C2</Gamma>
    </ns0:Destination>
    The basic map for such a typical task would look pretty straightforward. If we test this map without any modification it will produce the following result:
    <ns0:Destination xmlns:ns0="NS-Schema2">
      <Alpha>A1</Alpha>
      <Alpha>A2</Alpha>
      <Alpha>A3</Alpha>
      <Beta>B1</Beta>
      <Beta>B2</Beta>
      <Beta>B3</Beta>
      <Gamma>C1</Gamma>
      <Gamma>C3</Gamma>
    </ns0:Destination>
    The original order of the elements inside the sequence is lost, and that's not what we want. The default behavior of the BizTalk 2009 and 2010 Map Editor is to generate a map compatible with older versions that did not have the ability to preserve sequence order. To enable this feature simply open the map file (*.btm) in a text/XML editor and find the attribute PreserveSequenceOrder of the root <mapsource> element. Set its value to Yes and re-test the map:
    <ns0:Destination xmlns:ns0="NS-Schema2">
      <Alpha>A1</Alpha>
      <Beta>B1</Beta>
      <Gamma>C1</Gamma>
      <Alpha>A2</Alpha>
      <Beta>B2</Beta>
      <Alpha>A3</Alpha>
      <Beta>B3</Beta>
      <Gamma>C3</Gamma>
    </ns0:Destination>
    The result is as expected – all corresponding elements are in the same order as in the source document. Under the hood it is achieved by using one common xsl:for-each statement that pulls all elements in the original order (rather than using an individual for-each statement per element name in the default mode) and xsl:if statements to test the current element in the loop:
    <xsl:template match="/s0:RepeatingSequenceGroups">
      <ns0:Destination>
        <xsl:for-each select="A|B|C">
          <xsl:if test="local-name()='A'">
            <Alpha>
              <xsl:value-of select="./text()" />
            </Alpha>
          </xsl:if>
          <xsl:if test="local-name()='B'">
            <Beta>
              <xsl:value-of select="./text()" />
            </Beta>
          </xsl:if>
          <xsl:if test="local-name()='C'">
            <Gamma>
              <xsl:value-of select="./text()" />
            </Gamma>
          </xsl:if>
        </xsl:for-each>
      </ns0:Destination>
    </xsl:template>
    The BizTalk Map Editor became smarter, so learn and use this lesser-known feature in your maps and XSL stylesheets.

    Read the article

  • SharePoint 2010 PowerShell Script to Find All SPShellAdmins with Database Name

    - by Brian Jackett
    Problem     Yesterday on Twitter my friend @cacallahan asked for some help on how she could get all SharePoint 2010 SPShellAdmin users and the associated database name.  I spent a few minutes and wrote up a script that gets this information and decided I’d post it here for others to enjoy.     Background     The Get-SPShellAdmin commandlet returns a listing of SPShellAdmins for the given database Id you pass in, or the farm configuration database by default.  For those unfamiliar, SPShellAdmin access is necessary for non-admin users to run PowerShell commands against a SharePoint 2010 farm (content and configuration databases specifically).  Click here to read an excellent guest post article my friend John Ferringer (twitter) wrote on the Hey Scripting Guy! blog regarding granting SPShellAdmin access.  Solution     Below is the script I wrote (formatted for space and to include comments) to provide the information needed. Click here to download the script.   # declare a hashtable to store results $results = @{}   # fetch databases (only configuration and content DBs are needed) $databasesToQuery = Get-SPDatabase | Where {$_.Type -eq 'Configuration Database' -or $_.Type -eq 'Content Database'}   # for each database get spshelladmins and add db name and username to result $databasesToQuery | ForEach-Object {$dbName = $_.Name; Get-SPShellAdmin -database $_.id | ForEach-Object {$results.Add($dbName, $_.username)}}   # sort results by db name and pipe to table with auto sizing of col width $results.GetEnumerator() | Sort-Object -Property Name | ft -AutoSize     Conclusion     In this post I provided a script that outputs all of the SPShellAdmin users and the associated database names in a SharePoint 2010 farm.  Funny enough it actually took me longer to boot up my dev VM and PowerShell (~3 mins) than it did to write the first working draft of the script (~2 mins).  Feel free to use this script and modify as needed, just be sure to give credit back to the original author.  Let me know if you have any questions or comments.  Enjoy!         -Frog Out   Links PowerShell Hashtables http://technet.microsoft.com/en-us/library/ee692803.aspx SPShellAdmin Access Explained http://blogs.technet.com/b/heyscriptingguy/archive/2010/07/06/hey-scripting-guy-tell-me-about-permissions-for-using-windows-powershell-2-0-cmdlets-with-sharepoint-2010.aspx

    Read the article

  • Given a database table where multiple rows have the same values and only the most recent record is to be returned

    - by Jim Lahman
    I have a table where there are multiple records with the same value but varying creation dates.  A sample of the database columns is shown here:   1: select lot_num, to_char(creation_dts,'DD-MON-YYYY HH24:MI:SS') as creation_date 2: from coil_setup 3: order by lot_num   LOT_NUM                        CREATION_DATE        ------------------------------ -------------------- 1435718.002                    24-NOV-2010 11:45:54 1440026.002                    17-NOV-2010 06:50:16 1440026.002                    08-NOV-2010 23:28:24 1526564.002                    01-DEC-2010 13:14:04 1526564.002                    08-NOV-2010 22:39:01 1526564.002                    01-NOV-2010 17:04:30 1605920.003                    29-DEC-2010 10:01:24 1945352.003                    14-DEC-2010 01:50:37 1945352.003                    09-DEC-2010 04:44:22 1952718.002                    25-OCT-2010 09:33:19 1953866.002                    20-OCT-2010 18:38:31 1953866.002                    18-OCT-2010 16:15:25   Notice that there are multiple instances of of the same lot number as shown in bold. To only return the most recent instance, issue this SQL statement: 1: select lot_num, to_char(creation_date,'DD-MON-YYYY HH24:MI:SS') as creation_date 2: from 3: ( 4: select rownum r, lot_num, max(creation_dts) as creation_date 5: from coil_setup group by rownum, lot_num 6: order by lot_num 7: ) 8: where r < 100  LOT_NUM                        CREATION_DATE        ------------------------------ -------------------- 2019416.002                    01-JUL-2010 00:01:24 2022336.003                    06-OCT-2010 15:25:01 2067230.002                    01-JUL-2010 00:36:48 2093114.003                    02-JUL-2010 20:10:51 2093982.002                    02-JUL-2010 14:46:11 2093984.002                    02-JUL-2010 14:43:18 2094466.003                    02-JUL-2010 20:04:48 2101074.003                    11-JUL-2010 09:02:16 2103746.002                    02-JUL-2010 15:07:48 2103758.003                    11-JUL-2010 09:02:13 2104636.002                    02-JUL-2010 15:11:25 2106688.003                    02-JUL-2010 13:55:27 2106882.003                    02-JUL-2010 13:48:47 2107258.002                    02-JUL-2010 12:59:48 2109372.003                    02-JUL-2010 20:49:12 2110182.003                    02-JUL-2010 19:59:19 2110184.003                    02-JUL-2010 20:01:03

    Read the article

  • Blogging from Office RT

    - by Dennis Vroegop
    During the last Build conference all attendees were given a brand new sparkling exciting Surface RT device (I love that machine despite its name, but that's beside the point). It came with a version of Office 2013 RT, or better: the preview version. Now, I translated that term "Preview" to "Beta". Which is OK, since I've been using a lot of beta products from Microsoft and they all were great. And then I wanted to post a blogposting from Word. I knew I could; I have been doing this for a long time (I prefer Live Writer but that isn't available on Windows 8 RT). So I wrote the entry and hit "Publish". Instead of my blogsite I got a nice non-descriptive error telling me I couldn't post. So I fired up my other (Intel based) Win8 tablet, opened Word, and it loaded my blogpost (you've got to love the automatic synchronization through Skydrive), so I tried from that machine. Same error. So, I installed Live Writer (remember, the other machine is Intel based) and posted from there. That worked like a charm. Apparently, there was something wrong with Word. I gave up and didn't think about it anymore. Yet… what you're reading now is written in Word 2013 RT on my Surface RT. So what did I do? Simple: I updated from the Preview version to the final version. That's all there was to it. So…. If you're still on the preview I urge you to upgrade. You need to go to the "classic desktop update" window instead of going through the Windows Store App style update since Office is a desktop system, but once you do that you'll have the full version as well. Happy blogging!

    Read the article

  • It was a figure of speech!

    - by Ratman21
    Yesterday I posted the following as attention getter / advertisement (as well as my feelings). In the groups, (I am in) on the social networking site, LinkedIn and boy did I get responses.    I am fighting mad about (a figure of speech, really) not having a job! Look just because I am over 55 and have gray hair. It does not mean, my brain is dead or I can no longer trouble shoot a router or circuit or LAN issue. Or that I can do “IT” work at all. And I could prove this if; some one would give me at job. Come on try me for 90 days at min. wage. I know you will end up keeping me (hope fully at normal pay) around. Is any one hearing me…come on take up the challenge!     This was the responses I got.   I hear you. We just need to retrain and get our skills up to speed is all. That is what I am doing. I have not given up. Just got to stay on top of the game. Experience is on our side if we have the credentials and we are reasonable about our salaries this should not be an issue.   Already on it, going back to school and have got three certifications (CompTIA A+, Security+ and Network+. I am now studying for my CISCO CCNA certification. As to my salary, I am willing to work at very reasonable rate.   You need to re-brand yourself like a product, market and sell yourself. You need to smarten up, look and feel a million dollars, re-energize yourself, regain your confidents. Either start your own business, or re-write your CV so it stands out from the rest, get the template off the internet. Contact every recruitment agent in your town, state, country and overseas, and on the web. Apply to every job you think you could do, you may not get it but you will make a contact for your network, which may lead to a job at the end of the tunnel. Get in touch with everyone you know from past jobs. Do charity work. I maintain the IT Network, stage electrical and the Telecom equipment in my church,   Again already on it. I have email the world is seems with my resume and cover letters. So far, I have rewritten or had it rewrote, my resume and cover letters; over seven times so far. Re-energize? I never lost my energy level or my self-confidents in my work (now if could get some HR personal to see the same). I also volunteer at my church, I created and maintain the church web sit.   I share your frustration. Sucks being over 50 and looking for work. Please don't sell yourself short at min wage because the employer will think that’s your worth. Keep trying!!   I never stop trying and min wage is only for 90 days. If some one takes up the challenge. Some post asked if I am keeping up technology.   Do you keep up with the latest technology and can speak the language fluidly?   Yep to that and as to speaking it also a yep! I am a geek you know. I heard from others over the 50 year mark and younger too.   I'm with you! I keep getting told that I don't have enough experience because I just recently completed a Masters level course in Microsoft SQL Server, which gave me a project-intensive equivalent of between 2 and 3 years of experience. On top of that training, I have 19 years as an applications programmer and database administrator. I can normalize rings around experienced DBAs and churn out effective code with the best of them. But my 19 years is worthless as far as most recruiters and HR people are concerned because it is not the specific experience for which they're looking. HR AND RECRUITERS TAKE NOTE: Experience, whatever the language, translates across platforms and technology! 
By the way, I'm also over 55 and still have "got it"!   I never lost it and I also can work rings round younger techs.   I'm 52 and female and seem to be having the same issues. I have over 10 years experience in tech support (with a BS in CIS) and can't get hired either.   Ow, I only have an AS in computer science along with my certifications.   Keep the faith, I have been unemployed since August of 2008. I agree with you...I am willing to return to the beginning of my retail career and work myself back through the ranks, if someone will look past the grey and realize the knowledge I would bring to the table.   I also would like some one to look past the gray.   Interesting approach, volunteering to work for minimum wage for 90 days. I'm in the same situation as you, being 55 & balding w/white hair, so I know where you're coming from. I've been out of work now for a year. I'm in Michigan, where the unemployment rate is estimated to be 15% (the worst in the nation) & even though I've got 30+ years of IT experience ranging from mainframe to PC desktop support, it's difficult to even get a face-to-face interview. I had one prospective employer tell me flat out that I "didn't have the energy required for this position". Mostly I never get any feedback. All I can say is good luck & try to remain optimistic.   He said WHAT! Yes remaining optimistic is key. Along with faith in God. Then there was this (for lack of better word) jerk.   Give it up already. You were too old to work in high tech 10 years ago. Scratch that, 20 years ago! Try selling hot dogs in front of Fry's Electronics. At least you would get a chance to eat lunch with your previous colleagues....   You know funny thing on this person is that I checked out his profile. He is older than I am.

    Read the article

  • PublishingWeb.ExcludeFromNavigation Missing

    - by Michael Van Cleave
    So recently I have had to make the transition from the SharePoint 2007 codebase to SharePoint 2010 codebase. Needless to say there hasn't been much difficulty in the changeover. However in a set of code that I was playing around with and transitioning I did find one change that might cause some pain to others out there that have been programming against the PublishingWeb object in the Microsoft.SharePoint.Publishing namespace. The 2007 snippet of code that work just fine in 2007 looks like: using (SPSite site = new SPSite(url)) using (SPWeb web = site.OpenWeb()) {      PublishingWeb publishingWeb = PublishingWeb.GetPublishingWeb(web);     publishingWeb.ExcludeFromNavigation(true, itemID);     publishingWeb.Update(); } The 2010 update to the code looks like: using (SPSite site = new SPSite(url)) using (SPWeb web = site.OpenWeb()) {     PublishingWeb publishingWeb = PublishingWeb.GetPublishingWeb(web);     publishingWeb.Navigation.ExcludeFromNavigation(true, itemID); //--Had to reference the Navigation object.     publishingWeb.Update(); } The purpose of the code is to keep a page from showing up on the global or current navigation when it is added to the pages library. However you see that the update to the 2010 codebase actually makes more "object" sense. It specifies that you are technically affecting the Navigation of the Publishing Web, instead of the Publishing Web itself. I know that this isn't a difficult problem to fix, but I thought it would be something quick to help out the general public. Michael

    Read the article

  • Thoughts about MVC

    - by ayyash
    So I figured this one out. As a newcomer to the web development scene from the telecom biz, where we dealt with low-level hardware APIs, and as a C++ fanboy, I tend to be bothered by automagical code. Yes, I appreciate the effort that went into it, and I certainly appreciate the luxury it provides to get more things done, but I just don't get it. So I decided to change that and start investigating the new MVC-based web apps. At first it was like hitting a brick wall. I knew MVC from MFC days, so I'm familiar with the pattern, but I just couldn't get my head around the web version of it, till I came to realize that the routing is actually a separate feature to be inspected on its own, much like understanding how LINQ works by better understanding anonymous objects. And so this article serves as an introduction to the following blogs, where I share my views on how ASP.NET routing works, then leverage that at the MVC level and play around that field for a bit. As with most of my shared knowledge, it may seem trivial to some, but I guess a newcomer's point of view can be useful for some folks out there.
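    For anyone who wants a concrete picture of what "the routing" refers to before those posts arrive, this is the sort of minimal, convention-based route registration an ASP.NET MVC project starts from (a generic illustration, not tied to any particular project):
    using System.Web.Mvc;
    using System.Web.Routing;

    public class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

            // A URL like /Products/Details/5 maps to ProductsController.Details(5).
            routes.MapRoute(
                name: "Default",
                url: "{controller}/{action}/{id}",
                defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional });
        }
    }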

    Read the article

  • Marshalling the value of a char* ANSI string DLL API parameter into a C# string

    - by Brian Biales
    For those who do not mix .NET C# code with legacy DLLs that use char* pointers on a regular basis, the process to convert the strings one way or the other is non-obvious. This is not a comprehensive article on the topic at all, but rather an example of something that took me some time to go find; maybe it will save someone else some time. I am utilizing a third-party tool that uses a callback function to inform my application of its progress. This callback includes a pointer that under some circumstances is a pointer to an ANSI character string. I just need to marshal it into a C# string variable. Seems pretty simple, yes? Well, it is (as are most things, once you know how to do them). The parameter of my callback function is of type IntPtr, which implies it is an integer representation of a pointer. If I know the pointer is pointing to a simple ANSI string, here is a simple static method to copy it to a C# string:
    private static string GetStringFromCharStar(IntPtr ptr)
    {
        return System.Runtime.InteropServices.Marshal.PtrToStringAnsi(ptr);
    }
    The System.Runtime.InteropServices namespace is where to look any time you are mixing legacy unmanaged code with your .NET application.
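    To put that method in context, here is a hedged sketch of how such a callback might be declared and wired up with P/Invoke. The DLL name, exported function and delegate shape are made-up placeholders (the real third-party API is not shown in the post); only the IntPtr-to-string conversion mirrors the method above.
    using System;
    using System.Runtime.InteropServices;

    class CallbackExample
    {
        // Hypothetical delegate matching a native "void (*)(const char*)" callback.
        [UnmanagedFunctionPointer(CallingConvention.Cdecl)]
        delegate void ProgressCallback(IntPtr message);

        // Hypothetical export from the third-party DLL.
        [DllImport("ThirdParty.dll", CallingConvention = CallingConvention.Cdecl)]
        static extern void RegisterProgressCallback(ProgressCallback callback);

        static void OnProgress(IntPtr ptr)
        {
            // Copy the ANSI string the native side points at into a managed string.
            string text = Marshal.PtrToStringAnsi(ptr);
            Console.WriteLine(text);
        }

        static void Main()
        {
            // Keep a reference to the delegate so the GC cannot collect it
            // while native code still holds the function pointer.
            ProgressCallback callback = OnProgress;
            RegisterProgressCallback(callback);
            Console.ReadLine();
            GC.KeepAlive(callback);
        }
    }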

    Read the article

  • MVC & Windows Authentication

    - by TJ
    Changed the ValidateUser function:
    public bool ValidateUser(string userName, string password)
    {
        bool validation;
        try
        {
            LdapConnection ldc = new LdapConnection(new LdapDirectoryIdentifier((string)null, false, false));
            NetworkCredential nc = new NetworkCredential(userName, password, "domainname.com");
            ldc.Credential = nc;
            ldc.AuthType = AuthType.Negotiate;
            ldc.Bind(nc); // user has authenticated at this point, as the credentials were used to login to the dc.
            string myvar = ldc.SessionOptions.DomainName;
            validation = true;
        }
        catch (LdapException)
        {
            validation = false;
        }
        return validation;
    }

    Read the article
