Search Results

Search found 2044 results on 82 pages for 'rob west'.

Page 70/82

  • USB flash drive shows empty, but half of its capacity is in use

    - by tamakisquare
    Not sure if I should post my question on Super User, but it looks like the most appropriate place among all the StackExchange sites. I have a 16GB Kingston DataTraveler USB drive. When I tried to use it this morning, it showed no files at all, yet its details showed that half of the capacity was in use. I tried it with OS X, Ubuntu, and Windows 7 and the results were the same. I tried to create a new folder and it worked. Apparently, the drive is working but somehow not showing my previously stored data. Note that I was still using the drive last night and there weren't any problems. Following @rob's suggestion, du -h gave me:
        16K    ./.Trashes
        960K   ./.Spotlight-V100/Store-V1/Stores/2620683B-A38B-42F4-A247-45CAF4826ADE
        976K   ./.Spotlight-V100/Store-V1/Stores
        1008K  ./.Spotlight-V100/Store-V1
        1.0M   ./.Spotlight-V100
        1.1M
    And df -h gave me:
        /dev/sdb1  15G  7.9G  7.1G  53%  /media/KINGSTON
    This confirms what I reported. Anyone got a clue/answer to this issue? Thanks.

    Read the article

  • Is a paid-only version the better model for selling apps on the App Store?

    - by ????
    Rob Napier, the author of iOS 5 Programming: Pushing the Limits, mentioned that there are several models for selling apps on the App Store:
    1. Write an app and sell it.
    2. Publish a free and a full version.
    3. Ad-supported, either by a third party or by iAd.
    4. In-app purchase.
    Surprisingly, the author said that (1) is the most workable model in terms of sales. I would have thought that (2), with a suitably limited free version, could bring more sales, since people might not plunge down $0.99 or $1.99 for something they haven't tried. I, for one, might not have purchased Angry Birds if I hadn't tried the free version first. I also think it depends on the situation: if the app is an alarm clock, and there are already five free alarm clocks in the App Store, then your $0.99 app might not be eagerly purchased. But if yours is also free, and users like it best of all the alternatives, then they may think $0.99 is nothing to pay for a good alarm clock and gladly hand it over for the full version, which offers something the free version can't (such as letting you choose a song from your Music Library for the alarm). Could (1) work only if users definitely want the app and have no substitute? How might it work best?

    Read the article

  • Swing button repaint issue

    - by KáGé
    Hello, I'm new to java and I have to get a school project done by Sunday and got a problem. Here's the code: private abstract class GamePanel { JPanel panel = null; } private class PutPanel extends GamePanel { JButton putShip1 = new JButton(""); JButton putShip2 = new JButton(""); JButton putShip3 = new JButton(""); JButton putShip4 = new JButton(""); ShipDirection ship1Direction = ShipDirection.NORTH; ShipDirection ship2Direction = ShipDirection.NORTH; ShipDirection ship3Direction = ShipDirection.NORTH; ShipDirection ship4Direction = ShipDirection.NORTH; JButton startButton = new JButton("Start game"); public PutPanel(){ this.panel = new JPanel(); panel.setSize(200, Torpedo.session.map.size*Field.buttonSize+300); panel.setBackground(Color.blue); putShip1.setSize(90, 90); putShip1.setLocation(55, 5); putShip1.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship1/full_north.png", null)); putShip2.setSize(90, 90); putShip2.setLocation(55, 105); putShip2.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship2/full_north.png", null)); putShip3.setSize(90, 90); putShip3.setLocation(55, 205); putShip3.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship3/full_north.png", null)); putShip4.setSize(90, 90); putShip4.setLocation(55, 305); putShip4.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship4/full_north.png", null)); startButton.setSize(150, 30); startButton.setLocation(20, Torpedo.session.map.size*Field.buttonSize+205); panel.add(putShip1); panel.add(putShip2); panel.add(putShip3); panel.add(putShip4); panel.add(startButton); startButton.addActionListener(startButton()); startButton.addActionListener(putShip1()); startButton.addActionListener(putShip2()); startButton.addActionListener(putShip3()); startButton.addActionListener(putShip4()); panel.setLayout(null); panel.setVisible(true); } private ActionListener startButton(){ return new ActionListener(){ public void actionPerformed(ActionEvent e) { putPanel.panel.setVisible(false); actionPanel.panel.setVisible(true); } }; } private ActionListener putShip1(){ return new ActionListener(){ public void actionPerformed(ActionEvent e) { selectedShip = 1; putShip1.setBackground(Color.red); putShip2.setBackground(null); putShip3.setBackground(null); putShip4.setBackground(null); switch(ship1Direction){ case NORTH: ship1Direction = ShipDirection.EAST; putShip1.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship1/full_east.png", null)); break; case EAST: ship1Direction = ShipDirection.SOUTH; putShip1.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship1/full_south.png", null)); break; case SOUTH: ship1Direction = ShipDirection.WEST; putShip1.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship1/full_west.png", null)); break; case WEST: ship1Direction = ShipDirection.NORTH; putShip1.setIcon(createImageIcon(Torpedo.session.map.shipPath+"/ship1/full_north.png", null)); break; } putShip1.repaint(); putShip2.repaint(); putShip3.repaint(); putShip4.repaint(); panel.repaint(); JOptionPane.showMessageDialog(new JFrame(), "Repaint finished", "Repaint status info", JOptionPane.INFORMATION_MESSAGE); //this here hangs the program when the method is finally called } }; When one of the putShip* buttons is clicked, it should rotate its own icon right 90° (means changing it to the next image), but it does nothing until the startButton is clicked, which changes the panel to an other one. (Only the first button's actionListener is here, the others' are practically the same). 
    The panel is in a JFrame with two other panels that do nothing yet. How could I make it work properly? Thank you.

    Read the article

  • Using jQuery to Insert a New Database Record

    - by Stephen Walther
    The goal of this blog entry is to explore the easiest way of inserting a new record into a database using jQuery and .NET. I’m going to explore two approaches: using Generic Handlers and using a WCF service (In a future blog entry I’ll take a look at OData and WCF Data Services). Create the ASP.NET Project I’ll start by creating a new empty ASP.NET application with Visual Studio 2010. Select the menu option File, New Project and select the ASP.NET Empty Web Application project template. Setup the Database and Data Model I’ll use my standard MoviesDB.mdf movies database. This database contains one table named Movies that looks like this: I’ll use the ADO.NET Entity Framework to represent my database data: Select the menu option Project, Add New Item and select the ADO.NET Entity Data Model project item. Name the data model MoviesDB.edmx and click the Add button. In the Choose Model Contents step, select Generate from database and click the Next button. In the Choose Your Data Connection step, leave all of the defaults and click the Next button. In the Choose Your Data Objects step, select the Movies table and click the Finish button. Unfortunately, Visual Studio 2010 cannot spell movie correctly :) You need to click on Movy and change the name of the class to Movie. In the Properties window, change the Entity Set Name to Movies. Using a Generic Handler In this section, we’ll use jQuery with an ASP.NET generic handler to insert a new record into the database. A generic handler is similar to an ASP.NET page, but it does not have any of the overhead. It consists of one method named ProcessRequest(). Select the menu option Project, Add New Item and select the Generic Handler project item. Name your new generic handler InsertMovie.ashx and click the Add button. Modify your handler so it looks like Listing 1: Listing 1 – InsertMovie.ashx using System.Web; namespace WebApplication1 { /// <summary> /// Inserts a new movie into the database /// </summary> public class InsertMovie : IHttpHandler { private MoviesDBEntities _dataContext = new MoviesDBEntities(); public void ProcessRequest(HttpContext context) { context.Response.ContentType = "text/plain"; // Extract form fields var title = context.Request["title"]; var director = context.Request["director"]; // Create movie to insert var movieToInsert = new Movie { Title = title, Director = director }; // Save new movie to DB _dataContext.AddToMovies(movieToInsert); _dataContext.SaveChanges(); // Return success context.Response.Write("success"); } public bool IsReusable { get { return true; } } } } In Listing 1, the ProcessRequest() method is used to retrieve a title and director from form parameters. Next, a new Movie is created with the form values. Finally, the new movie is saved to the database and the string “success” is returned. Using jQuery with the Generic Handler We can call the InsertMovie.ashx generic handler from jQuery by using the standard jQuery post() method. 
The following HTML page illustrates how you can retrieve form field values and post the values to the generic handler: Listing 2 – Default.htm <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title>Add Movie</title> <script src="http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.js" type="text/javascript"></script> </head> <body> <form> <label>Title:</label> <input name="title" /> <br /> <label>Director:</label> <input name="director" /> </form> <button id="btnAdd">Add Movie</button> <script type="text/javascript"> $("#btnAdd").click(function () { $.post("InsertMovie.ashx", $("form").serialize(), insertCallback); }); function insertCallback(result) { if (result == "success") { alert("Movie added!"); } else { alert("Could not add movie!"); } } </script> </body> </html>     When you open the page in Listing 2 in a web browser, you get a simple HTML form: Notice that the page in Listing 2 includes the jQuery library. The jQuery library is included with the following SCRIPT tag: <script src="http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.js" type="text/javascript"></script> The jQuery library is included on the Microsoft Ajax CDN so you can always easily include the jQuery library in your applications. You can learn more about the CDN at this website: http://www.asp.net/ajaxLibrary/cdn.ashx When you click the Add Movie button, the jQuery post() method is called to post the form data to the InsertMovie.ashx generic handler. Notice that the form values are serialized into a URL encoded string by calling the jQuery serialize() method. The serialize() method uses the name attribute of form fields and not the id attribute. Notes on this Approach This is a very low-level approach to interacting with .NET through jQuery – but it is simple and it works! And, you don’t need to use any JavaScript libraries in addition to the jQuery library to use this approach. The signature for the jQuery post() callback method looks like this: callback(data, textStatus, XmlHttpRequest) The second parameter, textStatus, returns the HTTP status code from the server. I tried returning different status codes from the generic handler with an eye towards implementing server validation by returning a status code such as 400 Bad Request when validation fails (see http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html ). I finally figured out that the callback is not invoked when the textStatus has any value other than “success”. Using a WCF Service As an alternative to posting to a generic handler, you can create a WCF service. You create a new WCF service by selecting the menu option Project, Add New Item and selecting the Ajax-enabled WCF Service project item. Name your WCF service InsertMovie.svc and click the Add button. 
Modify the WCF service so that it looks like Listing 3: Listing 3 – InsertMovie.svc using System.ServiceModel; using System.ServiceModel.Activation; namespace WebApplication1 { [ServiceBehavior(IncludeExceptionDetailInFaults=true)] [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class MovieService { private MoviesDBEntities _dataContext = new MoviesDBEntities(); [OperationContract] public bool Insert(string title, string director) { // Create movie to insert var movieToInsert = new Movie { Title = title, Director = director }; // Save new movie to DB _dataContext.AddToMovies(movieToInsert); _dataContext.SaveChanges(); // Return movie (with primary key) return true; } } }   The WCF service in Listing 3 uses the Entity Framework to insert a record into the Movies database table. The service always returns the value true. Notice that the service in Listing 3 includes the following attribute: [ServiceBehavior(IncludeExceptionDetailInFaults=true)] You need to include this attribute if you want to get detailed error information back to the client. When you are building an application, you should always include this attribute. When you are ready to release your application, you should remove this attribute for security reasons. Using jQuery with the WCF Service Calling a WCF service from jQuery requires a little more work than calling a generic handler from jQuery. Here are some good blog posts on some of the issues with using jQuery with WCF: http://encosia.com/2008/06/05/3-mistakes-to-avoid-when-using-jquery-with-aspnet-ajax/ http://encosia.com/2008/03/27/using-jquery-to-consume-aspnet-json-web-services/ http://weblogs.asp.net/scottgu/archive/2007/04/04/json-hijacking-and-how-asp-net-ajax-1-0-mitigates-these-attacks.aspx http://www.west-wind.com/Weblog/posts/896411.aspx http://www.west-wind.com/weblog/posts/324917.aspx http://professionalaspnet.com/archive/tags/WCF/default.aspx The primary requirement when calling WCF from jQuery is that the request use JSON: The request must include a content-type:application/json header. Any parameters included with the request must be JSON encoded. Unfortunately, jQuery does not include a method for serializing JSON (Although, oddly, jQuery does include a parseJSON() method for deserializing JSON). Therefore, we need to use an additional library to handle the JSON serialization. The page in Listing 4 illustrates how you can call a WCF service from jQuery. 
Listing 4 – Default2.aspx <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title>Add Movie</title> <script src="http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.js" type="text/javascript"></script> <script src="Scripts/json2.js" type="text/javascript"></script> </head> <body> <form> <label>Title:</label> <input id="title" /> <br /> <label>Director:</label> <input id="director" /> </form> <button id="btnAdd">Add Movie</button> <script type="text/javascript"> $("#btnAdd").click(function () { // Convert the form into an object var data = { title: $("#title").val(), director: $("#director").val() }; // JSONify the data data = JSON.stringify(data); // Post it $.ajax({ type: "POST", contentType: "application/json; charset=utf-8", url: "MovieService.svc/Insert", data: data, dataType: "json", success: insertCallback }); }); function insertCallback(result) { // unwrap result result = result["d"]; if (result === true) { alert("Movie added!"); } else { alert("Could not add movie!"); } } </script> </body> </html> There are several things to notice about Listing 4. First, notice that the page includes both the jQuery library and Douglas Crockford’s JSON2 library: <script src="Scripts/json2.js" type="text/javascript"></script> You need to include the JSON2 library to serialize the form values into JSON. You can download the JSON2 library from the following location: http://www.json.org/js.html When you click the button to submit the form, the form data is converted into a JavaScript object: // Convert the form into an object var data = { title: $("#title").val(), director: $("#director").val() }; Next, the data is serialized into JSON using the JSON2 library: // JSONify the data var data = JSON.stringify(data); Finally, the form data is posted to the WCF service by calling the jQuery ajax() method: // Post it $.ajax({   type: "POST",   contentType: "application/json; charset=utf-8",   url: "MovieService.svc/Insert",   data: data,   dataType: "json",   success: insertCallback }); You can’t use the standard jQuery post() method because you must set the content-type of the request to be application/json. Otherwise, the WCF service will reject the request for security reasons. For details, see the Scott Guthrie blog post: http://weblogs.asp.net/scottgu/archive/2007/04/04/json-hijacking-and-how-asp-net-ajax-1-0-mitigates-these-attacks.aspx The insertCallback() method is called when the WCF service returns a response. This method looks like this: function insertCallback(result) {   // unwrap result   result = result["d"];   if (result === true) {       alert("Movie added!");   } else {     alert("Could not add movie!");   } } When we called the jQuery ajax() method, we set the dataType to JSON. That causes the jQuery ajax() method to deserialize the response from the WCF service from JSON into a JavaScript object automatically. The following value is passed to the insertCallback method: {"d":true} For security reasons, a WCF service always returns a response with a “d” wrapper. 
The following line of code removes the “d” wrapper: // unwrap result result = result["d"]; To learn more about the “d” wrapper, I recommend that you read the following blog posts: http://encosia.com/2009/02/10/a-breaking-change-between-versions-of-aspnet-ajax/ http://encosia.com/2009/06/29/never-worry-about-asp-net-ajaxs-d-again/ Summary In this blog entry, I explored two methods of inserting a database record using jQuery and .NET. First, we created a generic handler and called the handler from jQuery. This is a very low-level approach. However, it is a simple approach that works. Next, we looked at how you can call a WCF service using jQuery. This approach required a little more work because you need to serialize objects into JSON. We used the JSON2 library to perform the serialization. In the next blog post, I want to explore how you can use jQuery with OData and WCF Data Services.
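    A loose end worth sketching: in the notes on the generic-handler approach above, Stephen mentions wanting to return 400 Bad Request for server-side validation failures. A minimal, hypothetical version of that branch might look like the following (the class name and the validation rule are illustrative, and the happy path is elided because it is identical to Listing 1); on the client, the jQuery error callback of $.ajax, rather than the post() success callback, would be the natural place to react to it.
        using System.Web;

        namespace WebApplication1
        {
            // Variation on Listing 1 that rejects incomplete submissions with an HTTP 400.
            public class InsertMovieWithValidation : IHttpHandler
            {
                public void ProcessRequest(HttpContext context)
                {
                    context.Response.ContentType = "text/plain";

                    var title = context.Request["title"];
                    var director = context.Request["director"];

                    // Refuse the request before touching the database if required fields are missing.
                    if (string.IsNullOrEmpty(title) || string.IsNullOrEmpty(director))
                    {
                        context.Response.StatusCode = 400; // Bad Request
                        context.Response.Write("Title and Director are required.");
                        return;
                    }

                    // ...insert the movie exactly as in Listing 1...
                    context.Response.Write("success");
                }

                public bool IsReusable
                {
                    get { return true; }
                }
            }
        }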

    Read the article

  • Laissez les bons temps rouler! (Microsoft BI Conference 2010)

    - by smisner
    "Laissez les bons temps rouler" is a Cajun phrase that I heard frequently when I lived in New Orleans in the mid-1990s. It means "Let the good times roll!" and encapsulates a feeling of happy expectation. As I met with many of my peers and new acquaintances at the Microsoft BI Conference last week, this phrase kept running through my mind as people spoke about their plans in their respective businesses, the benefits and opportunities that the recent releases in the BI stack are providing, and their expectations about the future of the BI stack. Notwithstanding some jabs here and there to point out the platform is neither perfect now nor will be anytime soon (along with admissions that the competitors are also not perfect), and notwithstanding several missteps by the event organizers (which I don't care to enumerate), the overarching mood at the conference was positive. It was a refreshing change from the doom and gloom hovering over several conferences that I attended in 2009. Although many people expect economic hardships to continue over the coming year or so, everyone I know in the BI field is busier than ever and expects to stay busy for quite a while. Self-Service BI Self-service was definitely a theme of the BI conference. In the keynote, Ted Kummert opened with a look back to a fairy tale vision of self-service BI that he told in 2008. At that time, the fairy tale future was a time when "every end user was able to use BI technologies within their job in order to move forward more effectively" and transitioned to the present time in which SQL Server 2008 R2, Office 2010, and SharePoint 2010 are available to deliver managed self-service BI. This set of technologies is presumably poised to address the needs of the 80% of users that Kummert said do not use BI today. He proceeded to outline a series of activities that users ought to be able to do themselves--from simple changes to a report like formatting or an addtional data visualization to integration of an additional data source. The keynote then continued with a series of demonstrations of both current and future technology in support of self-service BI. Some highlights that interested me: PowerPivot, of course, is the flagship product for self-service BI in the Microsoft BI stack. In the TechEd keynote, which was open to the BI conference attendees, Amir Netz (twitter) impressed the audience by demonstrating interactivity with a workbook containing 100 million rows. He upped the ante at the BI keynote with his demonstration of a future-state PowerPivot workbook containing over 2 billion records. It's important to note that this volume of data is being processed by a server engine, and not in the PowerPivot client engine. (Yes, I think it's impressive, but none of my clients are typically wrangling with 2 billion records at a time. Maybe they're thinking too small. This ability to work quickly with large data sets has greater implications for BI solutions than for self-service BI, in my opinion.) Amir also demonstrated KPIs for the future PowerPivot, which appeared to be easier to implement than in any other Microsoft product that supports KPIs, apart from simple KPIs in SharePoint. (My initial reaction is that we have one more place to build KPIs. Great. It's confusing enough. I haven't seen how well those KPIs integrate with other BI tools, which will be important for adoption.) One more PowerPivot feature that Amir showed was a graphical display of the lineage for calculations. 
(This is hugely practical, especially if you build up calculations incrementally. You can more easily follow the logic from calculation to calculation. Furthermore, if you need to make a change to one calculation, you can assess the impact on other calculations.) Another product demonstration will be available within the next 30 days--Pivot for Reporting Services. If you haven't seen this technology yet, check it out at www.getpivot.com. (It definitely has a wow factor, but I'm skeptical about its practicality. However, I'm looking forward to trying it out with data that I understand.) Michael Tejedor (twitter) demonstrated a feature that I think is really interesting and not emphasized nearly enough--overshadowed by PowerPivot, no doubt. That feature is the Microsoft Business Intelligence Indexing Connector, which enables search of the content of Excel workbooks and Reporting Services reports. (This capability existed in MOSS 2007, but was more cumbersome to implement. The search results in SharePoint 2010 are not only cooler, but more useful by describing whether the content is found in a table or a chart, for example.) This may yet be the dawning of the age of self-service BI - a phrase I've heard repeated from time to time over the last decade - but I think BI professionals are likely to stay busy for a long while, and need not start looking for a new line of work. Kummert repeatedly referenced strategic BI solutions in contrast to self-service BI to emphasize that self-service BI is not a replacement for the services that BI professionals provide. After all, self-service BI does not appear magically on user desktops (or whatever device they want to use). A supporting infrastructure is necessary, and grows in complexity in proportion to the need to simplify BI for users. It's one thing to hear the party line touted by Microsoft employees at the BI keynote, but it's another to hear from the people who are responsible for implementing and supporting it within an organization. Rob Collie (blog | twitter), Kasper de Jonge (blog | twitter), Vidas Matelis (site | twitter), and I were invited to join Andrew Brust (blog | twitter) as he led a Birds of a Feather session at TechEd entitled "PowerPivot: Is It the BI Deal-Changer for Developers and IT Pros?" I would single out the prevailing concern in this session as the issue of control. On one side of this issue were those who were concerned that they would lose control once PowerPivot is implemented. On the other side were those who believed that data should be freely accessible to users in PowerPivot, and even acknowledgment that users would get the data they want even if it meant they would have to manually enter into a workbook to have it ready for analysis. For another viewpoint on how PowerPivot played out at the conference, see Rob Collie's observations. Collaborative BI I have been intrigued by the notion of collaborative BI for a very long time. Before I discovered BI, I was a Lotus Notes developer and later a manager of developers, working in a software company that enabled collaboration in the legal industry. Not only did I help create collaborative systems for our clients, I created a complete project management from the ground up to collaboratively manage our custom development work. In that case, collaboration involved my team, my client contacts, and me. I was also able to produce my own BI from that system as well, but didn't know that's what I was doing at the time. 
Only in recent years has SharePoint begun to catch up with the capabilities that I had with Lotus Notes more than a decade ago. Eventually, I had the opportunity at that job to formally investigate BI as another product offering for our software, and the rest - as they say - is history. I built my first data warehouse with Scott Cameron (who has also ventured into the authoring world by writing Analysis Services 2008 Step by Step and was at the BI Conference last week where I got to reminisce with him for a bit) and that began a career that I never imagined at the time. Fast forward to 2010, and I'm still lauding the virtues of collaborative BI, if only the tools will catch up to my vision! Thus, I was anxious to see what Donald Farmer (blog | twitter) and Rita Sallam of Gartner had to say on the subject in their session "Collaborative Decision Making." As I suspected, the tools aren't quite there yet, but the vendors are moving in the right direction. One thing I liked about this session was a non-Microsoft perspective of the state of the industry with regard to collaborative BI. In addition, this session included a better demonstration of SharePoint collaborative BI capabilities than appeared in the BI keynote. Check out the video in the link to the session to see the demonstration. One of the use cases that was demonstrated was linking from information to a person, because, as Donald put it, "People don't trust data, they trust people." The Microsoft BI Stack in General A question I hear all the time from students when I'm teaching is how to know what tools to use when there is overlap between products in the BI stack. I've never taken the time to codify my thoughts on the subject, but saw that my friend Dan Bulos provided good insight on this topic from a variety of perspectives in his session, "So Many BI Tools, So Little Time." I thought one of his best points was that ideally you should be able to design in your tool of choice, and then deploy to your tool of choice. Unfortunately, the ideal is yet to become real across the platform. The closest we come is with the RDL in Reporting Services which can be produced from two different tools (Report Builder or Business Intelligence Development Studio's Report Designer), manually, or by a third-party or custom application. I have touted the idea for years (and publicly said so about 5 years ago) that eventually more products would be RDL producers or consumers, but we aren't there yet. Maybe in another 5 years. Another interesting session that covered the BI stack against a backdrop of competitive products was delivered by Andrew Brust. Andrew did a marvelous job of consolidating a lot of information in a way that clearly communicated how various vendors' offerings compared to the Microsoft BI stack. He also made a particularly compelling argument about how the existence of an ecosystem around the Microsoft BI stack provided innovation and opportunities lacking for other vendors. Check out his presentation, "How Does the Microsoft BI Stack...Stack Up?" Expo Hall I had planned to spend more time in the Expo Hall to see who was doing new things with the BI stack, but didn't manage to get very far. Each time I set out on an exploratory mission, I got caught up in some fascinating conversations with one or more of my peers. I find interacting with people that I meet at conferences just as important as attending sessions to learn something new. There were a couple of items that really caught me eye, however, that I'll share here. 
    Pragmatic Works. Whether you develop SSIS packages, build SSAS cubes, or author SSRS reports (or all of the above), you really must take a look at BI Documenter. Brian Knight (twitter) walked me through the key features, and I must say I was impressed. Once you've seen what this product can do, you won't want to document your BI projects any other way. You can download a free single-user database edition, or choose from more feature-rich standard or professional editions.
    Microsoft Press ebooks. I also stopped by the O'Reilly Media booth to meet some folks that one of my acquisitions editors at Microsoft Press recommended. In case you haven't heard, Microsoft Press has partnered with O'Reilly Media for distribution and publishing. Apart from my interest in learning more about O'Reilly Media as an author, an advertisement in their booth caught my eye, which I think is a really great move. When you buy Microsoft Press ebooks through the O'Reilly web site, you can receive them in any (or all) of the following formats where possible: PDF, epub, .mobi for Kindle and .apk for Android. You also have lifetime DRM-free access to the ebooks. As someone who is an avid collector of books, I find myself running out of room for storage. In addition, I travel a lot, and it's hard to lug my reference library with me. Today's e-reader options make the move to digital books a more viable way to grow my library. Having a variety of formats means I am not limited to a single device, and lifetime access means I don't have to worry about keeping track of where I've stored my files. Because the e-books are DRM-free, I can copy and paste when I'm compiling notes, and I can print pages when necessary. That's a winning combination in my mind!
    Overall, I was pleased with the BI conference. There were many more sessions that I couldn't attend, either because the room was full when I got there or there were multiple sessions running concurrently that I wanted to see. Fortunately, many of the sessions are accessible for viewing online at http://www.msteched.com/2010/NorthAmerica along with the TechEd sessions. You can spot the BI sessions by the yellow skyline on the title slide of the presentation.

    Read the article

  • 405 Method Not Allowed Error in WCF

    - by DotnetDude
    Can someone spot the problem with this implementation? I can open it up in the browser and it works, but a call from the client side (using both jQuery and ASP.NET AJAX) fails.
    Service contract: [OperationContract(Name = "GetTestString")] [WebInvoke(Method = "GET", ResponseFormat = WebMessageFormat.Json)] string GetTestString();
    In Web.config, among other bindings, I have a webHttp binding: <endpoint address="ajax" binding="webHttpBinding" contract="TestService" behaviorConfiguration="AjaxBehavior" />
    Endpoint behavior: <endpointBehaviors> <behavior name="AjaxBehavior"> <enableWebScript/> </behavior> </endpointBehaviors> </behaviors>
    Svc file: <%@ ServiceHost Service="TestService" %>
    Client: var serviceUrl = "http://127.0.0.1/Test.svc/ajax/"; var proxy = new ServiceProxy(serviceUrl);
    I am then using the approach in http://www.west-wind.com/weblog/posts/324917.aspx to call the service.
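    For what it's worth, a 405 in this setup is frequently a verb mismatch: the operation is declared with Method = "GET", while both ASP.NET AJAX script proxies and the $.ajax pattern from the linked west-wind article typically issue POSTs. A hedged sketch of the two ways to make the verbs agree (the interface and the second method name are assumptions, not part of the original contract):
        using System.ServiceModel;
        using System.ServiceModel.Web;

        [ServiceContract]
        public interface ITestService
        {
            // Option 1: keep the operation a GET, and make sure the client really issues a
            // GET (for example $.getJSON) instead of a POST.
            [OperationContract(Name = "GetTestString")]
            [WebGet(ResponseFormat = WebMessageFormat.Json)]
            string GetTestString();

            // Option 2: declare the operation as a POST so that clients which post by
            // default stop receiving 405 Method Not Allowed.
            [OperationContract]
            [WebInvoke(Method = "POST", ResponseFormat = WebMessageFormat.Json)]
            string GetTestStringViaPost();
        }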

    Read the article

  • How do I share a WiX fragment in two WiX projects?

    - by Randy Eppinger
    We have a WiX fragment in a file SomeDialog.wxs that prompts the user for some information. It's referenced in another fragment, in the InstallerUI.wxs file, that controls the dialog order. Of course, Product.wxs is our main file. Works great. Now I have a second Visual Studio 2008 WiX 3.0 project for the .MSI of another application, and it needs to ask the user for the same information. I can't seem to figure out the best way to share the file so that changing the information requested will result in both .MSIs getting the new behavior. I honestly can't tell if a merge module, a .wxi (include) or a .wixlib is the right solution. I would have hoped to find a simple example of someone doing this, but I have failed thus far. Edit: Based on Rob Mensching's wixlib blog entry, a wixlib may be the answer, but I am still searching for an example of how to do this.

    Read the article

  • Can't figure out jQuery ajax call parameters

    - by chad larson
    I am learning jQuery and trying the following, but the parameters, with all their embedded quotes, are so foreign to me that I think they are my problem. Can someone explain the parameters and where the quotes go, and possibly rewrite my parameters line? (The URL is a live site, so you can see the required params.) function AirportInfo() { var divToBeWorkedOn = '#detail'; var webMethod = "'http://ws.geonames.org/citiesJSON'"; var parameters = "{'north':'44.1','south':'9.9','east':'22.4','west':'55.2','lang':'de'}"; $.ajax({ type: "POST", url: webMethod, data: parameters, contentType: "application/json; charset=utf-8", dataType: "json", success: function(msg) { alert(msg.d); $(divToBeWorkedOn).html(msg.d); }, error: function(xhr) { alert(xhr); alert(xhr.statusText); alert(xhr.responseText); $(divToBeWorkedOn).html("Unavailable"); } }); }

    Read the article

  • How do I get the intellisense in FoxPro 8 to work with .net COM objects?

    - by IsaacB
    Hi, I'm at my wit's end with this. What I'm doing is making a C# dll file that needs to have some methods exposed to FoxPro 8. This guy here http://www.west-wind.com/presentations/VfpDotNetInterop/DotNetFromVFP.asp says that you can put [ClassInterface(ClassInterfaceType.AutoDual)] in front of the (C# in my case) class, and then intellisense in Foxpro magically works. I'm accessing the COM object fine in FoxPro, but unfortunately the intellisense doesn't work, and it's annoying me. Is there some other step I'm missing to this? Is there some registry entry to look for to confirm that the methods are exposed properly (for intellisense to work)? Are there other steps in Foxpro that I'm supposed to follow (I don't know a thing about FoxPro!) It might be a pretty obscure question these days, but someone on here must know the answer! Thanks
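    As a point of reference, that attribute usually travels with a few other COM-interop attributes, and IntelliSense also needs a registered type library on the FoxPro machine. A hedged C# sketch, with every name in it (namespace, GUID, ProgId, method) made up for illustration:
        using System;
        using System.Runtime.InteropServices;

        namespace MyComLibrary
        {
            [ComVisible(true)]
            [Guid("D6F88E95-8A27-4AE6-B6DE-0542A0FC7039")]   // pick and keep your own GUID
            [ProgId("MyComLibrary.Calculator")]
            [ClassInterface(ClassInterfaceType.AutoDual)]    // exposes an early-bound class interface,
                                                             // which is what IntelliSense reads
            public class Calculator
            {
                public int Add(int a, int b)
                {
                    return a + b;
                }
            }
        }
    The type library is the other half: registering with regasm MyComLibrary.dll /tlb /codebase (or ticking "Register for COM interop" in the project settings) generates and registers the .tlb that IntelliSense reads. On the VFP side, declaring the variable with a typed LOCAL oCalc AS "MyComLibrary.Calculator" is also commonly suggested, though treat that FoxPro step as a suggestion rather than a guarantee.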

    Read the article

  • LINQ query - The method or operation is not implemented.

    - by Dejan.S
    I'm doing the Rob Conery MVC Storefront and at one point I'm supposed to get categories, but I get an error with this LINQ query and I cannot figure out why. It's exactly the same as his. var culturedName = from ct in ReadOnlyContext.CategoryCultureDetails where ct.Culture.LanguageCode == System.Globalization.CultureInfo.CurrentUICulture.TwoLetterISOLanguageName select new { ct.CategoryName, ct.CategoryId }; return from c in ReadOnlyContext.Categories join cn in culturedName on c.CategoryId equals cn.CategoryId select new Core.Model.Category { Id = c.CategoryId, Name = cn.CategoryName, ParentId = c.ParentId ?? 0, Products = new LazyList<Core.Model.Product>(from p in GetProducts() join cp in ReadOnlyContext.Categories_Products on p.Id equals cp.ProductId where cp.CategoryId == c.CategoryId select p) }; It's the return query that messes things up. I have checked, and culturedName actually gets data from the database. Appreciate your help.

    Read the article

  • How to solve a Java menubar, layout and panel problem?

    - by Berkay
    i'm not a java guy however i start implementing some security tools with java, in these days i'm creating a gui for my security tools : Here is the how my Gui looks: menuBar = new JMenuBar(); // construct menu bar // hash functions fileMenu = new JMenu("HASH FUNCTIONS"); // define file menu fileMenu.setMnemonic('H'); // shortcut hashes = new JMenuItem("Md5&Sha1"); // define file menu options hashes.setMnemonic('M'); hashes.addActionListener(this); fileMenu.add(hashes); menuBar.add(fileMenu); // add file menu to menu bar // symmetric encryption asMenu = new JMenu("SYMMETRIC ENCRYPTION"); // define format menu asMenu.setMnemonic('S'); // shortcut desItem = new JMenuItem("DES"); // define format menu options desItem.setMnemonic('D'); desItem.addActionListener(this); asMenu.add(desItem); ... ... menuBar.add(helpMenu); // add help menu to menu bar setJMenuBar(menuBar); // put menu bar on application textColor = Color.RED; when from the Menu desitem is selected, desvar is just for not showing the panel multiple times, it calls Panels () else if(e.getSource() == desItem && desvar ==1 ) { // make other panels unvisible. if(hashvar!=1) MyPanel.setVisible(false);//hash functions if(rsavar!=1) MyPanel3.setVisible(false);//rsa function if (dhvar!=1) MyPanel4.setVisible(false);//dh diffie hellman hashvar=1; rsavar=1; dhvar=1; ++desvar; desPanel=true; Panels(); } and in Panels() Method: if (hashPanel){ MyPanel.add("West",radioSHA1); MyPanel.add("West",radioMD5); MyPanel.add("Center", inputField); MyPanel.add("East",SubmitButton); MyPanel.add("South",resultHash); add(MyPanel); validate(); hashPanel=false; } i have many panels for example : -hash functions=mypanel1 , des=mypanel2, rsa functions= mypanel3 , dh= mypanel4 However in may other panals such as rsa function: i have to use some layout properties of the java:in this panel i selected to use gridbaglayout if (dhPanel){ System.out.println("rsa panel burda misin"); MyPanel3.add(pqLabel); MyPanel3.add(pLabel); MyPanel3.add(pTextArea); MyPanel3.add(qLabel); ... ... add(MyPanel3); generate_pqButton.addActionListener(this); calculate_nButton.addActionListener(this); ... ... GridBagLayout layout = new GridBagLayout(); GridBagConstraints gbc = new GridBagConstraints(); setLayout(layout); // x, y, w, h, wx, wy gbc.fill = GridBagConstraints.NONE; add (bit_length_label, gbc, 0, 0, 1, 1, 0, 10); add (p, gbc, 0, 1, 1, 1, 0, 10); add (g, gbc, 0, 2, 1, 1, 0, 10); add (a, gbc, 1, 3, 1, 1, 100, 10); add (x, gbc, 0, 4, 1, 1, 0, 10); add (gx, gbc, 0, 5, 1, 1, 0, 10); add (gxy, gbc, 0, 6, 1, 1, 0, 10); add (b, gbc, 1, 7, 1, 1, 100, 10); add (y, gbc, 0, 8, 1, 1, 0, 10); add (gy, gbc, 0, 9, 1, 1, 0, 10); add (gyx, gbc, 0, 10, 1, 1, 0, 10); add (sk, gbc, 1, 11, 1, 1, 100, 10); add (key, gbc, 0, 12, 1, 1, 0, 10); add (status, gbc, 1, 13, 1, 1, 100, 10); add (dhstart, gbc, 1, 14, 1, 1, 100, 10); add (bit_length_value, gbc, 1, 0, 1, 1, 100, 10); add (p_value, gbc, 1, 1, 1, 1, 100, 10); add (g_value, gbc, 1, 2, 1, 1, 100, 10); add (x_value, gbc, 1, 4, 1, 1, 100, 10); add (gx_value, gbc, 1, 5, 1, 1, 100, 10); add (gxy_value, gbc, 1, 6, 1, 1, 100, 10); add (y_value, gbc, 1, 8, 1, 1, 100, 10); add (gy_value, gbc, 1, 9, 1, 1, 100, 10); add (gyx_value, gbc, 1, 10, 1, 1, 100, 10); validate(); repaint(); rsaPanel=false; } Everthing seems okey however when i swith from one menu to another sometimes components are seen or apper in wrong places or mixed. where i'm doing wrong?

    Read the article

  • Debugging ASP.NET with Firefox and Visual Studio .NET - very slow compared to IE

    - by john
    Debugging ASP.NET websites/web projects in Visual Studio .NET 2005 with Firefox is loads slower than using IE. I've read somewhere that there is a way of fixing this, but I can't for the life of me find it again. Does anyone know what I'm on about and can point me in the right direction, please? Cheers, John.
    Edit: Sorry Rob, I haven't explained myself very well (again). I prefer Firefox for debugging (Firebug, etc.). Hitting F5 when debugging with IE, the browser launches really quickly, clicking around my web application is almost instant, and when a breakpoint is hit I get to my code straight away with no delays. Hitting F5 when debugging with Firefox, the browser launches really slowly (OK, I have plugins that slow Firefox's loading), but clicking around my web application is really, really slow, and when a breakpoint is hit it takes ages to break into the code. I swear I've read somewhere that there is a setting in Firefox (about:config maybe?) that, when changed to some magic value, sorts all this out.

    Read the article

  • Convert IEnumerable<dynamic> to JsonArray

    - by Burt
    I am selecting an IEnumerable<dynamic> from the database using Rob Conery's Massive framework. The structure comes back in a flat format Poco C#. I need to transform the data and output it to a Json array (format show at bottom). I thought I could do the transform using linq (my unsuccessful effort is shown below): using System.Collections.Generic; using System.Json; using System.Linq; using System.ServiceModel.Web; .... IEnumerable<dynamic> list = _repository.All("", "", 0).ToList(); JsonArray returnValue = from item in list select new JsonObject() { Name = item.Test, Data = new dyamic(){...}... }; Here is the Json I am trying to generate: [ { "id": "1", "title": "Data Title", "data": [ { "column1 name": "the value", "column2 name": "the value", "column3 name": "", "column4 name": "the value" } ] }, { "id": "2", "title": "Data Title", "data": [ { "column1 name": "the value", "column2 name": "the value", "column3 name": "the value", "column4 name": "the value" } ] } ]
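    Since the target shape is an array of objects that each carry a nested data array, one approach that may be simpler than a single LINQ projection is to build the System.Json types in a plain loop. The sketch below assumes the flat Massive rows expose members such as Id, Title, Column1 and Column2; all of those names are placeholders for the real columns:
        using System.Collections.Generic;
        using System.Json;

        public static class JsonShaper
        {
            public static JsonArray ToJsonArray(IEnumerable<dynamic> rows)
            {
                var result = new JsonArray();

                foreach (var row in rows)
                {
                    // One flat row becomes the single entry of the nested "data" array.
                    var data = new JsonObject();
                    data["column1 name"] = new JsonPrimitive((string)(row.Column1 ?? ""));
                    data["column2 name"] = new JsonPrimitive((string)(row.Column2 ?? ""));
                    // ...one line per column you want to expose...

                    var item = new JsonObject();
                    item["id"] = new JsonPrimitive((string)row.Id.ToString());
                    item["title"] = new JsonPrimitive((string)row.Title);
                    item["data"] = new JsonArray(data);

                    result.Add(item);
                }

                return result;
            }
        }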

    Read the article

  • LinkButton + showModalDialog not working

    - by CPM
    What would be the best way to open a window.showModalDialog from a LinkButton when updating a form? I have a LinkButton on a form that allows me to update the data, and I want to check whether the data was updated (in this case, whether the client's status parameter is active); if so, I want to open a window to fill in some more information. Public Function OpenWindowRquest(ByVal URL As String) As String If accountMode = "1" Then Return "window.showModalDialog('" & Page.ResolveUrl(Server.UrlEncode(URL)) & "', window,'resizable:yes; scroll:on; status:yes; dialogWidth:750px; dialogHeight:350px; center:yes');" Else accountMode = "" Return "" End If On the .aspx side I have <asp:LinkButton id="UpdateButton" runat="server" commandName="Update" Text="Update" OnClientClick='<%# OpenWindowRequest("myurl.aspx") %>'></asp:LinkButton> I also tried to call the function OpenWindowRequest on the FormUpdating event, but it doesn't work; the window is not opened.

    Read the article

  • Fortran recursion segmentation faults

    - by ConnorG
    Hey all - I have to design and implement a Fortran routine to determine the size of clusters on a square lattice, and it seemed extremely convenient to code the subroutine recursively. However, whenever my lattice size grows beyond a certain value (around 200/side), the subroutine consistently segfaults. Here's my cluster-detection routine: RECURSIVE SUBROUTINE growCluster(lattice, adj, idx, area) INTEGER, INTENT(INOUT) :: lattice(:), area INTEGER, INTENT(IN) :: adj(:,:), idx lattice(idx) = -1 area = area + 1 IF (lattice(adj(1,idx)).GT.0) & CALL growCluster(lattice,adj,adj(1,idx),area) IF (lattice(adj(2,idx)).GT.0) & CALL growCluster(lattice,adj,adj(2,idx),area) IF (lattice(adj(3,idx)).GT.0) & CALL growCluster(lattice,adj,adj(3,idx),area) IF (lattice(adj(4,idx)).GT.0) & CALL growCluster(lattice,adj,adj(4,idx),area) END SUBROUTINE growCluster where adj(1,n) represents the north neighbor of site n, adj(2,n) represents the west and so on. What would cause the erratic segfault behavior? Is the cluster just "too huge" for large lattice sizes?
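    The size threshold is the tell-tale sign of a stack overflow rather than a logic error: every call to growCluster adds a stack frame, so once a single cluster spans tens of thousands of sites the recursion depth blows past the thread's stack limit and the run dies with a segfault. Raising the limit (ulimit -s) only postpones the problem; the durable fix is to replace the recursion with an explicit work stack. A sketch of that idea follows, written in C# purely for illustration, with the lattice/adjacency layout assumed to mirror the Fortran arrays:
        using System.Collections.Generic;

        public static class ClusterGrower
        {
            // lattice[i] > 0 means occupied and not yet visited; -1 marks visited sites.
            // adj[d, i] is the index of site i's neighbour in direction d (0..3).
            public static int GrowCluster(int[] lattice, int[,] adj, int start)
            {
                int area = 0;
                var pending = new Stack<int>();
                pending.Push(start);

                while (pending.Count > 0)
                {
                    int site = pending.Pop();
                    if (lattice[site] <= 0)
                        continue;               // already visited (or empty)

                    lattice[site] = -1;         // mark as belonging to the cluster
                    area++;

                    for (int d = 0; d < 4; d++)
                    {
                        int neighbour = adj[d, site];
                        if (lattice[neighbour] > 0)
                            pending.Push(neighbour);
                    }
                }

                return area;
            }
        }
    The same transformation works directly in Fortran with an integer work array used as a stack, so the cluster size is then bounded only by heap memory rather than by the call stack.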

    Read the article

  • Mutex names - best practice?

    - by Argalatyr
    Related to this question, what is the best practice for naming a mutex? I realize this may vary with OS and even with version (especially for Windows), so please specify the platform when answering. My interest is in Win XP and Vista. EDIT: I am motivated by curiosity, because in Rob Kennedy's comment under his (excellent) answer to the above-linked question, he implied that the choice of mutex name is non-trivial and should be the subject of a separate question. EDIT2: The referenced question's goal was to ensure only a single instance of an app is running.
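
    A convention that comes up often for the single-instance case (a sketch, not taken from the linked question; the GUID below is made up) is a fixed, application-specific GUID in the name, prefixed with "Global\" when the mutex should be visible across Terminal Services / fast-user-switching sessions and "Local\" (the default) when per-session is enough:

        // Minimal single-instance check with a named mutex.
        // The GUID is a placeholder; each application should use its own.
        using System;
        using System.Threading;

        class SingleInstanceGuard
        {
            private const string MutexName =
                @"Global\MyApp-{7F4C2A1E-3B9D-4E6A-9C2F-0123456789AB}";

            static void Main()
            {
                bool createdNew;
                using (Mutex mutex = new Mutex(true, MutexName, out createdNew))
                {
                    if (!createdNew)
                    {
                        Console.WriteLine("Another instance is already running.");
                        return;
                    }

                    Console.WriteLine("Running as the only instance.");
                    // ... application work ...

                    mutex.ReleaseMutex();
                }
            }
        }

    A GUID keeps the name from colliding with another vendor's mutex, which is the main hazard with short human-readable names.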

    Read the article

  • Left Join in Subsonic 3

    - by user303187
    I'm trying to do a left join in SubSonic 3 using LINQ, but it doesn't seem to work; I get a big error.

        var post = from p in Post.All()
                   join q in Quote.All() on p.ID equals q.PostID into pq
                   where p.ID == id.Value
                   from qt in pq.DefaultIfEmpty()
                   select new { p, qt };

    I'm using SubSonic 3, the latest Git version from Rob, but I'm getting an error (see below) when I try a left join. I have searched but didn't find any solution. Can anyone explain why I get the error and how to fix it? Thanks.

        Expression of type 'System.Collections.Generic.IEnumerable`1[GetAQuote.Post]' cannot be used for
        parameter of type 'System.Linq.IQueryable`1[GetAQuote.Post]' of method
        'System.Linq.IQueryable`1[<>f__AnonymousType2`2[GetAQuote.Post,System.Collections.Generic.IEnumerable`1[GetAQuote.Quote]]]
        GroupJoin[Post,Quote,Int32,<>f__AnonymousType2`2](System.Linq.IQueryable`1[GetAQuote.Post],
        System.Collections.Generic.IEnumerable`1[GetAQuote.Quote],
        System.Linq.Expressions.Expression`1[System.Func`2[GetAQuote.Post,System.Int32]],
        System.Linq.Expressions.Expression`1[System.Func`2[GetAQuote.Quote,System.Int32]],
        System.Linq.Expressions.Expression`1[System.Func`3[GetAQuote.Post,System.Collections.Generic.IEnumerable`1[GetAQuote.Quote],<>f__AnonymousType2`2[GetAQuote.Post,System.Collections.Generic.IEnumerable`1[GetAQuote.Quote]]]])'
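
    The message suggests SubSonic's provider is being handed a GroupJoin it can't translate. A workaround some people fall back on (a sketch, not an official SubSonic fix; it trades a second query and in-memory work for the translation problem) is to materialise both sides and let LINQ to Objects do the left join:

        // Hypothetical workaround: run the join in memory instead of through
        // SubSonic's LINQ provider. Fine for small Quote tables; otherwise
        // filter the quotes by PostID first.
        var posts  = Post.All().Where(p => p.ID == id.Value).ToList();
        var quotes = Quote.All().ToList();

        var post = from p in posts
                   join q in quotes on p.ID equals q.PostID into pq
                   from qt in pq.DefaultIfEmpty()   // qt is null when the post has no quotes
                   select new { Post = p, Quote = qt };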

    Read the article

  • Groovy: How to access objects with Id tag?

    - by skifan0
    Hello, I have the following Groovy + SwingBuilder code. In one panel I generate checkboxes, and in another panel I want to access the values of those checkboxes. The code looks basically like this:

        def optionsmap = [ foo : "value_foo", bar : "value_bar"]

        SwingBuilder.build() {
            frame(title: 'demo1', size: [400, 700], visible: true,
                  defaultCloseOperation: WC.EXIT_ON_CLOSE) {
                gridLayout(rows: 1, cols: 2)
                panel(id: 'optionPanel', constraints: java.awt.BorderLayout.CENTER) {
                    gridLayout(rows: 5, cols: 1)
                    myGroup = buttonGroup();
                    for (entry in optionsmap) {
                        checkBox(id: entry.key, text: entry.value)
                    }
                }
                panel(constraints: java.awt.BorderLayout.WEST) {
                    button('Show values', actionPerformed: {
                        for (entry in optionsmap) {
                            println (entry.key as Class).text
                        }
                    })
                }
            }
        }

    optionsmap is a map of (id, text) pairs that can be extended. When I press "Show values" I get an error message:

        org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object 'foo' with class 'java.lang.String' to class 'java.lang.Class'

    How could I access the checkboxes in the second panel's action by using the checkbox ids from optionsmap? Thank you if you can help.

    Read the article

  • linq to sql datacontext for a web application

    - by rap-uvic
    Hello, I'm trying to use LINQ to SQL for my project (very short deadline), and I'm kind of in a bind. I don't know the best way to have a DataContext handy per request thread. I want something like a singleton class from which all my repository classes can access the current DataContext. However, a singleton class is static and not thread-safe, and therefore not suitable for web apps. I want something that would create a DataContext at the beginning of the request and dispose of it along with the request. Can anyone please share their solution to this problem? I've been searching for a solution and I've found a good post from Rick Strahl: http://www.west-wind.com/weblog/posts/246222.aspx but I don't completely understand his thread-safe or business object approach. If somebody has a simplified version of his thread-safe approach, I'd love to take a look.
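
    One simplified shape of the per-request idea (a sketch, not Rick Strahl's code; MyDataContext stands in for the generated LINQ to SQL context) is to cache the context in HttpContext.Current.Items, which already lives and dies with the request and is never shared across request threads:

        // Hypothetical per-request DataContext accessor.
        using System.Web;

        public static class DataContextFactory
        {
            private const string Key = "__RequestDataContext";

            public static MyDataContext Current
            {
                get
                {
                    var context = HttpContext.Current;
                    var dc = context.Items[Key] as MyDataContext;
                    if (dc == null)
                    {
                        dc = new MyDataContext();
                        context.Items[Key] = dc;   // lives only for this request
                    }
                    return dc;
                }
            }

            // Call from Application_EndRequest in Global.asax.
            public static void DisposeCurrent()
            {
                var dc = HttpContext.Current.Items[Key] as MyDataContext;
                if (dc != null)
                {
                    dc.Dispose();
                    HttpContext.Current.Items.Remove(Key);
                }
            }
        }

    Repositories then read DataContextFactory.Current instead of newing up their own contexts, and Application_EndRequest calls DataContextFactory.DisposeCurrent().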

    Read the article

  • What are appropriate ways to represent relationships between people in a database table?

    - by Emilio
    I've got a table of people - an ID primary key and a name. In my application, people can have 0 or more real-world relationships with other people, so Jack might "work for" Jane and Tom might "replace" Tony and Bob might "be an employee of" Rob and Bob might also "be married to" Mary. What's the best way to represent this in the database? A many to many intersect table? A series of self joins? A relationship table with one row per relationship pair and type, where I insert records for the relationship in both directions?

    Read the article

  • Basic doubt about sensor usage

    - by Al
    Suppose I have a cellphone with an accelerometer and a magnetometer, and I want to determine its absolute 3D orientation (with respect to North/East/South/West). Imagine the phone is held vertically, with the screen facing me and the "up" vector pointing to the ceiling. Whenever I tilt it, the accelerometer lets me track how the "up" vector changes. The problem is that if I tilt the device until it is horizontal (screen now facing the ceiling, "up" vector pointing away from me), the up vector no longer changes when I rotate the phone flat on the table. That rotation is clearly something the magnetometer detects. So the question is: how do I know when to use the accelerometer and when to use the magnetometer in each case? Is there a generic way to handle this?
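
    A common way to avoid choosing between the two sensors (a sketch with hand-rolled vector math, not tied to any particular phone API) is to fuse them: the accelerometer's gravity reading pins down the up direction, the magnetometer pins down roughly north, and two cross products yield a full East/North/Up frame that stays valid in any orientation:

        // Hypothetical fusion of accelerometer (gravity) and magnetometer readings
        // into an East/North/Up frame using only cross products.
        using System;

        struct Vec3
        {
            public double X, Y, Z;
            public Vec3(double x, double y, double z) { X = x; Y = y; Z = z; }

            public static Vec3 Cross(Vec3 a, Vec3 b) =>
                new Vec3(a.Y * b.Z - a.Z * b.Y,
                         a.Z * b.X - a.X * b.Z,
                         a.X * b.Y - a.Y * b.X);

            public Vec3 Normalized()
            {
                double len = Math.Sqrt(X * X + Y * Y + Z * Z);
                return new Vec3(X / len, Y / len, Z / len);
            }
        }

        static class OrientationFrame
        {
            // gravity:  accelerometer reading at rest (points away from the earth)
            // magnetic: raw magnetometer reading (points roughly north, dipping up or down)
            public static void Compute(Vec3 gravity, Vec3 magnetic,
                                       out Vec3 east, out Vec3 north, out Vec3 up)
            {
                up    = gravity.Normalized();
                east  = Vec3.Cross(magnetic, up).Normalized(); // perpendicular to both
                north = Vec3.Cross(up, east);                  // completes the right-handed frame
            }
        }

    Projecting the device axes onto that frame then gives heading, pitch and roll in any position, which is roughly what the built-in orientation APIs do internally.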

    Read the article

  • Why doesn't this Perl array sort work?

    - by Luke
    Why won't the array sort?

    CODE

        my @data = ('PJ RER Apts to Share|PROVIDENCE',
                    'PJ RER Apts to Share|JOHNSTON',
                    'PJ RER Apts to Share|JOHNSTON',
                    'PJ RER Apts to Share|JOHNSTON',
                    'PJ RER Condo|WEST WARWICK',
                    'PJ RER Condo|WARWICK');

        foreach my $line (@data) {
            $count = @data;
            chomp($line);
            @fields = split(/\|/, $line);
            if ($fields[0] eq "PJ RER Apts to Share") {
                @city = "\u\L$fields[1]";
                @city_sort = sort (@city);
                print "@city_sort", "\n";
            }
        }
        print "$count", "\n";

    OUTPUT

        Providence
        Johnston
        Johnston
        Johnston
        6

    Read the article

  • C# Silverlight - XmlDictionary from Uri

    - by Robert White
    I've been developing a Silverlight application for a company's website and have encountered a problem. Up until now I have been programming this locally; now I need to publish the program onto the website, and the issue is that FileStream can only access local files with elevated permissions. Here's a snippet of the code:

        using (FileStream fileStream = new FileStream(@"E:\Users\LUPUS\Documents\Visual Studio 2010\Projects\Lycaon5\Lycaon5\acids.xdb", FileMode.Open))
        {
            using (XmlDictionaryReader reader = XmlDictionaryReader.CreateTextReader(fileStream, XmlDictionaryReaderQuotas.Max))
            {
                // Read the XML file out.
            }
        }

    Without changing anything to do with the XmlDictionaryReader, how could I go about reading the file from a relative Uri? Many thanks, Rob. P.S. Apologies for the lack of formatting - me cave man, me don't know how.
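
    One relative-URI route (a sketch assuming acids.xdb is added to the Silverlight project with Build Action "Content" or "Resource" so it ships inside the .xap; the file name is the poster's) keeps the XmlDictionaryReader exactly as it is and only swaps the FileStream for Application.GetResourceStream:

        // Hypothetical helper: open the packaged acids.xdb by relative URI
        // instead of an absolute local path.
        using System;
        using System.Windows;
        using System.Windows.Resources;
        using System.Xml;

        private XmlDictionaryReader OpenAcidsReader()
        {
            StreamResourceInfo info = Application.GetResourceStream(
                new Uri("acids.xdb", UriKind.Relative));

            // Caller disposes the reader (and may dispose info.Stream) when done.
            return XmlDictionaryReader.CreateTextReader(info.Stream, XmlDictionaryReaderQuotas.Max);
        }

    If the file has to live on the web server rather than inside the .xap, the usual Silverlight alternative is WebClient/OpenReadAsync against a site-of-origin URI, since synchronous file access to the server isn't available.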

    Read the article

  • How can I create and manage a multi-tenant ASP MVC application

    - by Wizzarding
    Hi, I want to create a multi-tenant application that uses the hostname to determine the customer. For example:

        CustomerOne.myapp.com
        AnotherCo.myapp.com
        AndOneMore.myapp.com
        ...

    I can do the database and security side with no problems, and I can also get the hostname from the URL, but what I am struggling to find out is how to create the basic plumbing that would allow a new customer to sign up online, provide their company name, and have the application create the new URL, ready to be used straight away. Can anyone help? Thanks, Rob.
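
    One common shape for that plumbing (a sketch, not a full multi-tenant framework; TenantRepository is a made-up placeholder) is wildcard DNS, a single *.myapp.com record pointing every subdomain at the one web application, so "creating the new URL" is nothing more than inserting a tenant row at sign-up, plus a base controller that resolves the tenant from the host header on every request:

        // Hypothetical tenant resolution from the host name in ASP.NET MVC.
        using System.Web.Mvc;

        public abstract class TenantController : Controller
        {
            protected string TenantName { get; private set; }

            protected override void OnActionExecuting(ActionExecutingContext filterContext)
            {
                // e.g. "customerone.myapp.com" -> "customerone"
                string host = filterContext.HttpContext.Request.Url.Host;
                string[] parts = host.Split('.');
                TenantName = parts.Length > 2 ? parts[0] : "www";

                // Look TenantName up in the tenants table here (TenantRepository,
                // or whatever data access the app uses) and reject or redirect
                // unknown subdomains.

                base.OnActionExecuting(filterContext);
            }
        }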

    Read the article
