Search Results

Search found 1555 results on 63 pages for 'scott a'.

Page 11/63

  • What are these stray zero-byte files extracted from tarball? (OSX)

    - by Scott M
    I'm extracting a folder from a tarball, and I see zero-byte files showing up in the result that are not present in the source. Setup (all on OS X): on machine one, I have a directory /My/Stuff/Goes/Here/ containing several hundred files, and I build the tarball like this: tar -cZf mystuff.tgz /My/Stuff/Goes/Here/. On machine two, I scp the tgz file to my local directory, then unpack it with tar -xZf mystuff.tgz. It creates ~scott/My/Stuff/Goes/, but under Goes I see two entries: Here/ (a directory) and Here.bGd (a zero-byte file). The Here.bGd zero-byte file has a random three-character suffix of mixed upper- and lower-case characters, has the same name as the lowest-level directory named in the tar-creation command, and only appears at that lowest level. Does anybody know where these come from, and how I can adjust my tar creation to get rid of them? Update: I checked the table of contents with tar tZvf; it does not list the zero-byte files, so I'm leaning toward the suggestion that the machine doing the unpacking is at fault. That machine runs OS X 10.5.5 (not sure how to check the filesystem type), and its tar is GNU tar 1.15.1, which came with the machine.

    Read the article

  • Select Union Query problem

    - by Krishma
    I have two tables:

    Table A
    id    name
    ------------
    1     Scott
    2     Dan
    3     Sam

    Table B
    id    name
    ------------
    1     Dan
    2     Andi
    3     Jess

    The result needs to be:

    Id    Name     Found
    1     Scott    A
    2     Dan      C    (i.e. found in both)
    3     Sam      A
    2     Andi     B
    3     Jess     B

    I am able to do a UNION to fetch the rows, but how do I generate the Found column? Any ideas? Thank you in advance :)
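
    One way to produce that Found flag in a single statement (a sketch only, assuming a database that supports FULL OUTER JOIN, and using hypothetical table names TableA and TableB since the post only calls them "Table A" and "Table B") is to join the two tables on name and derive the flag from which side matched:

        SELECT COALESCE(a.id, b.id)     AS Id,
               COALESCE(a.name, b.name) AS Name,
               CASE
                   WHEN a.name IS NOT NULL AND b.name IS NOT NULL THEN 'C'  -- found in both
                   WHEN a.name IS NOT NULL                        THEN 'A'  -- only in Table A
                   ELSE                                                'B'  -- only in Table B
               END AS Found
        FROM TableA a
        FULL OUTER JOIN TableB b
            ON a.name = b.name;

    COALESCE prefers Table A's id when a name exists in both tables, which matches the sample output (Dan keeps id 2). On engines without FULL OUTER JOIN, the same result can be built from a UNION ALL of the two tables grouped by name.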

    Read the article

  • Maze Navigation in Player Stage with Roomba

    - by Scott
    Here is my code:

        /* Scott Landau Robot Lab Assignment 1 */
        // Standard Java Libs
        import java.io.*;
        // Player/Stage Libs
        import javaclient2.*;
        import javaclient2.structures.*;
        import javaclient2.structures.sonar.*;

        // Begin
        public class SpinningRobot {
            public static Position2DInterface pos = null;
            public static LaserInterface laser = null;

            public static void main(String[] args) {
                PlayerClient robot = new PlayerClient("localhost", 6665);
                laser = robot.requestInterfaceLaser(0, PlayerConstants.PLAYER_OPEN_MODE);
                pos = robot.requestInterfacePosition2D(0, PlayerConstants.PLAYER_OPEN_MODE);
                robot.runThreaded(-1, -1);
                pos.setSpeed(0.5f, -0.25f); // end pos

                float x, y;
                x = 46.0f;
                y = -46.0f;
                boolean done = false;
                while (!done) {
                    if (laser.isDataReady()) {
                        float[] laser_data = laser.getData().getRanges();
                        System.out.println("== IR Sensor ==");
                        System.out.println("Left Wall Distance: " + laser_data[360]);
                        System.out.println("Right Wall Distance: " + laser_data[0]);
                        // if laser doesn't reach left wall, move to detect it
                        // so we can guide using left wall
                        if (laser_data[360] < 0.6f) {
                            while (laser_data[360] < 0.6f) {
                                pos.setSpeed(0.5f, -0.5f);
                            }
                        } else if (laser_data[0] < 0.6f) {
                            while (laser_data[0] < 0.6f) {
                                pos.setSpeed(0.5f, 0.5f);
                            }
                        }
                        pos.setSpeed(0.5f, -0.25f); // end pos?
                        done = ((pos.getX() == x) && (pos.getY() == y));
                    }
                }
            }
        } // End

    I was trying to have the Roomba go continuously at a slight right curve, quickly turning away from each wall it came too close to if it recognized it with its laser. I can only use laser_data[360] and laser_data[0] for this one robot. I think this would eventually navigate the maze. However, I am using the Player/Stage platform, and Stage freezes when the Roomba comes close to a wall while running this code, and I have no idea why. Also, if you can think of a better maze navigation algorithm, please let me know. Thank you!

    Read the article

  • How to eager load sibling data using LINQ to SQL?

    - by Scott
    The goal is to issue the fewest queries to SQL Server using LINQ to SQL, without using anonymous types. The return type for the method will need to be IList<Child1>. The relationships (Parent, Child1, Child2, Grandchild1) are as follows:

    Parent -> Child1 is a one-to-many relationship
    Child1 -> Grandchild1 is a one-to-n relationship (where n is zero to infinity)
    Parent -> Child2 is a one-to-n relationship (where n is zero to infinity)

    I am able to eager load the Parent, Child1 and Grandchild1 data, resulting in one query to SQL Server. This query with load options eager loads all of the data except the sibling data (Child2):

        DataLoadOptions loadOptions = new DataLoadOptions();
        loadOptions.LoadWith<Child1>(o => o.GrandChild1List);
        loadOptions.LoadWith<Child1>(o => o.Parent);
        dataContext.LoadOptions = loadOptions;

        IQueryable<Child1> children = from child in dataContext.Child1
                                      select child;

    I need to load the sibling data as well. One approach I have tried is splitting the query into two LINQ to SQL queries and merging the result sets together (not pretty), however upon accessing the sibling data it is lazy loaded anyway. Adding the sibling load option will issue a query to SQL Server for each Grandchild1 and Child2 record (which is exactly what I am trying to avoid):

        DataLoadOptions loadOptions = new DataLoadOptions();
        loadOptions.LoadWith<Child1>(o => o.GrandChild1List);
        loadOptions.LoadWith<Child1>(o => o.Parent);
        loadOptions.LoadWith<Parent>(o => o.Child2List);
        dataContext.LoadOptions = loadOptions;

        IQueryable<Child1> children = from child in dataContext.Child1
                                      select child;

        exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=1
        exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=2
        exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=3
        exec sp_executesql N'SELECT * FROM [dbo].[Child2] AS [t0] WHERE [t0].[ForeignKeyToParent] = @p0',N'@p0 int',@p0=4

    I've also written LINQ to SQL queries that join in all of the data in the hope that it would eager load everything, however when the LINQ to SQL EntitySet of Child2 or Grandchild1 is accessed it lazy loads the data. The reason for returning IList<Child1> is to hydrate business objects. My thoughts are that I am either:

    - approaching this problem the wrong way,
    - left with the option of calling a stored procedure, or
    - right that my organization should not be using LINQ to SQL as an ORM.

    Any help is greatly appreciated. Thank you, -Scott
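
    One pattern that keeps this at two round trips rather than one per parent (a sketch only, untested, using the names visible in the post: Child1, Child2, Parent, GrandChild1List and the ForeignKeyToParent column from the generated SQL, and assuming the parent key property is called Id): eager load Child1 with its Parent and grandchildren as above, then pull every relevant Child2 row in one extra query and stitch the results together with a lookup while hydrating the business objects.

        DataLoadOptions loadOptions = new DataLoadOptions();
        loadOptions.LoadWith<Child1>(o => o.GrandChild1List);
        loadOptions.LoadWith<Child1>(o => o.Parent);
        dataContext.LoadOptions = loadOptions;

        // Query 1: Child1 + Parent + GrandChild1List in a single round trip.
        IList<Child1> children = dataContext.Child1.ToList();

        // Query 2: all Child2 rows for the parents just loaded, in a single round trip
        // (Contains translates to an IN clause).
        var parentIds = children.Select(c => c.Parent.Id).Distinct().ToList();
        var child2ByParent = dataContext.Child2
            .Where(c2 => parentIds.Contains(c2.ForeignKeyToParent))
            .ToLookup(c2 => c2.ForeignKeyToParent);

    While hydrating the business objects, the siblings for a given parent then come from child2ByParent[parentId] instead of parent.Child2List, which is what would otherwise trigger the per-parent lazy loads shown in the trace.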

    Read the article

  • adding users, membership, and roles to site

    - by Alexander
    I have followed Scott Gu's tutorial here and uploaded the whole database to my site. Before doing what Scott describes I had one username stored in the membership tables. How can I create an additional user now that the tables are on the web host? I can see that there are tables such as aspnet_Membership, aspnet_Applications, etc.

    Read the article

  • Creating directory?

    - by Vineet
    When I am creating a directory using the SYSTEM user, with create directory emp_dir1 AS "C:\Documents and Settings\Administrator\Desktop\vin.txt"; (vin.txt is my file), it creates it. When I do the same using the user Scott, it gives an error for the file path: "identifier is too long". But when I put this file path in single quotes instead of double quotes for Scott, it creates it. What is the reason behind this?
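
    For reference, a minimal sketch of the usual form of the statement (the path comes from the question; the grant is an assumption about what the poster may want). In Oracle SQL, single quotes delimit a string literal such as a path, while double quotes delimit an identifier; a long double-quoted path is therefore treated as an identifier, which matches the reported ORA-00972 "identifier is too long" error. Note also that a directory object normally points at a folder rather than a single file:

        -- Path as a single-quoted string literal (run as a user with CREATE ANY DIRECTORY):
        CREATE OR REPLACE DIRECTORY emp_dir1
          AS 'C:\Documents and Settings\Administrator\Desktop';

        -- Optionally allow another schema to use it:
        GRANT READ, WRITE ON DIRECTORY emp_dir1 TO scott;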

    Read the article

  • MIX 2010 Covert Operations Day 2 Silverlight + Windows 7 Phone

    - by GeekAgilistMercenary
    Left the Circus Circus and headed to the geek circus at Mandalay Bay.  Got in, got some breakfast, met a few more people and headed to the keynote. Upon arriving the crew I was hanging with at the event; Erik Mork, Beth Murray, and Brian Henderson and I were entertained with several other thousand geeks by the wicked yo-yoing. The first video demo of something was of Bing Maps and various aspects of Microsoft Research integrated together.  Namely the pictures, put in place, on real 3d element maps of various environments. Silverlight Scott Guthrie, as one would guess, kicked off the keynote.  His first point was that user experience has become a priority at Microsoft.  This can be seen by any observant soul with the release and push of Expression, Silverlight, and the other tools.  This is even more apparent when one takes note of Microsoft bringing in people that can actually do good design and putting them at the forefront. The next thing Scott brought up was a few key points about Silverlight.  Currently Silverlight is a little over 2 years old and has achieved a pretty solid 60% penetration.  Silverlight has all sorts of capabilities that have been developed and are now provided as open source including;  ad injection, smoothing, playback editing, and more.  Another thing he showed, which really struck me as awesome being in the analytics space, was the Olympics and a quick glimpse of the ad statistics, viewer experience, video playback performance, audience trends, and overall viewer participation.  All of it rendered in Silverlight in beautiful detail. The key piece of Scott's various points were all punctuated with the fact that all of this code is available as open source.  Not only is Microsoft really delving into this design element of things, they're getting involved in the right ways. One of the last points I'll bring up about Silverlight 4 is the ability to have HD video on a monitor, and an entirely different activity being done on the other monitor, effectively making Silverlight the only RIA framework that supports multi-monitor support.  Overall, Silverlight is continuing to impress – providing superior capabilities tit-for-tat with the competition. Windows 7 Phone The Windows 7 Phone has 3 primary buttons (yes, more than the iPhone, don't let your mind explode!!).  Start, Search, and Back control all of the needed functionality of the phone.  At the same time, of course, there is the multi-touch, touch, and other interactive abilities of the interface.  The intent, once start is pressed is to have all the information that a phone owner wants displayed immediately.  Avoiding the scrolling through pages of apps or rolling a ball to get through multitudes of other non-interactive phone interfaces.  The Windows 7 Phone simply has the data right in front of you, basically a phone dashboard.  From there it is easy to dive into the interactive areas of the phone. Each area of the interface of the phone is broken into hubs.  These hubs include applications, data, and other things based on a relative basis.  This basis being determined by the user.  These applications interact on many other levels, and form a kind of relationship between each other adding more and more meta-data to the phone user, their interactions between the applications, and of course the social element of their interactions on the phone.  This makes this phone a practical must have for a marketer involved in social media.  
The level of wired together interaction is massive, and of course, if you've seen Office Outlook 2010 you know that the power that is pulled into the phone by being tied to Outlook is massive. Joe Belfiore also showed several UI & specifically UX elements of the phone interface that allows paging to be instinctual by simple clipped items, flipping page to page, and other excellent user experience advances for phone devices.  Belfiore's also showed how his people hub had a massive list of people, with pictures, all from various different social networks and other associated relations.  The rendering, speed, and viewing of these people's, their pictures, their social network information, and other characteristics was smooth and in some situations unbelievably rendered.  This demo showed some of the great power of the beta phone, which isn't even as powerful as the planned end device. Joe finished up by jumping into the music, videos, and other media with the Zune Component of the Windows 7 Mobile Phone.  This was all good stuff, but I'll get to what really sold me on the media element in a moment. When Joe was done, Scott Guthrie stepped back up to walk through building a Windows 7 Mobile Phone.  This is were I have to give serious props.  He built this application, in Visual Studio 2010, in front of 2000+ people.  That was cool, but what really was amazing that he build the application in about 2 minutes.  The IDE, side by side design that is standard in Visual Studio is light years ahead of x-Code or any of the iPhone IDEs.  The Windows 7 Mobile System, if it can get market penetration, poses a technologically superior development and phone platform over anything on the market right now.  The biggest problem with the phone, is it just isn't available yet.  I personally can't wait for a chance to build some apps for the new Windows Phone. Netflix, I May Start Up an Account Again! When I get my Windows 7 Phone device, I am absolutely getting a Netflix account again.  The Vertigo crew, as I wrote on Twitter "#MIX10 Props @seesharp on @netflix demo", displayed an application on the phone for Netflix that actually ran HD Video of Rescue Me (with Dennis Leary).  The video played back smooth as it would on a dedicated computer, I was instantly sold.  So this didn't actually sell me on the phone, because I'm already sold, but it did sell me whole heartedly on the media capabilities of the pending phone. Anyway, I try not to do this but I may double post today.  Lunch is over and I'm off to another session very near and dear to the heart of my occupation, Analytics Tracking.  Stay tuned and I should have that post up by the end of the day. Original Post – Check out my other blog for even more technical ramblings and reads.

    Read the article

  • Agile Testing Days 2012 – Day 3 – Agile or agile?

    - by Chris George
    Another early start for my last Lean Coffee of the conference, and again it was not wasted. We had some really interesting discussions around how to determine what test automation is useful, if agile is not faster, why do it? and a rather existential discussion on whether unicorns exist! First keynote of the day was entitled “Fast Feedback Teams” by Ola Ellnestam. Again this relates nicely to the releasing faster talk on day 2, and something that we are looking at and some teams are actively trying. Introducing the notion of feedback, Ola describes a game he wrote for his eldest child. It was a simple game where every time he clicked a button, it displayed “You’ve Won!”. He then changed it to be a Win-Lose-Win-Lose pattern and watched the feedback from his son who then twigged the pattern and got his younger brother to play, alternating turns… genius! (must do that with my children). The idea behind this was that you need that feedback loop to learn and progress. If you are not getting the feedback you need to close that loop. An interesting point Ola made was to solve problems BEFORE writing software. It may be that you don’t have to write anything at all, perhaps it’s a communication/training issue? Perhaps the problem can be solved another way. Writing software, although it’s the business we are in, is expensive, and this should be taken into account. He again mentions frequent releases, and how they should be made as soon as stuff is ready to be released, don’t leave stuff on the shelf cause it’s not earning you anything, money or data. I totally agree with this and it’s something that we will be aiming for moving forwards. “Exceptions, Assumptions and Ambiguity: Finding the truth behind the story” by David Evans started off very promising by making references to ‘Grim up North’ referring to the north of England. Not sure it was appreciated by most of the audience, but it made me laugh! David explained how there are always risks associated with exceptions, giving the example of a one-way road near where he lives, with an exception sign giving rights to coaches to go the wrong way. Therefore you could merrily swing around the corner of the one way road straight into a coach! David showed the danger in making assumptions with lyrical quotes from Lola by The Kinks “I’m glad I’m a man, and so is Lola” and with a picture of a toilet flush that needed instructions to operate the full and half flush. With this particular flush, you pulled the handle all the way down to half flush, and half way down to full flush! hmmm, a bit of a crappy user experience methinks! Then through a clever use of a passage from the Jabberwocky, David then went onto show how mis-translation/ambiguity is the can completely distort the original meaning of something, and this is a real enemy of software development. This was all helping to demonstrate that the term Story is often heavily overloaded in the Agile world, and should really be stripped back to what it is really for, stating a business problem, and offering a technical solution. Therefore a story could be worded as “In order to {make some improvement}, we will { do something}”. The first ‘in order to’ statement is stakeholder neutral, and states the problem through requesting an improvement to the software/process etc. The second part of the story is the verb, the doing bit. So to achieve the ‘improvement’ which is not currently true, we will do something to make this true in the future. 
My PM is very interested in this, and he’s observed some of the problems of overloading stories so I’m hoping between us we can use some of David’s suggestions to help clarify our stories better. The second keynote of the day (and our last) proved to be the most entertaining and exhausting of the conference for me. “The ongoing evolution of testing in agile development” by Scott Barber. I’ve never had the pleasure of seeing Scott before… OMG I would love to have even half of the energy he has! What struck me during this presentation was Scott’s explanation of how testing has become the role/job that it is (largely) today, and how this has led to the need for ‘methodologies’ to make dev and test work! The argument that we should be trying to converge the roles again is a very valid one, and one that a couple of the teams at work are actively doing with great results. Making developers as responsible for quality as testers is something that has been lost over the years, but something that we are now striving to achieve. The idea that we (testers) should be testing experts/specialists, not testing ‘union members’, supports this idea so the entire team works on all aspects of a feature/product, with the ‘specialists’ taking the lead and advising/coaching the others. This leads to better propagation of information around the team, a greater holistic understanding of the project and it allows the team to continue functioning if some of it’s members are off sick, for example. Feeling somewhat drained from Scott’s keynote (but at the same time excited that alot of the points he raised supported actions we are taking at work), I headed into my last presentation for Agile Testing Days 2012 before having to make my way to Tegel to catch the flight home. “Thinking and working agile in an unbending world” with Pete Walen was a talk I was not going to miss! Having spoken to Pete several times during the past few days, I was looking forward to hearing what he was going to say, and I was not disappointed. Pete started off by trying to separate the definitions of ‘Agile’ as in the methodology, and ‘agile’ as in the adjective by pronouncing them the ‘english’ and ‘american’ ways. So Agile pronounced (Ajyle) and agile pronounced (ajul). There was much confusion around what the hell he was talking about, although I thought it was quite clear. Agile – Software development methodology agile – Marked by ready ability to move with quick easy grace; Having a quick resourceful and adaptable character. Anyway, that aside (although it provided a few laughs during the presentation), the point was that many teams that claim to be ‘Agile’ but are not, in fact, ‘agile’ by nature. Implementing ‘Agile’ methodologies that are so prescriptive actually goes against the very nature of Agile development where a team should anticipate, adapt and explore. Pete made a valid point that very few companies intentionally put up roadblocks to impede work, so if work is being blocked/delayed, why? This is where being agile as a team pays off because the team can inspect what’s going on, explore options and adapt their processes. It is through experimentation (and that means trying and failing as well as trying and succeeding) that a team will improve and grow leading to focussing on what really needs to be done to achieve X. So, that was it, the last talk of our conference. 
    I was gutted that we had to miss the closing keynote from Matt Heusser, as Matt was another person I had spoken to a few times during the conference, but the flight would not wait, and just as well we left when we did because the traffic was a nightmare! My Takeaway Triple from Day 3:

    - Release often and release small – don’t leave stuff on the shelf
    - Keep the meaning of the word ‘agile’ in mind when working in ‘Agile’
    - Look at testing as more of a skill than a role

    Read the article

  • April 14th Links: ASP.NET, ASP.NET MVC, ASP.NET Web API and Visual Studio

    - by ScottGu
    Here is the latest in my link-listing blog series: ASP.NET Easily overlooked features in VS 11 Express for Web: Good post by Scott Hanselman that highlights a bunch of easily overlooked improvements that are coming to VS 11 (and specifically the free express editions) for web development: unit testing, browser chooser/launcher, IIS Express, CSS Color Picker, Image Preview in Solution Explorer and more. Get Started with ASP.NET 4.5 Web Forms: Good 5-part tutorial that walks-through building an application using ASP.NET Web Forms and highlights some of the nice improvements coming with ASP.NET 4.5. What is New in Razor V2 and What Else is New in Razor V2: Great posts by Andrew Nurse, a dev on the ASP.NET team, about some of the new improvements coming with ASP.NET Razor v2. ASP.NET MVC 4 AllowAnonymous Attribute: Nice post from David Hayden that talks about the new [AllowAnonymous] filter introduced with ASP.NET MVC 4. Introduction to the ASP.NET Web API: Great tutorial by Stephen Walher that covers how to use the new ASP.NET Web API support built-into ASP.NET 4.5 and ASP.NET MVC 4. Comprehensive List of ASP.NET Web API Tutorials and Articles: Tugberk Ugurlu links to a huge collection of articles, tutorials, and samples about the new ASP.NET Web API capability. Async Mashups using ASP.NET Web API: Nice post by Henrik on how you can use the new async language support coming with .NET 4.5 to easily and efficiently make asynchronous network requests that do not block threads within ASP.NET. ASP.NET and Front-End Web Development Visual Studio 11 and Front End Web Development - JavaScript/HTML5/CSS3: Nice post by Scott Hanselman that highlights some of the great improvements coming with VS 11 (including the free express edition) for front-end web development. HTML5 Drag/Drop and Async Multi-file Upload with ASP.NET Web API: Great post by Filip W. that demonstrates how to implement an async file drag/drop uploader using HTML5 and ASP.NET Web API. Device Emulator Guide for Mobile Development with ASP.NET: Good post from Rachel Appel that covers how to use various device emulators with ASP.NET and VS to develop cross platform mobile sites. Fixing these jQuery: A Guide to Debugging: Great presentation by Adam Sontag on debugging with JavaScript and jQuery.  Some really good tips, tricks and gotchas that can save a lot of time. ASP.NET and Open Source Getting Started with ASP.NET Web Stack Source on CodePlex: Fantastic post by Henrik (an architect on the ASP.NET team) that provides step by step instructions on how to work with the ASP.NET source code we recently open sourced. Contributing to ASP.NET Web Stack Source on CodePlex: Follow-on to the post above (also by Henrik) that walks-through how you can submit a code contribution to the ASP.NET MVC, Web API and Razor projects. Overview of the WebApiContrib project: Nice post by Pedro Reys on the new open source WebApiContrib project that has been started to deliver cool extensions and libraries for use with ASP.NET Web API. Entity Framework Entity Framework 5 Performance Improvements and Performance Considerations for EF5:  Good articles that describes some of the big performance wins coming with EF5 (which will ship with both .NET 4.5 and ASP.NET MVC 4). Automatic compilation of LINQ queries will yield some significant performance wins (up to 600% faster). 
ASP.NET MVC 4 and EF Database Migrations: Good post by David Hayden that covers the new database migrations support within EF 4.3 which allows you to easily update your database schema during development - without losing any of the data within it. Visual Studio What's New in Visual Studio 11 Unit Testing: Nice post by Peter Provost (from the VS team) that talks about some of the great improvements coming to VS11 for unit testing - including built-in VS tooling support for a broad set of unit test frameworks (including NUnit, XUnit, Jasmine, QUnit and more) Hope this helps, Scott

    Read the article

  • What is Agile Modeling and why do I need it?

    What is Agile Modeling and why do I need it? Agile Modeling is an add-on to existing agile methodologies like Extreme programming (XP) and Rational Unified Process (RUP). Agile Modeling enables developers to develop a customized software development process that actually meets their current development needs and is flexible enough to adjust in the future. According to Scott Ambler, Agile Modeling consists of five core values that enable this methodology to be effective and light weight Agile Modeling Core Values: Communication Simplicity Feedback Courage Humility Communication is a key component to any successful project. Open communication between stakeholder and the development team is essential when developing new applications or maintaining legacy systems. Agile models promote communication amongst software development teams and stakeholders. Furthermore, Agile Models provide a common understanding of an application for members of a software development team allowing them to have a universal common point of reference. The use of simplicity in Agile Models enables the exploration of new ideas and concepts through the use of basic diagrams instead of investing the time in writing tens or hundreds of lines of code. Feedback in regards to application development is essential. Feedback allows a development team to confirm that the development path is on track. Agile Models allow for quick feedback from shareholders because minimal to no technical expertise is required to understand basic models. Courage is important because you need to make important decisions and be able to change direction by either discarding or refactoring your work when some of your decisions prove inadequate, according to Scott Ambler. As a member of a development team, we must admit that we do not know everything even though some of us think we do. This is where humility comes in to play. Everyone is a knowledge expert in their own specific domain. If you need help with your finances then you would consult an accountant. If you have a problem or are in need of help with a topic why would someone not consult with a subject expert? An effective approach is to assume that everyone involved with your project has equal value and therefore should be treated with respect. Agile Model Characteristics: Purposeful Understandable Sufficiently Accurate Sufficiently Consistent Sufficiently Detailed Provide Positive Value Simple as Possible Just Fulfill Basic Requirements According to Scott Ambler, Agile models are the most effective possible because the time that is invested in the model is just enough effort to complete the job. Furthermore, if a model isn’t good enough yet then additional effort can be invested to get more value out of the model. However if a model is good enough, for the current needs, or surpass the current needs, then any additional work done on the model would be a waste. It is important to remember that good enough is in the eye of the beholder, so this can be tough. In order for Agile Models to work effectively Active Stakeholder need to participation in the modeling process. Finally it is also very important to model with others, this allows for additionally input ensuring that all the shareholders needs are reflected in the models. How can Agile Models be incorporated in to our projects? Agile Models can be incorporated in to our project during the requirement gathering and design phases. 
    As requirements are gathered, the models should be updated to incorporate the new project details as they are defined and refined. Additionally, the Agile Models created during the requirements phase can form the basis for the models created during the design phase. It is important to only add to a model when the changes fit within the agile model characteristics and do not overcomplicate the design.

    Read the article

  • Silverlight Firestarter thoughts, and thanks to one and all!

    - by Dave Campbell
    A few metrics that of course got out of hand, but some may find interesting:   1/2 My share of the MVP of the Year award in February of 2009 with Laurent Bugnion 2 Number of degrees I hold: B.S., M.S. Electrical Engineering 3 Number of years in the U.S. Army 3.5 Number of years SilverlighCream has been posted 4 Number of times awarded MVP 6 Number of professional positions I've worked: Antenna Rigger, Boilermaker, Musician, Electronic Technician, Hardware Engineer, Software Engineer 16 Number of companies I've worked for during my career as an Engineer 19 Age at which I turned my first line of code 28 Age at which I hit the workforce as an Engineer 33 Number of years working as an Engineer 43 Number of years writing code 62 Number of years since instantiation 116 Number of tags to search SilverlightCream with 645 Number of blogs I view to find articles (at this moment) 664 Number of articles tagged wp7dev at SilverlightCream right now 700 Number of Twitter followers for WynApse 981 Number of individual bloggers in the SilverlightCream database 1002 Number of SilverlightCream blogposts 1100 Number of people live in Redmond for the Firestarter (I think) 1428 Number of total blogposts at GeeksWithBlogs (not counting this one) 4200 Number of Feedburner subscribers (approximately) 6500 Number of Twitter followers for SilverlightNews (approximately) 7087 Number of posts tagged and aggregated at SilverlightCream right now 13000 Number of people registered to watch the Firestarter online (I think) The overwhelming feeling I have returning from the Silverlight Firestarter: Priceless There is absolutely no way that I could personally thank everyone that over the last few years has held their hand out and offered me a step up to get to the point that Scott Guthrie called me out in his keynote. So I'm just going to hit the highlights here... Scott Guthrie Thanks for not only being the level you are at Microsoft, but for being so approachable, easy to talk to, willing to help everyone, and above all knowledgable. My first level manager at my last position asked if Visual Studio was a graphics program... and you step up to a laptop at a conference and type "File->New Program" ... 'nuff said... oh yeah, thanks for the shoutout! John Papa Thanks for being a good friend, ramroding the Firestarter, being a great guy to be around, and for the poster... holy crap is that cool. Tim Heuer Thanks for all you did as a great DE in Phoenix, and for helping out so many of us, of course being a great guy, and for the poster as well... I think you and John shared that task. In no order at all my buddy Michael Washington, Laurent Bugnion (the other half of the first Silverlight MVP of the Year) Tim Sneath, Mike Harsh, Chad Campbell and Bryant Likes (from back in the day), Adam Kinney, Jesse Liberty, Jeff Paries, Pete Brown, András Velvárt, David Kelly, Michael Palermo, Scott Cate, Erik Mork, and on and on... don't feel bad if your name didn't appear, I have simply too many supporters to name. Silverlight Firestarter Indeed All the people mentioned here, and all the MVPs knew Silverlight was NOT dead, but because of a very unfortunate circumstance, the popular media opinion became that. Consequently the Firestarter exploded from a laid-back event to a global conference. People worked their ass off getting bits ready and presentations using those bits. All to stem the flow of misinformation. All involved please accept my personal thanks for an absolutely awesome job. 
    I had the privilege of watching the 'prep' on Wednesday afternoon, and was blown away the first time I saw the 3D demo... and have been blown away every time I've seen it since. Not to mention all the other goodness in Silverlight 5. Yes, I hit 1000 on my blog, but more importantly, all of you are blogging and using Silverlight, and Microsoft hit one completely out of the park... no... they knocked it out of the neighborhood with the Firestarter. It was amazing to be there for it, and it will be awesome to use the new bits as we get them. Keep reading, there's tons more to come with Silverlight and SilverlightCream following along behind. As usual, this old hacker is humbled to be allowed to play with all the cool kids... Thanks one and all for everything, and Stay in the 'Light

    Read the article

  • ODI 11g – Oracle Multi Table Insert

    - by David Allan
    With the IKM Oracle Multi Table Insert you can generate Oracle-specific DML for inserting into multiple target tables from a single query result – without reprocessing the query or staging its result. When designing this to exploit the IKM you must split the problem into the reusable parts – the select part goes in one interface (I named it SELECT_PART), then each target goes in a separate interface (INSERT_SPECIAL and INSERT_REGULAR). So for my statement below…

        /* INSERT_SPECIAL interface */
        insert all
          when 1=1 And (INCOME_LEVEL > 250000) then
            into SCOTT.CUSTOMERS_NEW
              (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL,
               USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED)
            values
              (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL,
               USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED)
        /* INSERT_REGULAR interface */
          when 1=1 then
            into SCOTT.CUSTOMERS_SPECIAL
              (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL,
               USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED)
            values
              (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL,
               USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED)
        /* SELECT_PART interface */
        select
            CUSTOMERS.EMAIL EMAIL,
            CUSTOMERS.CREDIT_LIMIT CREDIT_LIMIT,
            UPPER(CUSTOMERS.NAME) NAME,
            CUSTOMERS.USER_MODIFIED USER_MODIFIED,
            CUSTOMERS.DATE_MODIFIED DATE_MODIFIED,
            CUSTOMERS.BIRTH_DATE BIRTH_DATE,
            CUSTOMERS.MARITAL_STATUS MARITAL_STATUS,
            CUSTOMERS.ID ID,
            CUSTOMERS.USER_CREATED USER_CREATED,
            CUSTOMERS.GENDER GENDER,
            CUSTOMERS.DATE_CREATED DATE_CREATED,
            CUSTOMERS.INCOME_LEVEL INCOME_LEVEL
        from
            SCOTT.CUSTOMERS CUSTOMERS
        where
            (1=1)

    Firstly I create a SELECT_PART temporary interface for the query to be reused, and in the IKM assignment I state that it is defining the query, it is not a target, and it should not be executed.

    Then in my INSERT_SPECIAL interface, loading a target with a filter, I set define query to false, set target table to true, and set execute to false. This interface uses the SELECT_PART query definition interface as a source.

    Finally, in my last interface loading another target, I set define query to false again, set target table to true and execute to true – this is the go-run-it indicator! To coordinate the statement construction you will need to create a package with the select and insert statements. With 11g you can now execute the package in simulation mode and preview the generated code, including the SQL statements. Hopefully this helps shed some light on how you can leverage the Oracle MTI statement.

    A similar IKM exists for Teradata. The ODI IKM Teradata Multi Statement supports this multi-statement request in 11g; here is an extract from the paper at www.teradata.com/white-papers/born-to-be-parallel-eb3053/: "Teradata Database offers an SQL extension called a Multi-Statement Request that allows several distinct SQL statements to be bundled together and sent to the optimizer as if they were one. Teradata Database will attempt to execute these SQL statements in parallel. When this feature is used, any sub-expressions that the different SQL statements have in common will be executed once, and the results shared among them." It works in the same way as the ODI MTI IKM: multiple interfaces are orchestrated in a package, each interface contributes some SQL, and the last interface in the chain executes the multi statement.

    Read the article

  • Microsoft Cloud Day - the ups and downs

    - by Charles Young
    The term ‘cloud’ can sometimes obscure the obvious.  Today’s Microsoft Cloud Day conference in London provided a good example.  Scott Guthrie was halfway through what was an excellent keynote when he lost network connectivity.  This proved very disruptive to his presentation which centred on a series of demonstrations of the Azure platform in action.  Great efforts were made to find a solution, but no quick fix presented itself.  The venue’s IT facilities were dreadful – no WiFi, poor 3G reception (forget 4G…this is the UK) and, unbelievably, no-one on hand from the venue staff to help with infrastructure issues.  Eventually, after an unscheduled break, a solution was found and Scott managed to complete his demonstrations.  Further connectivity issues occurred during the day. I can say that the cause was prosaic.  A member of the venue staff had interfered with a patch board and inadvertently disconnected Scott Guthrie’s machine from the network by pulling out a cable. I need to state the obvious here.  If your PC is disconnected from the network it can’t communicate with other systems.  This could include a machine under someone’s desk, a mail server located down the hall, a server in the local data centre, an Internet search engine or even, heaven forbid, a role running on Azure. Inadvertently disconnecting a PC from the network does not imply a fundamental problem with the cloud or any specific cloud platform.  Some of the tweeted comments I’ve seen today are analogous to suggesting that, if you accidently unplug your microwave from the mains, this suggests some fundamental flaw with the electricity supply to your house.   This is poor reasoning, to say the least. As far as the conference was concerned, the connectivity issue in the keynote, coupled with some later problems in a couple of presentations, served to exaggerate the perception of poor organisation.   Software problems encountered before the conference prevented the correct set-up of a smartphone app intended to convey agenda information to attendees.  Although some information was available via this app, the organisers decided to print out an agenda at the last moment.  Unfortunately, the agenda sheet did not convey enough information, and attendees were forced to approach conference staff through the day to clarify locations of the various presentations. Despite these problems, the overwhelming feedback from conference attendees was very positive.  There was a real sense of excitement in the morning keynote.  For many, this was their first sight of new Azure features delivered in the ‘spring’ release.  The most common reaction I heard was amazement and appreciation that Azure’s new IaaS features deliver built-in template support for several flavours of Linux from day one.  This coupled with open source SDKs and several presentations on Azure’s support for Java, node.js, PHP, MongoDB and Hadoop served to communicate that the Azure platform is maturing quickly.  The new virtual network capabilities also surprised many attendees, and the much improved portal experience went down very well. So, despite some very irritating and disruptive problems, the event served its purpose well, communicating the breadth and depth of the newly upgraded Azure platform.  I enjoyed the day very much.

    Read the article

  • F1 Pit Pragmatics

    - by mikef
    "I hate computers. No, really, I hate them. I love the communications they facilitate, I love the conveniences they provide to my life. but I actually hate the computers themselves." - Scott Merrill, 'I hate computers: confessions of a Sysadmin' If Scott's goal was to polarize opinion and trigger raging arguments over the 'real reasons why computers suck', then he certainly succeeded. Impassioned vitriol sits side-by-side with rational debate. Yet Scott's fundamental point is absolutely on the money - Computers are a means to an end. The IT industry is finally starting to put weight behind the notion that good User Experience is an absolutely crucial goal, a cause championed by the likes of Microsoft's Bill Buxton, and which Apple's increasingly ubiquitous touch screen interface exemplifies. However, that doesn't change the fact that, occasionally, you just have to man up and deal with complex systems. In fact, sometimes you just need to sacrifice everything else in the name of performance. You'll find a perfect example of this Faustian bargain in Trevor Clarke's fascinating look into the (diabolical) IT infrastructure of modern F1 racing - high performance, high availability. high everything. To paraphrase, each car has up to 100 sensors, transmitting around 30Gb of data over the course of a race (70% in real-time). This data is then processed by no less than 3 servers (per car) so that the engineers in the pit have access to telemetry, strategy information, timing feeds, a connection back to the operations room in the team's home base - the list goes on. All of this while the servers are exposed "to carbon dust, oil, vibration, rain, heat, [and] variable power". Now, this is admittedly an extreme context where there's no real choice but to use complex systems where ease-of-use is, at best, a secondary concern. The flip-side is seen in small-scale personal computing such as that seen in Apple's iDevices, which are incredibly intuitive but limited in their scope. In terms of what kinds of systems they prefer to use, I suspect that most SysAdmins find themselves somewhere along this axis of Power vs. Usability, and which end of this axis you resonate with also hints at where you think the IT industry should focus its energy. Do you see yourself in the F1 pit, making split-second decisions, wrestling with information flows and reticent hardware to bend them to your will? If so, I imagine you feel that computers are subtle tools which need to be tuned and honed, using the advanced knowledge possessed only by responsible SysAdmins (If you have an iPhone, I suspect it's jail-broken). If the machines throw enigmatic errors, it's the price of flexibility and raw power. Alternatively, would you prefer to have your role more accessible, with users empowered by knowledge, spreading the load of managing IT environments? In that case, then you want hardware and software to have User Experience as their primary focus, and are of the "means to an end" school of thought (you're probably also fed up with users not listening to you when you try and help). At its heart, the dichotomy is between raw power (which might be difficult to use) and ease-of-use (which might have some limitations, but you can be up and running immediately). Of course, the ultimate goal is a fusion of flexibility, power and usability all in one system. It's achievable in specific software environments, and Red Gate considers it a target worth aiming for, but in other cases it's a goal right up there with cold fusion. 
I think it'll be a long time before we see it become ubiquitous. In the meantime, are you Power-Hungry or a Champion of Usability? Cheers, Michael Francis Simple Talk SysAdmin Editor

    Read the article

  • Kill all the project files!

    - by jamiet
    Like many folks I’m a keen podcast listener and yesterday my commute was filled by listening to Scott Hunter being interviewed on .Net Rocks about the next version of ASP.Net. One thing Scott said really struck a chord with me. I don’t remember the full quote but he was talking about how the ASP.Net project file (i.e. the .csproj file) is going away. The rationale being that the main purpose of that file is to list all the other files in the project, and that’s something that the file system is pretty good at. In Scott’s own words (that someone helpfully put in the comments): A file that lists files is really redundant when the OS already does this Romeliz Valenciano correctly pointed out on Twitter that there will still be a project.json file however no longer will there be a need to keep a list of files in a project file. I suspect project.json will simply contain a list of exclusions where necessary rather than the current approach where the project file is a list of inclusions. On the face of it this seems like a pretty good idea. I’ve long been a fan of convention over configuration and this is a great example of that. Instead of listing all the files in a separate file, just treat all the files in the directory as being part of the project. Ostensibly the approach is if its in the directory, its part of the project. Simple. Now I’m not an ASP.net developer, far from it, but it did occur to me that the same approach could be applied to the two Visual Studio project types that I am most familiar with, SSIS & SSDT. Like many people I’ve long been irritated by SSIS projects that display a faux file system inside Solution Explorer. As you can see in the screenshot below the project has Miscellaneous and Connection Managers folders but no such folders exist on the file system: This may seem like a minor thing but it means useful Solution Explorer features like Show All Files and Open Folder in Windows Explorer don’t work and quite frankly it makes me feel like a second class citizen in the Microsoft ecosystem. I’m a developer, treat me like one. Don’t try and hide the detail of how a project works under the covers, show it to me. I’m a big boy, I can handle it! Would it not be preferable to simply treat all the .dtsx files in a directory as being part of a project? I think it would, that’s pretty much all the .dtproj file does anyway (that, and present things in a non-alphabetic order – something else that wildly irritates me), so why not just get rid of the .dtproj file? In the case of SSDT the .sqlproj actually does a whole lot more than simply list files because it also states the BuildAction of each file (Build, NotInBuild, Post-Deployment, etc…) but I see no reason why the convention over configuration approach can’t help us there either. Want to know which is the Post-deployment script? Well, its the one called Post-DeploymentScript.sql! Simple! So that’s my new crusade. Let’s kill all the project files (well, the .dtproj & .sqlproj ones anyway). Are you with me? @Jamiet

    Read the article

  • The Definitive C++ Book Guide and List

    - by grepsedawk
    After more than a few questions about deciding on C++ books I thought we could make a better community wiki version, providing QUALITY books and an approximate skill level. Maybe we can add a short blurb/description about each book that you have personally read / benefited from. Feel free to debate quality, headings, etc. Note: There is a similar post for C: The Definitive C Book Guide and List

    Reference Style - All Levels
        The C++ Programming Language - Bjarne Stroustrup
        C++ Standard Library Tutorial and Reference - Nicolai Josuttis

    Beginner
      Introductory:
        C++ Primer - Stanley Lippman / Josée Lajoie / Barbara E. Moo
        Accelerated C++ - Andrew Koenig / Barbara Moo
        Thinking in C++ - Bruce Eckel (2 volumes, 2nd is more about standard library, but still very good)
      Best practices:
        Effective C++ - Scott Meyers
        Effective STL - Scott Meyers

    Intermediate
        More Effective C++ - Scott Meyers
        Exceptional C++ - Herb Sutter
        More Exceptional C++ - Herb Sutter
        C++ Coding Standards: 101 Rules, Guidelines, and Best Practices - Herb Sutter / Andrei Alexandrescu
        C++ Templates: The Complete Guide - David Vandevoorde / Nicolai M. Josuttis
        Large Scale C++ Software Design - John Lakos

    Above Intermediate
        Modern C++ Design - Andrei Alexandrescu
        C++ Template Metaprogramming - David Abrahams and Aleksey Gurtovoy
        Inside the C++ Object Model - Stanley Lippman

    Classics / Older (Note: Some information contained within these books may not be up to date and no longer considered best practice.)
        The Design and Evolution of C++ - Bjarne Stroustrup
        Ruminations on C++ - Andrew Koenig / Barbara Moo
        Advanced C++ Programming Styles and Idioms - James Coplien

    Read the article

  • IIS: 404 error on every file in a virtual directory.

    - by Scott Chamberlain
    I am trying to write my first WCF service for IIS 6.0. I followed the instructions on MSDN. I created the virtual directory, I can browse the directory fine but anything I click (even a sub-folder in that folder) gives me a 404 error. What am I missing that I can not access any files or folders? Any logs or whatnot you need just tell me where to find them in the comments and I will post them. UPDATE- Found the log, here is what it says when I connect and try to click on a sub folder. #Software: Microsoft Internet Information Services 6.0 #Version: 1.0 #Date: 2010-03-07 19:08:07 #Fields: date time s-sitename s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status 2010-03-07 19:08:07 W3SVC1 74.62.95.101 GET /prx2.php hash=AA70CBCE8DDD370B4A3E5F6500505C6FBA530220D856 80 - 221.192.199.35 Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.0) 404 0 2 #Software: Microsoft Internet Information Services 6.0 #Version: 1.0 #Date: 2010-03-07 22:21:20 #Fields: date time s-sitename s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status 2010-03-07 22:21:20 W3SVC1 127.0.0.1 GET /RemoteUserManagerService/ - 80 - 127.0.0.1 Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+5.2;+WOW64;+Trident/4.0;+.NET+CLR+3.0.04506.30;+.NET+CLR+2.0.50727;+.NET+CLR+3.0.04506.648;+.NET+CLR+3.0.4506.2152;+.NET+CLR+3.5.30729;+.NET4.0C;+.NET4.0E) 401 2 2148074254 2010-03-07 22:21:26 W3SVC1 127.0.0.1 GET /RemoteUserManagerService/ - 80 - 127.0.0.1 Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+5.2;+WOW64;+Trident/4.0;+.NET+CLR+3.0.04506.30;+.NET+CLR+2.0.50727;+.NET+CLR+3.0.04506.648;+.NET+CLR+3.0.4506.2152;+.NET+CLR+3.5.30729;+.NET4.0C;+.NET4.0E) 401 1 0 2010-03-07 22:21:26 W3SVC1 127.0.0.1 GET /RemoteUserManagerService/ - 80 webinfinity\srchamberlain 127.0.0.1 Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+5.2;+WOW64;+Trident/4.0;+.NET+CLR+3.0.04506.30;+.NET+CLR+2.0.50727;+.NET+CLR+3.0.04506.648;+.NET+CLR+3.0.4506.2152;+.NET+CLR+3.5.30729;+.NET4.0C;+.NET4.0E) 200 0 0 2010-03-07 22:21:29 W3SVC1 127.0.0.1 GET /RemoteUserManagerService/bin/ - 80 - 127.0.0.1 Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+5.2;+WOW64;+Trident/4.0;+.NET+CLR+3.0.04506.30;+.NET+CLR+2.0.50727;+.NET+CLR+3.0.04506.648;+.NET+CLR+3.0.4506.2152;+.NET+CLR+3.5.30729;+.NET4.0C;+.NET4.0E) 404 0 2 --Update again I found this here IIS6 Dynamic Content: A 404.2 entry in the W3C Extended Log file is recorded when a Web Extension is not enabled. Use the IIS Microsoft Management Console (MMC) snap-in to enable the appropriate Web extension. Default Web Extensions include: ASP, ASP.net, Server-Side Includes, WebDAV publishing, FrontPage Server Extensions, Common Gateway Interface (CGI). Custom extensions must be added and explicitly enabled. See the IIS 6.0 Help File for more information. I am guessing the 404 0 2 at the end of the log is a 404.2 error. I now know the why, I still don't know the how on how to fix it.
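
    If the blocked extension turns out to be ASP.NET (a guess based on the 404.2 note above, not something confirmed in the post, and the usual suspect for WCF .svc content on IIS 6.0), one way to register ASP.NET 2.0 and enable it as a Web Service Extension from a command prompt is sketched below; the framework version and path are assumptions, and the same thing can be done in IIS Manager under Web Service Extensions.

        REM Register ASP.NET 2.0 with IIS 6.0 and enable its Web Service Extension
        %windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -i -enable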

    Read the article

  • Booting from integrated RAID controller when another RAID controller is installed in a PCIe slot

    - by Antony Scott
    I have a GA MA785GT UD3H motherboard with Windows Server 2008 R2 installed on a RAID1 using the on-board RAID controller. I have now installed a RocketRaid 2680 controller and set up a RAID5 for all my data to be stored on. Unfortunately I now cannot boot from the RAID1 anymore, the PC is trying to boot from the RAID5! Does anyone have any experience of this motherboard / RAID controller combination?

    Read the article

  • Executing Oracle SQLPlus in a Powershell Invoke-Command statement against a remote machine

    - by Scott Muc
    We have a basic PowerShell script that attempts to execute SQLPlus.exe on a remote machine. The remote machine does not have the Oracle Instant Client installed, but we have bundled all the necessary DLLs in a remote folder. For example, we have sqlplus.exe and its dependencies in the directory C:\temp\oracle. If I navigate to that path on the remote server and execute sqlplus.exe it runs just fine; I get the prompt for a username. If I go:

        Invoke-Command -comp remote.machine.host -ScriptBlock { C:\temp\oracle\sqlplus.exe }

    I get the following:

        Error 57 initializing SQL*Plus
        + CategoryInfo          : NotSpecified: (Error 57 initializing SQL*Plus:String) [], RemoteException
        + FullyQualifiedErrorId : NativeCommandError
        Error loading message shared library

    Thinking that it's potentially a PATH issue I tried the following:

        Invoke-Command -comp remote.machine.host -ScriptBlock {
            $env:ORACLE_HOME = "C:\temp\oracle";
            $env:PATH = "$env:ORACLE_HOME";
            C:\temp\oracle\sqlplus.exe
        }

    This had the same result. The error code is not very helpful and is extremely frustrating, since it does work when I log on to the machine. What is PowerShell remoting doing that's making this not work?
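
    A minimal sketch of one thing worth trying (this is an assumption about the cause, not a confirmed fix): a remoting session starts in a different working directory and with a sparser environment than an interactive logon, and SQL*Plus resolves its message files relative to ORACLE_HOME, so set the location and both variables inside the script block before invoking the executable. The connect string is a placeholder.

        Invoke-Command -ComputerName remote.machine.host -ScriptBlock {
            Set-Location 'C:\temp\oracle'                  # run from the folder holding the bundled DLLs
            $env:ORACLE_HOME = 'C:\temp\oracle'            # where SQL*Plus looks for its message files
            $env:PATH        = "$env:ORACLE_HOME;$env:PATH"
            & .\sqlplus.exe -L scott/tiger@//dbhost:1521/orcl
        }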

    Read the article

  • SCVMM 2008 R2 problems migrating VM from VS2005 to Hyper-V host

    - by Scott Ivey
    I have System Center Virtual Machine Manager 2008 R2 installed, and have a Hyper-V R2 host and a Virtual Server 2005 host. I'm trying to migrate my machines from the VS2005 host to the Hyper-V host, and keep getting the following error... VMM is unable to complete the requested file transfer. The connection to the HTTP server myserver.mydomain.local could not be established. (Unknown error (0x80072efd)) Recommended Action Ensure that the HTTP service and/or the agent on the machine myserver.mydomain.local are installed and running and that a firewall is not blocking HTTPS traffic. (Note - migrations between Hyper-V hosts managed by the VMM server work fine - my problem is just going from VS2005-Hyper-V hosts) I have no firewalls turned on on either of the servers, and no firewalls in the middle. I've looked all over for answers to this problem, and am getting nowhere. All the articles I find when searching are talking about either V2V or P2V - and i'm just trying to do a straight migrate VM. I've tried rebooting the boxes, changing the BITS SSL port number, restarting services, triple-checking firewalls, etc. Does anyone have any good suggestions as to how I can resolve this problem?

    Read the article

  • Apachebench on node.js server returning "apr_poll: The timeout specified has expired (70007)" after ~30 requests

    - by Scott
    I just started working with node.js, and doing some experimental load testing with ab is returning an error at around 30 requests or so. I've found other pages showing much better concurrency numbers than I am getting, such as: http://zgadzaj.com/benchmarking-nodejs-basic-performance-tests-against-apache-php Are there some critical server configuration settings that need to be done to achieve those numbers? I've watched memory in top and I still see a decent amount of free memory while running ab, and I've watched mongostat as well and am not seeing anything that looks suspicious. The command I'm running, and the error, is:

        ab -k -n 100 -c 10 postrockandbeyond.com/

        This is ApacheBench, Version 2.0.41-dev <$Revision: 1.121.2.12 $> apache-2.0
        Copyright (c) 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
        Copyright (c) 2006 The Apache Software Foundation, http://www.apache.org/

        Benchmarking postrockandbeyond.com (be patient)...apr_poll: The timeout specified has expired (70007)
        Total of 32 requests completed

    Does anyone have any suggestions on things I should look into that may be causing this? I'm running it on OS X Lion, but have also run the same command on the server with the same results. EDIT: I eventually solved this issue. I was using TTAPI, which was connecting to turntable.fm through websockets, and on the homepage I was connecting on every request. So what was happening was that after a certain number of connections, everything would fall apart. If you're running into the same issue, check whether you are hitting external services on each request.
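
    A minimal sketch of the fix described in the edit (the module, package and variable names are hypothetical, since the poster's code isn't shown): open the upstream turntable.fm connection once when the process starts and share it, rather than creating a new websocket connection inside every request handler.

        // bot.js - create the connection once, when this module is first required
        var Bot = require('ttapi');                               // assumed package name
        module.exports = new Bot(AUTH_TOKEN, USER_ID, ROOM_ID);   // placeholders

        // app.js - every request reuses the shared connection
        var express = require('express');
        var bot = require('./bot');
        var app = express();

        app.get('/', function (req, res) {
            // render from data cached off `bot` events; no per-request connection here
            res.send('ok');
        });

        app.listen(3000);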

    Read the article

  • Why isn't ICMP routing with iptables nat routing

    - by Scott Forsyth - MVP
    I'm using iptables on an Ubuntu server to route a public IP to a private IP. I want to NAT all traffic, including 80, 443 and ICMP. However, it appears that ICMP isn't being routed: I have a steady ping going to the public IP and it never stops, even with NAT pointing to a bogus IP. Here are the rules that I'm using:

        iptables -t nat -I PREROUTING -d 206.72.119.76 -j DNAT --to-destination 10.240.5.5
        iptables -t nat -I POSTROUTING -s 10.240.5.5 -j SNAT --to-source 206.72.119.76

    I tried rules for ICMP specifically, but no such luck:

        iptables -t nat -I PREROUTING -d 206.72.119.76 -p icmp --icmp-type echo-request -j DNAT --to-destination 10.240.5.5

    Any ideas?
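
    A sketch of the pieces that commonly matter here (these are assumptions about the setup, since the post doesn't show them): kernel forwarding must be enabled and the FORWARD chain has to accept the translated traffic in both directions.

        # enable forwarding (persist it in /etc/sysctl.conf)
        sysctl -w net.ipv4.ip_forward=1

        # protocol-agnostic DNAT/SNAT covers TCP, UDP and ICMP alike
        iptables -t nat -A PREROUTING  -d 206.72.119.76 -j DNAT --to-destination 10.240.5.5
        iptables -t nat -A POSTROUTING -s 10.240.5.5    -j SNAT --to-source 206.72.119.76

        # allow the forwarded traffic through the filter table
        iptables -A FORWARD -d 10.240.5.5 -j ACCEPT
        iptables -A FORWARD -s 10.240.5.5 -m state --state ESTABLISHED,RELATED -j ACCEPT

    Also worth knowing: NAT rules are only consulted for the first packet of a tracked connection, and a continuously running ping is tracked as a single flow, so restart the ping (or flush the conntrack table) after changing the rules; otherwise it keeps its old, un-NATed path and looks as though the new rule is being ignored.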

    Read the article

  • Getting SMB file shares working over a PPTP VPN

    - by Ben Scott
    I'm having issues getting SMB file shares working over a PPTP VPN. The server setup consists of a security device (DrayTek V3300) which passes the PPTP authentication to a SBS2003 server running RRAS. The server is the DC and provides DNS and WINS, the single NIC's name server is set to the NIC's IP (192.168...), and DHCP on the DrayTek sets the server IP as the DNS. If I create a new VPN connection in Win7, leaving everything as default apart from the server, username, password and domain, I can:

    - ping everything by IP address
    - resolve IPs with nslookup using their fully-qualified name, as in nslookup fileserver.mydomain.local
    - ping machines by fully-qualified name, as in ping fileserver.mydomain.local

    However, if I try to access a file share:

    - within Explorer, I get "Windows cannot access ..." with "Error code: 0x80004005 Unspecified Error"
    - using net use z: \\fileserver.mydomain.local\share, I get "System error 53 has occurred. The network path was not found."

    If I add the machine name to my HOSTS file I can use the file share, which is my last-ditch workaround, but I have a number of VPN users and would rather a solution that doesn't involve me trying to hand-edit system files on computers half a country away. If I set the WINS server explicitly in the connection's IPv4 settings I don't have to use the FQN to ping the machine, but that doesn't change anything else.

    EDIT: The PC I'm having the issue on is running Win 7 Home Premium. After more testing I actually have two other PCs that work, one W7HP, one XP Home, and another Vista PC that doesn't work (not tested as much as the others), all four on the same internet connection (behind the same router). All of them were tested with a straight-forward, all defaults, new VPN configuration.

    Read the article

  • Oops, no RSA or DSA server certificate found for 'server.host.name:0'?

    - by Scott Warren
    I'm setting up a new web server that hosts a dozen virtual hosts on Ubuntu 12.04, using Apache 2.2.22 with one config file per site. I created all the configuration files at once and ran a2ensite * to enable them all in one go. When I reloaded the configuration it failed, and after restarting Apache I found the following error message in my error.log:

        Oops, no RSA or DSA server certificate found for 'server.host.name:0'?!

    Most of the results for this error message are years old and don't fix the problem, or refer to bugs that have since been fixed (https://issues.apache.org/bugzilla/show_bug.cgi?id=31709).
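
    The usual trigger for this message on Apache 2.2 is an SSL-enabled virtual host that ends up with SSLEngine on (directly or via an included default) but no certificate directives of its own. A sketch of what each enabled *:443 vhost needs is below; the ServerName and file paths are placeholders, not values from the post.

        <VirtualHost *:443>
            ServerName site1.example.com
            DocumentRoot /var/www/site1

            SSLEngine on
            SSLCertificateFile    /etc/ssl/certs/site1.example.com.crt
            SSLCertificateKeyFile /etc/ssl/private/site1.example.com.key
            # SSLCertificateChainFile /etc/ssl/certs/intermediate-ca.crt
        </VirtualHost>

    Running apache2ctl -S is a quick way to see which of the dozen vhosts Apache actually parsed for port 443 and which config file each one came from.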

    Read the article
