Search Results

Search found 1636 results on 66 pages for 'brian walker'.


  • When should I use Perl's AUTOLOAD?

    - by Robert S. Barnes
    In "Perl Best Practices" the very first line in the section on AUTOLOAD is: Don't use AUTOLOAD However all the cases he describes are dealing with OO or Modules. I have a stand alone script in which some command line switches control which versions of particular functions get defined. Now I know I could just take the conditionals and the evals and stick them naked at the top of my file before everything else, but I find it convenient and cleaner to put them in AUTOLOAD at the end of the file. Is this bad practice / style? If you think so why, and is there a another way to do it? As per brian's request I'm basically using this to do conditional compilation based on command line switches. I don't mind some constructive criticism. sub AUTOLOAD { our $AUTOLOAD; (my $method = $AUTOLOAD) =~ s/.*:://s; # remove package name if ($method eq 'tcpdump' && $tcpdump) { eval q( sub tcpdump { my $msg = shift; warn gf_time()." Thread ".threads->tid().": $msg\n"; } ); } elsif ($method eq 'loginfo' && $debug) { eval q( sub loginfo { my $msg = shift; $msg =~ s/$CRLF/\n/g; print gf_time()." Thread ".threads->tid().": $msg\n"; } ); } elsif ($method eq 'build_get') { if ($pipelining) { eval q( sub build_get { my $url = shift; my $base = shift; $url = "http://".$url unless $url =~ /^http/; return "GET $url HTTP/1.1${CRLF}Host: $base$CRLF$CRLF"; } ); } else { eval q( sub build_get { my $url = shift; my $base = shift; $url = "http://".$url unless $url =~ /^http/; return "GET $url HTTP/1.1${CRLF}Host: $base${CRLF}Connection: close$CRLF$CRLF"; } ); } } elsif ($method eq 'grow') { eval q{ require Convert::Scalar qw(grow); }; if ($@) { eval q( sub grow {} ); } goto &$method; } else { eval "sub $method {}"; return; } die $@ if $@; goto &$method; }

    Read the article

  • How can a test script inform R CMD check that it should emit a custom message?

    - by mariotomo
    I'm writing an R package (delftfews) here at the office. We are using svUnit for unit testing. Our process for describing new functionality: we define new unit tests, initially marked as DEACTIVATED; one block of tests at a time, we activate them and implement the function described by the tests. Almost all the time we have a small number of DEACTIVATED tests, relating to functions that might be dropped or will be implemented. My problem/question is: can I alter doSvUnit.R so that R CMD check pkg emits a NOTE (i.e. a custom message "NOTE" instead of "OK") in case there are DEACTIVATED tests? As of now, we see only that the active tests don't give an error:

        . .
        * checking for unstated dependencies in tests ... OK
        * checking tests ...
          Running ‘doSvUnit.R’
         OK
        * checking PDF version of manual ... OK

    which is all right if all tests succeed, but less all right if there are skipped tests and definitely wrong if there are failing tests. In this case, I'd actually like to see a NOTE or a WARNING like the following:

        . .
        * checking for unstated dependencies in tests ... OK
        * checking tests ...
          Running ‘doSvUnit.R’
         NOTE 6 test(s) were skipped.
         WARNING 1 test(s) are failing.
        * checking PDF version of manual ... OK

    As of now, we have to open doSvUnit.Rout to check the real test results. I contacted two of the maintainers at r-forge and CRAN and they pointed me to the sources of R, in particular the testing.R script. If I understand it correctly, answering this question requires patching the tools package: scripts in the tests directory are called using a system call, output (stdout and stderr) goes to one single file, and there are only two possible outcomes: ok or not ok. So I opened a change request on R, proposing something like bit-coding the return status: bit 0 for ERROR (as it is now), bit 1 for WARNING, bit 2 for NOTE. With my modification, it would be easy to produce this output:

        . .
        * checking for unstated dependencies in tests ... OK
        * checking tests ...
          Running ‘doSvUnit.R’
         NOTE - please check doSvUnit.Rout.
         WARNING - please check doSvUnit.Rout.
        * checking PDF version of manual ... OK

    Brian Ripley replied, "There are however several packages with properly written unit tests that do signal as required. Please do take this discussion elsewhere: R-bugs is not the place to ask questions." and closed the change request. Does anybody have hints?

    Read the article

  • jQuery nested sortables jumpy behaviour

    - by sebbie
    I want to allow the user to drag and drop UI elements. I have 'container' and 'control' elements: a control may be inside a container, and containers may include other containers (this is an important requirement). I created a simple prototype using jQuery. HTML:

        <div class="one">
          <div class="control">Control 1</div>
          <div class="control">Control 2</div>
          <div class="control container">
            Container drag area
            <div class="control">Subcontrol 1</div>
            <div class="control">Subcontrol 2</div>
            <div class="control">Subcontrol 3</div>
            <div class="control">Subcontrol 4</div>
            <div class="control">Subcontrol 5</div>
            <div class="control">Subcontrol 6</div>
            <div class="control">Subcontrol 7</div>
            <div class="control">Subcontrol 8</div>
            <div class="control">Subcontrol 9</div>
          </div>
          <div class="control">Control 3</div>
        </div>

    Then I created sortables using jQuery UI:

        $('.one').sortable({
          items: 'div.control',
          placeholder: 'placeholder',
          forcePlaceholderSize: true
        });

    Now when I try to drag "Subcontrol 8" and place it between "Subcontrol 2" and "Subcontrol 3", for example, I get a jumpy effect; you can observe it here: http://jsbin.com/egipu4/2 The interesting thing is that when I remove the ability to drag the "container", it becomes smooth and perfect (you can see this in the jsbin example below the "jumpy" example; there you can't drag using the "Container drag area" span). I tried different "nested" plugins and techniques and googled for a long time, and the only one that worked was on this page: (StackOverflow doesn't allow me to post more than one link; google for "Brian Swartzfager's Blog: Nested List Sort Demo", it should be the first result, sorry!) But it only works well with jQuery 1.2 and a very old jQuery UI. If I include the latest jQuery (1.3/1.4) and UI (1.7/1.8), it gets jumpy as well. What am I doing wrong?

    Read the article

  • Code golf - hex to (raw) binary conversion

    - by Alnitak
    In response to this question asking about hex to (raw) binary conversion, a comment suggested that it could be solved in "5-10 lines of C, or any other language." I'm sure that for (some) scripting languages that could be achieved, and would like to see how. Can we prove that comment true for C, too? NB: this doesn't mean hex to ASCII binary; specifically, the output should be a raw octet stream corresponding to the input ASCII hex. Also, the input parser should skip/ignore white space.

    Edit (by Brian Campbell): May I propose the following rules, for consistency? Feel free to edit or delete these if you don't think they are helpful, but I think that since there has been some discussion of how certain cases should work, some clarification would be helpful.

    - The program must read from stdin and write to stdout (we could also allow reading from and writing to files passed in on the command line, but I can't imagine that would be shorter in any language than stdin and stdout).
    - The program must use only packages included with your base, standard language distribution. In the case of C/C++, this means their respective standard libraries, and not POSIX.
    - The program must compile or run without any special options passed to the compiler or interpreter (so 'gcc myprog.c' or 'python myprog.py' or 'ruby myprog.rb' are OK, while 'ruby -rscanf myprog.rb' is not allowed; requiring/importing modules counts against your character count).
    - The program should read integer bytes represented by pairs of adjacent hexadecimal digits (upper, lower, or mixed case), optionally separated by whitespace, and write the corresponding bytes to output. Each pair of hexadecimal digits is written with the most significant nibble first.
    - The behavior of the program on invalid input (characters besides [0-9a-fA-F \t\r\n], spaces separating the two characters of an individual byte, an odd number of hex digits in the input) is undefined; any behavior (other than actively damaging the user's computer or something) on bad input is acceptable (throwing an error, stopping output, ignoring bad characters, and treating a single character as the value of one byte are all OK).
    - The program may write no additional bytes to output.
    - Code is scored by fewest total bytes in the source file. (Or, if we wanted to be more true to the original challenge, the score would be based on the lowest number of lines of code; I would impose an 80-character limit per line in that case, since otherwise you'd get a bunch of ties for 1 line.)
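
    For illustration only (this sketch is not an entry from the original thread), here is a minimal, un-golfed C filter along the lines the rules describe: it reads ASCII hex digits from stdin, skips whitespace, and writes each pair of digits as one raw byte to stdout. Behaviour on invalid input is left undefined, as the proposed rules allow.

        #include <stdio.h>
        #include <ctype.h>

        /* Hypothetical sketch, not from the thread: ASCII hex on stdin -> raw bytes
           on stdout, ignoring whitespace. Assumes stdout is not newline-translated
           (fine on POSIX; on Windows you would reopen stdout in binary mode). */
        static int next_nibble(void)
        {
            int c;
            while ((c = getchar()) != EOF && isspace(c))
                ;                                   /* skip whitespace */
            if (c == EOF)
                return -1;
            return isdigit(c) ? c - '0' : tolower(c) - 'a' + 10;
        }

        int main(void)
        {
            int hi, lo;
            while ((hi = next_nibble()) >= 0 && (lo = next_nibble()) >= 0)
                putchar(hi << 4 | lo);              /* most significant nibble first */
            return 0;
        }

    Golfed into a single loop, a variant of this shrinks to a handful of lines, which is roughly the "5-10 lines of C" claim the question sets out to test.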

    Read the article

  • Android , Read in binary data and write it to file

    - by Shpongle
    Hi all, I'm trying to read an image file from a server with the code below. It keeps going into the exception. I know the correct number of bytes is being sent, as I print it out when received. I'm sending the image file from Python like so:

        # open the image file and read it into an object
        imgfile = open(marked_image, 'rb')
        obj = imgfile.read()
        # get the no of bytes in the image and convert it to a string
        bytes = str(len(obj))
        # send the number of bytes
        self.conn.send(bytes + '\n')
        if self.conn.sendall(obj) == None:
            imgfile.flush()
            imgfile.close()
            print 'Image Sent'
        else:
            print 'Error'

    Here is the Android part; this is where I'm having the problem. Any suggestions on the best way to go about receiving the image and writing it to a file?

        // read the number of bytes in the image
        String noOfBytes = in.readLine();
        Toast.makeText(this, noOfBytes, 5).show();
        byte bytes[] = new byte[Integer.parseInt(noOfBytes)];
        // create a file to store the retrieved image
        File photo = new File(Environment.getExternalStorageDirectory(), "PostKey.jpg");
        DataInputStream dis = new DataInputStream(link.getInputStream());
        try {
            os = new FileOutputStream(photo);
            byte buf[] = new byte[1024];
            int len;
            while ((len = dis.read(buf)) > 0)
                os.write(buf, 0, len);
            Toast.makeText(this, "File recieved", 5).show();
            os.close();
            dis.close();
        } catch (IOException e) {
            Toast.makeText(this, "An IO Error Occured", 5).show();
        }

    EDIT: I still can't seem to get it working. I have been at it since, and all my efforts have resulted either in a file that is not the full size or in the app crashing. I know the file is not corrupt on the server side before sending. As far as I can tell it is definitely being sent, too, as the sendall method in Python sends everything or throws an exception in the event of an error, and so far it has never thrown an exception. So the client side is messed up. I have to send the file from the server, so I can't use the approach suggested by Brian.

    Read the article

  • Testing Entity Framework applications, pt. 3: NDbUnit

    - by Thomas Weller
    This is the third of a three part series that deals with the issue of faking test data in the context of a legacy app that was built with Microsoft's Entity Framework (EF) on top of an MS SQL Server database – a scenario that can be found very often. Please read the first part for a description of the sample application, a discussion of some general aspects of unit testing in a database context, and of some more specific aspects of the here discussed EF/MSSQL combination. Lately, I wondered how you would ‘mock’ the data layer of a legacy application, when this data layer is made up of an MS Entity Framework (EF) model in combination with a MS SQL Server database. Originally, this question came up in the context of how you could enable higher-level integration tests (automated UI tests, to be exact) for a legacy application that uses this EF/MSSQL combo as its data store mechanism – a not so uncommon scenario. The question sparked my interest, and I decided to dive into it somewhat deeper. What I've found out is, in short, that it's not very easy and straightforward to do it – but it can be done. The two strategies that are best suited to fit the bill involve using either the (commercial) Typemock Isolator tool or the (free) NDbUnit framework. The use of Typemock was discussed in the previous post, this post now will present the NDbUnit approach... NDbUnit is an Apache 2.0-licensed open-source project, and like so many other Nxxx tools and frameworks, it is basically a C#/.NET port of the corresponding Java version (DbUnit namely). In short, it helps you in flexibly managing the state of a database in that it lets you easily perform basic operations (like e.g. Insert, Delete, Refresh, DeleteAll)  against your database and, most notably, lets you feed it with data from external xml files. Let's have a look at how things can be done with the help of this framework. Preparing the test data Compared to Typemock, using NDbUnit implies a totally different approach to meet our testing needs.  So the here described testing scenario requires an instance of an SQL Server database in operation, and it also means that the Entity Framework model that sits on top of this database is completely unaffected. First things first: For its interactions with the database, NDbUnit relies on a .NET Dataset xsd file. See Step 1 of their Quick Start Guide for a description of how to create one. 
With this prerequisite in place then, the test fixture's setup code could look something like this: [TestFixture, TestsOn(typeof(PersonRepository))] [Metadata("NDbUnit Quickstart URL",           "http://code.google.com/p/ndbunit/wiki/QuickStartGuide")] [Description("Uses the NDbUnit library to provide test data to a local database.")] public class PersonRepositoryFixture {     #region Constants     private const string XmlSchema = @"..\..\TestData\School.xsd";     #endregion // Constants     #region Fields     private SchoolEntities _schoolContext;     private PersonRepository _personRepository;     private INDbUnitTest _database;     #endregion // Fields     #region Setup/TearDown     [FixtureSetUp]     public void FixtureSetUp()     {         var connectionString = ConfigurationManager.ConnectionStrings["School_Test"].ConnectionString;         _database = new SqlDbUnitTest(connectionString);         _database.ReadXmlSchema(XmlSchema);         var entityConnectionStringBuilder = new EntityConnectionStringBuilder         {             Metadata = "res://*/School.csdl|res://*/School.ssdl|res://*/School.msl",             Provider = "System.Data.SqlClient",             ProviderConnectionString = connectionString         };         _schoolContext = new SchoolEntities(entityConnectionStringBuilder.ConnectionString);         _personRepository = new PersonRepository(this._schoolContext);     }     [FixtureTearDown]     public void FixtureTearDown()     {         _database.PerformDbOperation(DbOperationFlag.DeleteAll);         _schoolContext.Dispose();     }     ...  As you can see, there is slightly more fixture setup code involved if your tests are using NDbUnit to provide the test data: Because we're dealing with a physical database instance here, we first need to pick up the test-specific connection string from the test assemblies' App.config, then initialize an NDbUnit helper object with this connection along with the provided xsd file, and also set up the SchoolEntities and the PersonRepository instances accordingly. The _database field (an instance of the INdUnitTest interface) will be our single access point to the underlying database: We use it to perform all the required operations against the data store. To have a flexible mechanism to easily insert data into the database, we can write a helper method like this: private void InsertTestData(params string[] dataFileNames) {     _database.PerformDbOperation(DbOperationFlag.DeleteAll);     if (dataFileNames == null)     {         return;     }     try     {         foreach (string fileName in dataFileNames)         {             if (!File.Exists(fileName))             {                 throw new FileNotFoundException(Path.GetFullPath(fileName));             }             _database.ReadXml(fileName);             _database.PerformDbOperation(DbOperationFlag.InsertIdentity);         }     }     catch     {         _database.PerformDbOperation(DbOperationFlag.DeleteAll);         throw;     } } This lets us easily insert test data from xml files, in any number and in a  controlled order (which is important because we eventually must fulfill referential constraints, or we must account for some other stuff that imposes a specific ordering on data insertion). Again, as with Typemock, I won't go into API details here. - Unfortunately, there isn't too much documentation for NDbUnit anyway, other than the already mentioned Quick Start Guide (and the source code itself, of course) - a not so uncommon problem with smaller Open Source Projects. 
Last not least, we need to provide the required test data in xml form. A snippet for data from the People table might look like this, for example: <?xml version="1.0" encoding="utf-8" ?> <School xmlns="http://tempuri.org/School.xsd">   <Person>     <PersonID>1</PersonID>     <LastName>Abercrombie</LastName>     <FirstName>Kim</FirstName>     <HireDate>1995-03-11T00:00:00</HireDate>   </Person>   <Person>     <PersonID>2</PersonID>     <LastName>Barzdukas</LastName>     <FirstName>Gytis</FirstName>     <EnrollmentDate>2005-09-01T00:00:00</EnrollmentDate>   </Person>   <Person>     ... You can also have data from various tables in one single xml file, if that's appropriate for you (but beware of the already mentioned ordering issues). It's true that your test assembly may end up with dozens of such xml files, each containing quite a big amount of text data. But because the files are of very low complexity, and with the help of a little bit of Copy/Paste and Excel magic, this appears to be well manageable. Executing some basic tests Here are some of the possible tests that can be written with the above preparations in place: private const string People = @"..\..\TestData\School.People.xml"; ... [Test, MultipleAsserts, TestsOn("PersonRepository.GetNameList")] public void GetNameList_ListOrdering_ReturnsTheExpectedFullNames() {     InsertTestData(People);     List<string> names =         _personRepository.GetNameList(NameOrdering.List);     Assert.Count(34, names);     Assert.AreEqual("Abercrombie, Kim", names.First());     Assert.AreEqual("Zheng, Roger", names.Last()); } [Test, MultipleAsserts, TestsOn("PersonRepository.GetNameList")] [DependsOn("RemovePerson_CalledOnce_DecreasesCountByOne")] public void GetNameList_NormalOrdering_ReturnsTheExpectedFullNames() {     InsertTestData(People);     List<string> names =         _personRepository.GetNameList(NameOrdering.Normal);     Assert.Count(34, names);     Assert.AreEqual("Alexandra Walker", names.First());     Assert.AreEqual("Yan Li", names.Last()); } [Test, TestsOn("PersonRepository.AddPerson")] public void AddPerson_CalledOnce_IncreasesCountByOne() {     InsertTestData(People);     int count = _personRepository.Count;     _personRepository.AddPerson(new Person { FirstName = "Thomas", LastName = "Weller" });     Assert.AreEqual(count + 1, _personRepository.Count); } [Test, TestsOn("PersonRepository.RemovePerson")] public void RemovePerson_CalledOnce_DecreasesCountByOne() {     InsertTestData(People);     int count = _personRepository.Count;     _personRepository.RemovePerson(new Person { PersonID = 33 });     Assert.AreEqual(count - 1, _personRepository.Count); } Not much difference here compared to the corresponding Typemock versions, except that we had to do a bit more preparational work (and also it was harder to get the required knowledge). But this picture changes quite dramatically if we look at some more demanding test cases: Ok, and what if things are becoming somewhat more complex? Tests like the above ones represent the 'easy' scenarios. They may account for the biggest portion of real-world use cases of the application, and they are important to make sure that it is generally sound. But usually, all these nasty little bugs originate from the more complex parts of our code, or they occur when something goes wrong. So, for a testing strategy to be of real practical use, it is especially important to see how easy or difficult it is to mimick a scenario which represents a more complex or exceptional case. 
The following test, for example, deals with the case that there is some sort of invalid input from the caller: [Test, MultipleAsserts, TestsOn("PersonRepository.GetCourseMembers")] [Row(null, typeof(ArgumentNullException))] [Row("", typeof(ArgumentException))] [Row("NotExistingCourse", typeof(ArgumentException))] public void GetCourseMembers_WithGivenVariousInvalidValues_Throws(string courseTitle, Type expectedInnerExceptionType) {     var exception = Assert.Throws<RepositoryException>(() =>                                 _personRepository.GetCourseMembers(courseTitle));     Assert.IsInstanceOfType(expectedInnerExceptionType, exception.InnerException); } Apparently, this test doesn't need an 'Arrange' part at all (see here for the same test with the Typemock tool). It acts just like any other client code, and all the required business logic comes from the database itself. This doesn't always necessarily mean that there is less complexity, but only that the complexity happens in a different part of your test resources (in the xml files namely, where you sometimes have to spend a lot of effort for carefully preparing the required test data). Another example, which relies on an underlying 1-n relationship, might be this: [Test, MultipleAsserts, TestsOn("PersonRepository.GetCourseMembers")] public void GetCourseMembers_WhenGivenAnExistingCourse_ReturnsListOfStudents() {     InsertTestData(People, Course, Department, StudentGrade);     List<Person> persons = _personRepository.GetCourseMembers("Macroeconomics");     Assert.Count(4, persons);     Assert.ForAll(         persons,         @p => new[] { 10, 11, 12, 14 }.Contains(@p.PersonID),         "Person has none of the expected IDs."); } If you compare this test to its corresponding Typemock version, you immediately see that the test itself is much simpler, easier to read, and thus much more intention-revealing. The complexity here lies hidden behind the call to the InsertTestData() helper method and the content of the used xml files with the test data. And also note that you might have to provide additional data which are not even directly relevant to your test, but are required only to fulfill some integrity needs of the underlying database. Conclusion The first thing to notice when comparing the NDbUnit approach to its Typemock counterpart obviously deals with performance: Of course, NDbUnit is much slower than Typemock. Technically,  it doesn't even make sense to compare the two tools. But practically, it may well play a role and could or could not be an issue, depending on how much tests you have of this kind, how often you run them, and what role they play in your development cycle. Also, because the dataset from the required xsd file must fully match the database schema (even in parts that otherwise wouldn't be relevant to you), it can be quite cumbersome to be in a team where different people are working with the database in parallel. 
My personal experience is – as already said in the first part – that Typemock gives you a better development experience in a 'dynamic' scenario (when you're working in some kind of TDD-style, you're oftentimes executing the tests from your dev box, and your database schema changes frequently), whereas the NDbUnit approach is a good and solid solution in more 'static' development scenarios (when you need to execute the tests less frequently or only on a separate build server, and/or the underlying database schema can be kept relatively stable), for example some variations of higher-level integration or User-Acceptance tests. But in any case, opening Entity Framework based applications for testing requires a fair amount of resources, planning, and preparational work – it's definitely not the kind of stuff that you would call 'easy to test'. Hopefully, future versions of EF will take testing concerns into account. Otherwise, I don't see too much of a future for the framework in the long run, even though it's quite popular at the moment... The sample solution A sample solution (VS 2010) with the code from this article series is available via my Bitbucket account from here (Bitbucket is a hosting site for Mercurial repositories. The repositories may also be accessed with the Git and Subversion SCMs - consult the documentation for details. In addition, it is possible to download the solution simply as a zipped archive – via the 'get source' button on the very right.). The solution contains some more tests against the PersonRepository class, which are not shown here. Also, it contains database scripts to create and fill the School sample database. To compile and run, the solution expects the Gallio/MbUnit framework to be installed (which is free and can be downloaded from here), the NDbUnit framework (which is also free and can be downloaded from here), and the Typemock Isolator tool (a fully functional 30day-trial is available here). Moreover, you will need an instance of the Microsoft SQL Server DBMS, and you will have to adapt the connection strings in the test projects App.config files accordingly.

    Read the article

  • Solaris 11 Launch Blog Carnival Roundup

    - by constant
    Solaris 11 is here! And together with the official launch activities, a lot of Oracle and non-Oracle bloggers contributed helpful and informative blog articles to help your datacenter go to eleven. Here are some notable blog postings, sorted by category for your Solaris 11 blog-reading pleasure: Getting Started/Overview A lot of people speculated that the official launch of Solaris 11 would be on 11/11 (whatever way you want to turn it), but it actually happened two days earlier. Larry Wake himself offers 11 Reasons Why Oracle Solaris 11 11/11 Isn't Being Released on 11/11/11. Then, Larry goes on with a summary: Oracle Solaris 11: The First Cloud OS gives you a short and sweet rundown of what the major new features of Solaris 11 are. Jeff Victor has his own list of What's New in Oracle Solaris 11. A popular Solaris 11 meme is to write a blog post about 11 favourite features: Jim Laurent's 11 Reasons to Love Solaris 11, Darren Moffat's 11 Favourite Solaris 11 Features, Mike Gerdt's 11 of My Favourite Things! are just three examples of "11 Favourite Things..." type blog posts, I'm sure many more will follow... More official overview content for Solaris 11 is available from the Oracle Tech Network Solaris 11 Portal. Also, check out Rick Ramsey's blog post Solaris 11 Resources for System Administrators on the OTN Blog and his secret 5 Commands That Make Solaris Administration Easier post from the OTN Garage. (Automatic) Installation and the Image Packaging System (IPS) The brand new Image Packaging System (IPS) and the Automatic Installer (IPS), together with numerous other install/packaging/boot/patching features are among the most significant improvements in Solaris 11. But before installing, you may wonder whether Solaris 11 will support your particular set of hardware devices. Again, the OTN Garage comes to the rescue with Rick Ramsey's post How to Find Out Which Devices Are Supported By Solaris 11. Included is a useful guide to all the first steps to get your Solaris 11 system up and running. Tim Foster had a whole handful of blog posts lined up for the launch, teaching you everything you need to know about IPS but didn't dare to ask: The IPS System Repository, IPS Self-assembly - Part 1: Overlays and Part 2: Multiple Packages Delivering Configuration. Watch out for more IPS posts from Tim! If installing packages or upgrading your system from the net makes you uneasy, then you're not alone: Jim Laurent will tech you how Building a Solaris 11 Repository Without Network Connection will make your life easier. Many of you have already peeked into the future by installing Solaris 11 Express. If you're now wondering whether you can upgrade or whether a fresh install is necessary, then check out Alan Hargreaves's post Upgrading Solaris 11 Express b151a with support to Solaris 11. The trick is in upgrading your pkg(1M) first. Networking One of the first things to do after installing Solaris 11 (or any operating system for that matter), is to set it up for networking. Solaris 11 comes with the brand new "Network Auto-Magic" feature which can figure out everything by itself. For those cases where you want to exercise a little more control, Solaris 11 left a few people scratching their heads. Fortunately, Tschokko wrote up this cool blog post: Solaris 11 manual IPv4 & IPv6 configuration right after the launch ceremony. Thanks, Tschokko! 
And Milek points out a long awaited networking feature in Solaris 11 called Solaris 11 - hostmodel, which I know for a fact that many customers have looked forward to: How to "bind" a Solaris 11 system to a specific gateway for specific IP address it is using. Steffen Weiberle teaches us how to tune the Solaris 11 networking stack the proper way: ipadm(1M). No more fiddling with ndd(1M)! Check out his tutorial on Solaris 11 Network Tunables. And if you want to get even deeper into the networking stack, there's nothing better than DTrace. Alan Maguire teaches you in: DTracing TCP Congestion Control how to probe deeply into the Solaris 11 TCP/IP stack, the TCP congestion control part in particular. Don't miss his other DTrace and TCP related blog posts! DTrace And there we are: DTrace, the king of all observability tools. Long time DTrace veteran and co-author of The DTrace book*, Brendan Gregg blogged about Solaris 11 DTrace syscall provider changes. BTW, after you install Solaris 11, check out the DTrace toolkit which is installed by default in /usr/dtrace/DTT. It is chock full of handy DTrace scripts, many of which contributed by Brendan himself! Security Another big theme in Solaris 11, and one that is crucial for the success of any operating system in the Cloud is Security. Here are some notable posts in this category: Darren Moffat starts by showing us how to completely get rid of root: Completely Disabling Root Logins on Solaris 11. With no root user, there's one major entry point less to worry about. But that's only the start. In Immutable Zones on Encrypted ZFS, Darren shows us how to double the security of your services: First by locking them into the new Immutable Zones feature, then by encrypting their data using the new ZFS encryption feature. And if you're still missing sudo from your Linux days, Darren again has a solution: Password (PAM) caching for Solaris su - "a la sudo". If you're wondering how much compute power all this encryption will cost you, you're in luck: The Solaris X86 AESNI OpenSSL Engine will make sure you'll use your Intel's embedded crypto support to its fullest. And if you own a brand new SPARC T4 machine you're even luckier: It comes with its own SPARC T4 OpenSSL Engine. Dan Anderson's posts show how there really is now excuse not to encrypt any more... Developers Solaris 11 has a lot to offer to developers as well. Ali Bahrami has a series of blog posts that cover diverse developer topics: elffile: ELF Specific File Identification Utility, Using Stub Objects and The Stub Proto: Not Just For Stub Objects Anymore to name a few. BTW, if you're a developer and want to shape the future of Solaris 11, then Vijay Tatkar has a hint for you: Oracle (Sun Systems Group) is hiring! Desktop and Graphics Yes, Solaris 11 is a 100% server OS, but it can also offer a decent desktop environment, especially if you are a developer. Alan Coopersmith starts by discussing S11 X11: ye olde window system in today's new operating system, then Calum Benson shows us around What's new on the Solaris 11 Desktop. Even accessibility is a first-class citizen in the Solaris 11 user interface. Peter Korn celebrates: Accessible Oracle Solaris 11 - released! Performance Gone are the days of "Slowaris", when Solaris was among the few OSes that "did the right thing" while others cut corners just to win benchmarks. Today, Solaris continues doing the right thing, and it delivers the right performance at the same time. Need proof? 
Check out Brian's BestPerf blog with continuous updates from the benchmarking lab, including Recent Benchmarks Using Oracle Solaris 11! Send Me More Solaris 11 Launch Articles! These are just a few of the more interesting blog articles that came out around the Solaris 11 launch, I'm sure there are many more! Feel free to post a comment below if you find a particularly interesting blog post that hasn't been listed so far and share your enthusiasm for Solaris 11! *Affiliate link: Buy cool stuff and support this blog at no extra cost. We both win!

    Read the article

  • Why It Is So Important to Know Your Customer

    - by Christie Flanagan
    Over the years, I endured enough delayed flights, air turbulence and misadventures in airport security clearance to watch my expectations for the air travel experience fall to abysmally low levels. The extent of my loyalty to any one carrier had more to do with the proximity of the airport parking garage to their particular gate than to any effort on the airline’s part to actually earn and retain my business. That all changed one day when I found myself at the airport hoping to catch a return flight home a few hours earlier than expected, using an airline I had flown with for the first time just that week.  When you travel regularly for business, being able to catch a return flight home that’s even an hour or two earlier than originally scheduled is a big deal. It can mean the difference between having a normal evening with your family and having to sneak in like a cat burglar after everyone is fast asleep. And so I found myself on this particular day hoping to catch an earlier flight home. I approached the gate agent and was told that I could go on standby for their next flight out. Then I asked how much it was going to cost to change the flight, knowing full well that I wouldn’t get reimbursed by my company for any change fees. “Oh, there’s no charge to fly on standby,” the gate agent told me. I made a funny look. I couldn’t believe what I was hearing. This airline was going to let my fly on standby, at no additional charge, even though I was a new customer with no status or points. It had been years since I’d seen an airline pass up a short term revenue generating opportunity in favor of a long term loyalty generating one.  At that moment, this particular airline gained my loyal business. Since then, this airline has had the opportunity to learn a lot about me. They know where I live, where I fly from, where I usually fly to, and where I like to sit on the plane. In general, I’ve found their customer service to be quite good whether at the airport, via call center and even through social channels. They email me occasionally, and when they do, they demonstrate that they know me by promoting deals for flights from where I live to places that I’d be interested in visiting. And that’s part of why I’m always so puzzled when I visit their website.Does this company with the great service, customer friendly policies, and clean planes demonstrate that they know me at all when I visit their website? The answer is no. Even when I log in using my loyalty program credentials, it’s pretty obvious that they’re presenting the same old home page and same old offers to every single one of their site visitors. I mean, those promotional offers that they’re featuring so prominently  -- they’re for flights that originate thousands of miles from where I live! There’s no way I’d ever book one of those flights and I’m sure I’m not the only one of their customers to feel that way.My reason for recounting this story is not to pick on the one customer experience flaw I've noticed with this particular airline, in fact, they do so many things right that I’ll continue to fly with them. But I did want to illustrate just how glaringly obvious it is to customers today when a touch point they have with a brand is impersonal, unconnected and out of sync. 
As someone who’s spent a number of years in the web experience management and online marketing space, it particularly peeves me when that out of sync touch point is a brand’s website, perhaps because I know how important it is to make a customer’s online experience relevant and how many powerful tools are available for making a relevant experience a reality. The fact is, delivering a one-size-fits-all online customer experience is no longer acceptable or particularly effective in today’s world. Today’s savvy customers expect you to know who they are and to understand their preferences, behavior and relationship with your brand. Not only do they expect you to know about them, but they also expect you to demonstrate this knowledge across all of their touch points with your brand in a consistent and compelling fashion, whether it be on your traditional website, your mobile web presence or through various social channels. Delivering the kind of personalized online experiences that customers want can have tremendous business benefits. This is not just about generating feelings of goodwill and higher customer satisfaction ratings either. More relevant and personalized online experiences boost the effectiveness of online marketing initiatives, and the statistics prove this out. Personalized web experiences can help increase online conversion rates by 70% -- that’s a huge number.1 And more than three quarters of consumers indicate that they’ve made additional online purchases based on personalized product recommendations.2 Now if only this airline would get on board with delivering a more personalized online customer experience. I’d certainly be happier and more likely to spring for one of their promotional offers. And by targeting relevant offers on their home page to appropriate segments of their site visitors, I bet they’d be happier and generating additional revenue too. ***** If you're interested in hearing more perspectives on the benefits of demonstrating that you know your customers by delivering a more personalized experience, check out this white paper on creating a successful and meaningful customer experience on the web. Also catch the video below on the business value of CX in attracting new customers featuring Oracle's VP of Customer Experience Strategy, Brian Curran. 1 Search Engine Watch 2 Marketing Charts

    Read the article

  • MIX 2010 Covert Operations Day 2 Silverlight + Windows 7 Phone

    - by GeekAgilistMercenary
    Left the Circus Circus and headed to the geek circus at Mandalay Bay.  Got in, got some breakfast, met a few more people and headed to the keynote. Upon arriving the crew I was hanging with at the event; Erik Mork, Beth Murray, and Brian Henderson and I were entertained with several other thousand geeks by the wicked yo-yoing. The first video demo of something was of Bing Maps and various aspects of Microsoft Research integrated together.  Namely the pictures, put in place, on real 3d element maps of various environments. Silverlight Scott Guthrie, as one would guess, kicked off the keynote.  His first point was that user experience has become a priority at Microsoft.  This can be seen by any observant soul with the release and push of Expression, Silverlight, and the other tools.  This is even more apparent when one takes note of Microsoft bringing in people that can actually do good design and putting them at the forefront. The next thing Scott brought up was a few key points about Silverlight.  Currently Silverlight is a little over 2 years old and has achieved a pretty solid 60% penetration.  Silverlight has all sorts of capabilities that have been developed and are now provided as open source including;  ad injection, smoothing, playback editing, and more.  Another thing he showed, which really struck me as awesome being in the analytics space, was the Olympics and a quick glimpse of the ad statistics, viewer experience, video playback performance, audience trends, and overall viewer participation.  All of it rendered in Silverlight in beautiful detail. The key piece of Scott's various points were all punctuated with the fact that all of this code is available as open source.  Not only is Microsoft really delving into this design element of things, they're getting involved in the right ways. One of the last points I'll bring up about Silverlight 4 is the ability to have HD video on a monitor, and an entirely different activity being done on the other monitor, effectively making Silverlight the only RIA framework that supports multi-monitor support.  Overall, Silverlight is continuing to impress – providing superior capabilities tit-for-tat with the competition. Windows 7 Phone The Windows 7 Phone has 3 primary buttons (yes, more than the iPhone, don't let your mind explode!!).  Start, Search, and Back control all of the needed functionality of the phone.  At the same time, of course, there is the multi-touch, touch, and other interactive abilities of the interface.  The intent, once start is pressed is to have all the information that a phone owner wants displayed immediately.  Avoiding the scrolling through pages of apps or rolling a ball to get through multitudes of other non-interactive phone interfaces.  The Windows 7 Phone simply has the data right in front of you, basically a phone dashboard.  From there it is easy to dive into the interactive areas of the phone. Each area of the interface of the phone is broken into hubs.  These hubs include applications, data, and other things based on a relative basis.  This basis being determined by the user.  These applications interact on many other levels, and form a kind of relationship between each other adding more and more meta-data to the phone user, their interactions between the applications, and of course the social element of their interactions on the phone.  This makes this phone a practical must have for a marketer involved in social media.  
The level of wired together interaction is massive, and of course, if you've seen Office Outlook 2010 you know that the power that is pulled into the phone by being tied to Outlook is massive. Joe Belfiore also showed several UI & specifically UX elements of the phone interface that allows paging to be instinctual by simple clipped items, flipping page to page, and other excellent user experience advances for phone devices.  Belfiore's also showed how his people hub had a massive list of people, with pictures, all from various different social networks and other associated relations.  The rendering, speed, and viewing of these people's, their pictures, their social network information, and other characteristics was smooth and in some situations unbelievably rendered.  This demo showed some of the great power of the beta phone, which isn't even as powerful as the planned end device. Joe finished up by jumping into the music, videos, and other media with the Zune Component of the Windows 7 Mobile Phone.  This was all good stuff, but I'll get to what really sold me on the media element in a moment. When Joe was done, Scott Guthrie stepped back up to walk through building a Windows 7 Mobile Phone.  This is were I have to give serious props.  He built this application, in Visual Studio 2010, in front of 2000+ people.  That was cool, but what really was amazing that he build the application in about 2 minutes.  The IDE, side by side design that is standard in Visual Studio is light years ahead of x-Code or any of the iPhone IDEs.  The Windows 7 Mobile System, if it can get market penetration, poses a technologically superior development and phone platform over anything on the market right now.  The biggest problem with the phone, is it just isn't available yet.  I personally can't wait for a chance to build some apps for the new Windows Phone. Netflix, I May Start Up an Account Again! When I get my Windows 7 Phone device, I am absolutely getting a Netflix account again.  The Vertigo crew, as I wrote on Twitter "#MIX10 Props @seesharp on @netflix demo", displayed an application on the phone for Netflix that actually ran HD Video of Rescue Me (with Dennis Leary).  The video played back smooth as it would on a dedicated computer, I was instantly sold.  So this didn't actually sell me on the phone, because I'm already sold, but it did sell me whole heartedly on the media capabilities of the pending phone. Anyway, I try not to do this but I may double post today.  Lunch is over and I'm off to another session very near and dear to the heart of my occupation, Analytics Tracking.  Stay tuned and I should have that post up by the end of the day. Original Post – Check out my other blog for even more technical ramblings and reads.

    Read the article

  • Advanced Continuous Delivery to Azure from TFS, Part 1: Good Enough Is Not Great

    - by jasont
    The folks over on the TFS / Visual Studio team have been working hard at releasing a steady stream of new features for their new hosted Team Foundation Service in the cloud. One of the most significant features released was simple continuous delivery of your solution into your Azure deployments. The original announcement from Brian Harry can be found here. Team Foundation Service is a great platform for .Net developers who are used to working with TFS on-premises. I’ve been using it since it became available at the //BUILD conference in 2011, and when I recently came to work at Stackify, it was one of the first changes I made. Managing work items is much easier than the tool we were using previously, although there are some limitations (more on that in another blog post). However, when continuous deployment was made available, it blew my mind. It was the killer feature I didn’t know I needed. Not to say that I wasn’t previously an advocate for continuous delivery; just that it was always a pain to set up and configure. Having it hosted - and a one-click setup – well, that’s just the best thing since sliced bread. It made perfect sense: my source code is in the cloud, and my deployment is in the cloud. Great! I can queue up a build from my iPad or phone and just let it go! I quickly tore through the quick setup and saw it all work… sort of. This will be the first in a three part series on how to take the building block of Team Foundation Service continuous delivery and build a CD model that will actually work for any team deploying something more advanced than a “Hello World” example. Part 1: Good Enough Is Not Great Part 2: A Model That Works: Branching and Multiple Deployment Environments Part 3: Other Considerations: SQL, Custom Tasks, Etc Good Enough Is Not Great There. I’ve said it. I certainly hope no one on the TFS team is offended, but it’s the truth. Let’s take a look under the hood and understand how it works, and then why it’s not enough to handle real world CD as-is. How it works. (note that I’ve skipped a couple of steps; I already have my accounts set up and something deployed to Azure) The first step is to establish some oAuth magic between your Azure management portal and your TFS Instance. You do this via the management portal. Once it’s done, you have a new build process template in your TFS instance. (Image lifted from the documentation) From here, you’ll get the usual prompts for security, allowing access, etc. But you’ll also get to pick which Solution in your source control to build. Here’s what the bulk of the build definition looks like. All I’ve had to do is add in the solution to build (notice that mine is from a specific branch – Release – more on that later) and I’ve changed the configuration. I trigger the build, and voila! I have an Azure deployment a few minutes later. The beauty of this is that it’s all in the cloud and I’m not waiting for my machine to compile and upload the package. (I also had to enable the build definition first – by default it is created in disabled state, probably a good thing since it will trigger on every.single.checkin by default.) I get to see a history of deployments from the Azure portal, and can link into TFS to see the associated changesets and work items. You’ll notice also that this build definition also automatically put my code in the Staging slot of my Azure deployment – more on this soon. For now, I can VIP swap and be in production. (P.S. I hate VIP swap and “production” and “staging” in Azure. 
More on that later too.) That’s it. That’s the default out-of-box experience. Easy, right? But it’s full of room for improvement, so let’s get into that….   The Problems Nothing is perfect (except my code – it’s always perfect), and neither is Continuous Deployment without a bit of work to help it fit your dev team’s process. So what are the issues? Issue 1: Staging vs QA vs Prod vs whatever other environments your team may have. This, for me, is the big hairy one. Remember how this automatically deployed to staging rather than prod for us? There are a couple of issues with this model: If I want to deliver to prod, it requires intervention on my part after deployment (via a VIP swap). If I truly want to promote between environments (i.e. Nightly Build –> Stable QA –> Production) I likely have configuration changes between each environment such as database connection strings and this process (and the VIP swap) doesn’t account for this. Yet. Issue 2: Branching and delivering on every check-in. As I mentioned above, I have set this up to target a specific branch – Release – of my code. For the purposes of this example, I have adopted the “basic” branching strategy as defined by the ALM Rangers. This basically establishes a “Main” trunk where you branch off Dev and Release branches. Granted, the Release branch is usually the only thing you will deploy to production, but you certainly don’t want to roll to production automatically when you merge to the Release branch and check-in (unless you like the thrill of it, and in that case, I like your style, cowboy….). Rather, you have nightly build and QA environments, or if you’ve adopted the feature-branch model you have environments for those. Those are the environments you want to continuously deploy to. But that takes us back to Issue 1: we currently have a 1:1 solution to Azure deployment target. Issue 3: SQL and other custom tasks. Let’s be honest and address the elephant in the room: I need to get some sleep because I see an elephant in the room. But seriously, I can’t think of an application I have touched in the last 10 years that doesn’t need to consider SQL changes when deploying code and upgrading an environment. Microsoft seems perfectly content to ignore this elephant for now: yes, they’ve added Data Tier Applications. But let’s be honest with ourselves again: no one really uses it, and it’s not suitable for anything more complex than a Hello World sample project database. Why? Because it doesn’t fit well into a great source control story. Developers make stored procedure and table changes all day long while coding complex applications, and if someone forgets to go update the DACPAC before the automated deployment, you have a broken build until it’s completed. Developers – not just DBAs – also like to work with SQL in SQL tools, not in Visual Studio. I’m really picking on SQL because that’s generally the biggest concern that I hear. But we need to account for any custom tasks as well in the build process.   The Solutions… ? We’ve taken a look at how this all works, and addressed the shortcomings. In my next post (which I promise will be very, very soon), I will detail how I’ve overcome these shortcomings and used this foundation to create a mature, flexible model for deploying my app – any version, any time, to any environment.

    Read the article

  • PASS: Election Changes for 2011

    - by Bill Graziano
    Last year after the election, the PASS Board created an Election Review Committee.  This group was charged with reviewing our election procedures and making suggestions to improve the process.  You can read about the formation of the group and review some of the intermediate work on the site – especially in the forums. I was one of the members of the group along with Joe Webb (Chair), Lori Edwards, Brian Kelley, Wendy Pastrick, Andy Warren and Allen White.  This group worked from October to April on our election process.  Along the way we: Interviewed interested parties including former NomCom members, Board candidates and anyone else that came forward. Held a session at the Summit to allow interested parties to discuss the issues Had numerous conference calls and worked through the various topics I can’t thank these people enough for the work they did.  They invested a tremendous number of hours thinking, talking and writing about our elections.  I’m proud to say I was a member of this group and thoroughly enjoyed working with everyone (even if I did finally get tired of all the calls.) The ERC delivered their recommendations to the PASS Board prior to our May Board meeting.  We reviewed those and made a few modifications.  I took their recommendations and rewrote them as procedures while incorporating those changes.  Their original recommendations as well as our final document are posted at the ERC documents page.  Please take a second and read them BEFORE we start the elections.  If you have any questions please post them in the forums on the ERC site. (My final document includes a change log at the end that I decided to leave in.  If you want to know which areas to pay special attention to that’s a good start.) Many of those recommendations were already posted in the forums or in the blogs of individual ERC members.  Hopefully nothing in the ERC document is too surprising. In this post I’m going to walk through some of the key changes and talk about what I remember from both ERC and Board discussions.  I’ll pay a little extra attention to things the Board changed from the ERC.  I’d also encourage any of the Board or ERC members to blog their thoughts on this. The Nominating Committee will continue to exist.  Personally, I was curious to see what the non-Board ERC members would think about the NomCom.  There was broad agreement that a group to vet candidates had value to the organization. The NomCom will be composed of five members.  Two will be Board members and three will be from the membership at large.  The only requirement for the three community members is that you’ve volunteered in some way (and volunteering is defined very broadly).  We expect potential at-large NomCom members to participate in a forum on the PASS site to answer questions from the other PASS members. We’re going to hold an election to determine the three community members.  It will be closer to voting for Summit sessions than voting for Board members.  That means there won’t be multiple dedicated emails.  If you’re at all paying attention it will be easy to participate.  Personally I wanted it easy for those that cared to participate but not overwhelm those that didn’t care.  I think this strikes a good balance. There’s also a clause that in order to be considered a winner in this NomCom election, you must receive 10 votes.  This is something I suggested.  I have no idea how popular the NomCom election is going to be.  
I just wanted a fallback in case no one participated and some random person got in with one or two votes. Any open slots will be filled by the NomCom chair (usually the PASS Immediate Past President). My assumption is that they would probably take the next highest vote getters unless they were throwing flames in the forums or clearly unqualified. As a final check, the Board still approves the final NomCom.

The NomCom is going to rank candidates instead of rating them. This has interesting implications. This was championed by another ERC member and I’m hoping they write something about it. This will really force the NomCom to make decisions between candidates. You can’t just rate everyone a 3 and be done with it. It may also make candidates appear further apart than they actually are. I’m looking forward to talking with the NomCom after this election and getting their feedback on this.

The PASS Board added an option to remove a candidate with a unanimous vote of the NomCom. This was primarily put in place to handle people that lied on their application, had a criminal background, or some other unusual situation that we figure out. We list an explicit goal of three candidates per open slot. We also wanted an easy way to find the NomCom candidate rankings from the ballot. Hopefully this will satisfy those that want a broad candidate pool and those that want the NomCom to identify the most qualified candidates.

The primary spokesperson for the NomCom is the committee chair. After the issues around the election last year we didn’t have a good communication plan in place. We should have, and that was a failure on the part of the Board. If there is criticism of the election this year I hope that falls squarely on the Board. The community members of the NomCom shouldn’t be fielding complaints over the election process. That said, the NomCom is ranking candidates and we are forcing them to rank some lower than others. I’m sure you’ll each find someone that you think should have been ranked differently. I also want to highlight one other change to the process that we started last year and that isn’t included in these documents. I think the candidate forums on the PASS site were tremendously helpful last year in helping people to find out more about candidates. That gives our members a way to ask hard questions of the candidates and publicly see their answers.

This year we have two important groups to fill. The first is the NomCom. We need three people from our membership to step up and fill this role. It won’t be easy. You will have to make subjective rankings of your fellow community members. Your actions will be important in deciding who the future leaders of PASS will be. There’s a 50/50 chance that one of the people you interview will be the President of PASS someday. This is not a responsibility to be taken lightly. The second is the slate of candidates. If you’ve ever thought about running for the Board, this is the year. We’ve never had nine candidates on the ballot before. Your chance of making it through the NomCom is higher than in any previous year. Unfortunately the more of you that run, the more of you that will lose in the election. And hopefully that competition will mean more community involvement and better Board members for PASS.

Is this the end of changes to the election process? It isn’t. Every year that I’ve been on the Board the election process has changed. Some years there have been small changes and some years there have been large changes.
After this election we’ll look at how the process worked and decide what steps to take – just like we do every year.

    Read the article

  • Thou shalt not put code on a pedestal - Code is a tool, no more, no less

    - by Ralf Westphal
    “Write great code and everything else becomes easier” is what Paul Pagel believes in. That's his version of an adage by Brian Marick he cites: “treat code as an end, not just a means.” And he concludes: “My post-Agile world is software craftsmanship.” I wonder if that's really the way to go. Will “simply” writing great code lead the software industry into the light? He's alluding to the philosopher Kant, who proposed that a human being should never be treated merely as a means, but always as an end. But should we transfer this ethical statement into the world of software? I doubt it.

Reason #1: Human beings are categorically different from code. They are autonomous entities who need to find a way of living happily together. To Kant it seemed this goal could only be reached if nobody (ab)used a human being for his/her purposes. Because using a human being, i.e. treating it as a means, would contradict the fundamental autonomy and freedom of human beings. People should hold up a symmetric view of their relationships: since nobody wants to be (ab)used, nobody should (ab)use anybody else. If you want to be treated decently, with respect, in accordance with your own free will - which means as an end - then do the same to other people. Code is dead; it's a product, a tool for people to reach their goals. No company spends any money on code other than to save money or earn money in the long run. Code is not a puppy. Enterprises do not commission software development just to feel good in its company. Code is not a buddy. Code is a slave, if you will. A mechanical slave, a non-tangible robot. Code is a tool, just a tool. And if we start to treat it differently, if we elevate its status unduly… I guess that will contort our relationship in a counterproductive way. Don't get me wrong: just because something is “just a tool”, “just a product” does not mean we should not be careful while designing, building, using it. Quite the contrary. We should be very careful when writing code – but not for the code's sake! We should be careful because we respect our customers, who are fellow human beings who should be treated as an end. If we are careless, neglectful, ignorant when producing code on their behalf, then we're using them. Being sloppy means you're caring more for yourself than for your customer. You're then treating the customer as a means to fulfill some of your own needs. That's plain unethical behavior.

Reason #2: The focus should always be on your purpose, not on any tool. But if code is treated as an end, then the focus is on the code. That might sound right, because where else should your focus be as a software developer? But, well, I'd say your focus should be on delivering value to your customer. Because in the end your customer does not care if you write a single line of code. She just wants her problem to be solved. Solving problems is the purpose of any contractor. Code must be treated just as a means, a tool we know how to handle very well. But if we're really trying to be craftsmen then we should be conscious about exactly that and act ethically. That means we must never be so focused on our tool as to be unable to suggest better solutions to the problems of our customers than code.

I'm all with Paul when he urges us to “Write great code”. Sure, if you need to write code, then by all means do so. Write the best code you can think of – and then try to improve it.
Paul has all the best intentions when he signs on to Brian's “treat code as an end” - but as we all know: “The road to hell is paved with best intentions” ;-) Yes, I can imagine a “hell of code focus”. In fact, I don't need to imagine it; I see it quite often. Because code hell is wherever two developers stand together and are so immersed in talking about all sorts of coding tricks, design patterns, code smells, technologies, platforms, tools that they lose sight of the big picture. Talking about TDD or SOLID or refactoring is a sign of consciousness – relative to the “cowboy coders” view of the world. But from yet another point of view TDD, SOLID, and refactoring are just cures for ailments within a system. And I fear that if “Writing great code” is the only focus or the main focus of software development, then we as an industry lose the ability to see that. Focus draws a line around something; it defines a horizon for perception and thinking. So if we focus on code, our horizon ends where “the land of code” ends. I don't think that should be our professional attitude.

So what about Software Craftsmanship as the next big thing after Agility? I think Software Craftsmanship has an important message for all software developers and beyond. But to make it the successor of the Agility movement seems to miss a point. Agility never claimed to solve all software development problems, I'd say. So to blame it for having missed out on certain aspects of it is wrong. If I had to summarize Agility in one word I'd say “Value”. Agility put value for the customer back in software development. Focus on delivering value early and often – that's Agility's mantra. All else follows from that. And I ask you: Is that obsolete? Is delivering value not hip anymore? No, surely not. That's our very purpose as software developers. So how can Agility become obsolete and need to be replaced?

We need to do away with this “either/or” thinking. It's either Agility or Lean or Software Craftsmanship or whatnot. Instead we should start integrating concepts and movements. Think “both/and”. Think Agility plus Software Craftsmanship plus Lean plus whatnot. We don't need to tear down anything from a pedestal and replace it with a new idol. Instead we should do away with pedestals and arrange whatever is helpful in a circle. Then we can turn to each concept or movement for whatever it is best at. After 10 years of Agility we should be able to identify what it was good at – and keep that. Keep Agility around and add whatever Agility was lacking or never concerned with. Add whatever is at the core of Software Craftsmanship. Add whatever is at the core of Lean, etc. But don't call out the age of Post-Agility. Because it had better never end. Because once we start to lose Agility's core, we lose focus on the customer.

    Read the article

  • How to put the result at the end using jQuery

    - by Martty Rox
    My code is below: the HTML with the JS in the first section. Please check it out. I need to produce the result of the form of five questions and display it in the last section of the page. Where am I going wrong? I am using innerHTML in the JS to put the result in the last section.

<div id="section1">
<script type="text/javascript">
function changeText2(){
  alert("working");
  var count1=0;
  var a=document.forms["myForm"]["drop1"].value;
  var b=document.forms["myForm"]["drop2"].value;
  alert(document.forms["myForm"]["drop2"].value);
  var c=document.forms["myForm"]["drop3"].value;
  var d=document.forms["myForm"]["drop4"].value;
  var e=document.forms["myForm"]["drop5"].value;
  var f=document.forms["myForm"]["drop6"].value;
  if(a===2) { count1++; alert(count1); } else { alert("lit"); }
  if(b===2) { count1++; } else { alert("lit"); }
  if(c===2) { count1++; } else { alert("lit"); }
  if(d===2) { count1++; } else { alert("lit"); }
  if(e===2) { count1++; } else alert("lit");
}
alert(count1);
document.getElementById('boldStuff2').innerHTML = count1;
</script>

<form name="myForm">
<p>1)&#x00A0;&#x00A0;Who won the 1993 &#x201C;King of the Ring&#x201D;?</p>
<div> <select id="f1" name="drop1"> <option value="0" selected="selected">-- Select -- </option> <option value="1">Owen Hart </option> <option value="2">Bret Hart </option> <option value="3">Edge </option> <option value="4">Mabel </option> </select> </div> <!--que1-->
<p>2)&#x00A0;&#x00A0;What NHL goaltender has the most career wins?</p>
<div> <select id="f2" name="drop2"> <option value="0" selected="selected">-- Select -- </option> <option value="1">Grant Fuhr </option> <option value="2">Patrick Roy </option> <option value="3">Chris Osgood </option> <option value="4">Mike Vernon</option> </select> </div> <!--que2-->
<p>3)&#x00A0;&#x00A0;What Major League Baseball player holds the record for all-time career high batting average?</p>
<div> <select id="f3" name="drop3"> <option value="0" selected="selected">-- Select -- </option> <option value="1">Ty Cobb </option> <option value="2">Larry Walker </option> <option value="3">Jeff Bagwell </option> <option value="4">Frank Thomas</option> </select> </div> <!--que3-->
<p>4)&#x00A0;&#x00A0;Who among the following is NOT associated with billiards in India?</p>
<div> <select id="f4" name="drop4" > <option value="0" selected="selected">-- Select -- </option> <option value="1">Subash Agrawal </option> <option value="2">Ashok Shandilya </option> <option value="3">Manoj Kothari </option> <option value="4">Mihir Sen</option> </select> </div> <!--que4-->
<p>5)&#x00A0;&#x00A0;Which cricketer died on the field in Bangladesh while playing for Abahani Club?</p>
<div> <select id="f5" name="drop5" > <option value="0" selected="selected">-- Select -- </option> <option value="1">Subhash Gupte</option> <option value="2">M.L.Jaisimha</option> <option value="3">Lala Amarnath</option> <option value="4">Raman Lamba</option> </select> </div> <!--que5-->
<a href="#services" class="page_nav_btn next"><input type='button' onclick='changeText2()' value='NEXT'/></a>
</form>
</div>

<div id="section2"> </div>
...
<div id="results"> <b id='boldStuff2'>fff ggg</b> </div>

I need to display the results of each section in the last div, as shown in the script. The JS for the first section is not working. Please help me see where I am going wrong.
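For reference, here is a minimal corrected sketch of the scoring function, assuming the form markup above; the identifiers (changeText2, myForm, drop1 to drop5, boldStuff2) are taken from the question. The main problems in the original are that select values are strings, so a===2 is never true; the stray closing brace ends changeText2 early, so the final alert and the innerHTML update run once at page load instead of on click; and drop6 does not exist in the form.

<script type="text/javascript">
function changeText2() {
  var form = document.forms["myForm"];
  // Only drop1..drop5 exist in the form; drop6 was a typo in the original code.
  var answers = [form["drop1"].value, form["drop2"].value,
                 form["drop3"].value, form["drop4"].value, form["drop5"].value];
  var count1 = 0;
  for (var i = 0; i < answers.length; i++) {
    // select.value is a string, so compare against "2" (or use parseInt first),
    // keeping the original convention that option value 2 is the correct answer.
    if (answers[i] === "2") {
      count1++;
    }
  }
  // Update the result inside the function, after counting, not at page load.
  document.getElementById('boldStuff2').innerHTML = count1;
}
</script>

With this version, clicking NEXT recomputes count1 from the current selections and writes it into the results div.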

    Read the article

  • Upgrading from TFS 2010 RC to TFS 2010 RTM done

    - by Martin Hinshelwood
    Today is the big day: with the launch of Visual Studio 2010 already done in Asia and rolling around the world towards us, we are getting ready for the RTM (Release To Manufacturing). We have had TFS 2010 in production for nearly 6 months and have had only minimal problems.

Update 12th April 2010 – Added Scott Hanselman’s tweet about the MSDN download release time.

SSW was the first company in the world outside of Microsoft to deploy Visual Studio 2010 Team Foundation Server to production, not once, but twice. I am hoping to make it 3 in a row, but with all the hype around the new version, and with it being a production release and not just a go-live, I think there will be a lot of competition.

Developers: MSDN will be updated with #vs2010 downloads and details at 10am PST *today*! @shanselman - Scott Hanselman

Same as before, we need to uninstall 2010 RC and install 2010 RTM. The installer will take care of all the complexity of actually upgrading any schema changes. If you are upgrading from TFS 2008 to TFS 2010 you can follow our Rules To Better TFS 2010 Migration and read my post on our successes. We run TFS 2010 in a Hyper-V virtual environment, so we have the advantage of taking a snapshot as well as taking a DB backup.

Done - Snapshot the Hyper-V server. Microsoft does not support taking a snapshot of a running server, for very good reason, and Brian Harry wrote a post after my last upgrade with the reason why you should never snapshot a running server.
Done - Uninstall Visual Studio Team Explorer 2010 RC. You will need to uninstall all of the Visual Studio 2010 RC client bits that you have on the server.
Done - Uninstall TFS 2010 RC
Done - Install TFS 2010 RTM
Done - Configure TFS 2010 RTM. Pick the Upgrade option and point it at your existing “tfs_Configuration” database to load all of the existing settings.
Done - Upgrade the SharePoint Extensions
Upgrade Build Servers (Pending)
Test the server

The back out plan, and you should always have one, is to restore the snapshot.

Upgrading to Team Foundation Server 2010 – Done

The first thing you need to do is turn off the TFS server and then log into the Hyper-V server and create a snapshot.

Figure: Make sure you turn the server off and delete all old snapshots before you take a new one.

I noticed that the snapshot that was taken before the Beta 2 to RC upgrade was still there. You should really delete old snapshots before you create a new one, but in this case the SysAdmin (who is currently tucked up in bed) asked me not to. I guess he is worried about a developer messing up his server. Turn your server on and wait for it to boot in anticipation of all the nice shiny RTM’ness that is coming next. The upgrade procedure for TFS 2010 is to uninstall the old version and install the new one.

Figure: Remove Visual Studio 2010 Team Foundation Server RC from the system.

Figure: Most of the heavy lifting is done by the uninstaller, but make sure you have removed any of the client bits first, specifically Visual Studio 2010 or Team Explorer 2010.

Once the uninstall is complete (this took around 5 minutes for me), you can begin the install of the RTM. Running the 64 bit OS will allow the application to use more than 2GB RAM, which, while not common, may be of use in heavy load situations.

Figure: It is always recommended to install the 64 bit version of a server application where possible.
With SharePoint 2010, Exchange 2010 and even Windows Server 2008 R2 being 64 bit only, I do not think there will be another release of a server app that is 32 bit.

You then need to choose what it is you want to install. This depends on how you are running TFS and on how many servers. In our case we run TFS and the Team Foundation Build Service (controller only) on our TFS server along with Analysis Services and Reporting Services. But our SharePoint server lives elsewhere.

Figure: This always confuses people, but in reality it makes sense. Don’t install what you do not need. Every extra you install has an impact on performance.

If you are integrating with SharePoint you will need to run this install on every front end server in your farm, and don’t forget to upgrade your build servers and proxy servers later.

Figure: Selecting only Team Foundation Server (TFS) and Team Foundation Build Services (TFBS).

It is worth noting that if you have a lot of builds kicking off, and hence a lot of get operations against your TFS server, you can use a proxy server to cache the source control on another server in between your TFS server and your build servers.

Figure: Installing Microsoft .NET Framework 4 takes the most time.

Figure: Now run Windows Update, and SSW Diagnostic to make sure all your bits and bobs are up to date. Note: SSW Diagnostic will check your Power Tools, add-ons, check-in policies and other bits as well.

Configure Team Foundation Server 2010 – Done

Now you can configure the server. If you have no key you will need to pick “Install a Trial Licence”, but it is only £500, or free with an MSDN subscription. Anyway, if you pick Trial you get 90 days to get your key.

Figure: You can pick trial and add your key later using the TFS Server Admin.

Here is where the real choices happen. We are doing an upgrade from a previous version, so I will pick Upgrade, the same as all you folks that are using the RC or TFS 2008.

Figure: The upgrade wizard takes your existing 2010 or 2008 databases and upgrades them to the release.

Once you have entered your database server name you can click “List available databases” and it will show what it can upgrade.

Figure: Select your database from the list and at this point, make sure you have a valid backup. At this point you have not made ANY changes to the databases.

At this point the configuration wizard will load the configuration from your existing database if you have one. If you are upgrading TFS 2008 refer to Rules To Better TFS 2010 Migration. Mostly during the wizard the default values will suffice, but depending on the configuration you want you can pick different options.

Figure: Set the application tier account and authentication method to use. We use NTLM to keep things simple as we host our TFS server externally for our remote developers.

Figure: Setting your TFS server URLs to be the remote URLs allows the reports to be accessed without using VPN. Very handy for those remote developers.

Figure: Detected the existing warehouse, no problem.

Figure: Again we love green ticks. It gives us a warm fuzzy feeling.

Figure: The username for connecting to Reporting Services should be a domain account (if you are on a domain, that is).

Figure: Set up the SharePoint integration to connect to your external SharePoint server. You can take the option to connect later.

You then need to run all of your readiness checks. These checks can save your life!
It will check all of the settings that you have entered as well as checking that all the external services are configured and running properly. There are two reasons that TFS 2010 is so easy and painless to install where previous versions were not. The first is that Microsoft changed the install into two steps: install and configuration. The second reason is that they have pulled out all of the stops in making the install run all the checks necessary to make sure that once you start the install it will complete. If you find any errors I recommend that you report them on http://connect.microsoft.com so everyone can benefit from your misery.

Figure: Now we have everything set up, the configuration wizard can do its work.

Figure: Took a while on the “Web site” stage at one point, but zipped through after that.

Figure: The last wee bit. TFS needs to do a little tinkering with the data to complete the upgrade.

Figure: All upgraded. I am not worried about the yellow triangle as SharePoint was being a little silly.

Exception Message: TF254021: The account name or password that you specified is not valid. (type TfsAdminException)
Exception Stack Trace:    at Microsoft.TeamFoundation.Management.Controls.WizardCommon.AccountSelectionControl.TestLogon(String connectionString)    at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument)
[Info   @16:10:16.307] Benign exception caught as part of verify: Exception Message: TF255329: The following site could not be accessed: http://projects.ssw.com.au/. The server that you specified did not return the expected response. Either you have not installed the Team Foundation Server Extensions for SharePoint Products on this server, or a firewall is blocking access to the specified site or the SharePoint Central Administration site. For more information, see the Microsoft Web site (http://go.microsoft.com/fwlink/?LinkId=161206). (type TeamFoundationServerException)
Exception Stack Trace:    at Microsoft.TeamFoundation.Client.SharePoint.WssUtilities.VerifyTeamFoundationSharePointExtensions(ICredentials credentials, Uri url)    at Microsoft.TeamFoundation.Admin.VerifySharePointSitesUrl.Verify()
Inner Exception Details: Exception Message: TF249064: The following Web service returned an response that is not valid: http://projects.ssw.com.au/_vti_bin/TeamFoundationIntegrationService.asmx. This Web service is used for the Team Foundation Server Extensions for SharePoint Products. Either the extensions are not installed, the request resulted in HTML being returned, or there is a problem with the URL. Verify that the following URL points to a valid SharePoint Web application and that the application is available: http://projects.ssw.com.au. If the URL is correct and the Web application is operating normally, verify that a firewall is not blocking access to the Web application. (type TeamFoundationServerInvalidResponseException)
Exception Data Dictionary: ResponseStatusCode = InternalServerError

I’ll look at SharePoint after; probably the SharePoint box just needs a restart or a kick. If there is a problem with SharePoint it will come out in testing, but I will definitely be passing this on to Microsoft.

Upgrading the SharePoint connector to TFS 2010

You will need to upgrade the Extensions for SharePoint Products and Technologies on all of your SharePoint farm front end servers. To do this, uninstall the TFS 2010 RC from each of them in the same way as the server, and then install just the RTM Extensions.
Figure: Only install the SharePoint Extensions on your SharePoint front end servers. TFS 2010 supports both SharePoint 2007 and SharePoint 2010.

Figure: When you configure SharePoint it uploads all of the solutions and templates.

Figure: Everything is uploaded successfully.

Figure: TFS even remembered the settings from the previous installation, fantastic.

Upgrading the Team Foundation Build Servers to TFS 2010

Just like on the SharePoint servers, you will need to upgrade the build server to the RTM. Just uninstall TFS 2010 RC and then install only the Team Foundation Build Services component. Unlike on the SharePoint server, you will probably have some version of Visual Studio installed. You will need to remove this as well. (Coming Soon)

Connecting Visual Studio 2010 / 2008 / 2005 and Eclipse to TFS2010

If you have developers still on Visual Studio 2005 or 2008 you will need to download the respective compatibility pack:
Visual Studio Team System 2005 Service Pack 1 Forward Compatibility Update for Team Foundation Server 2010
Visual Studio Team System 2008 Service Pack 1 Forward Compatibility Update for Team Foundation Server 2010
If you are using Eclipse you can download the new Team Explorer Everywhere install for connecting to TFS. Get your developers to check that you have the latest version of your applications with SSW Diagnostic, which will check for Service Packs and hot fixes to Visual Studio as well.

Technorati Tags: TFS,TFS2010,TFS 2010,Upgrade
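As an aside on the TF255329/TF249064 verification warning above: the error text itself suggests checking whether the Team Foundation integration web service on the SharePoint front end responds at all. A quick, throwaway check along those lines is sketched below in Node.js; the URL is the one from the log above, and this is just a diagnostic I might run by hand, not part of the TFS or SharePoint tooling.

// Throwaway diagnostic: does the Team Foundation integration web service respond?
// The URL comes from the TF249064 message above; substitute your own SharePoint site.
var http = require('http');

var url = 'http://projects.ssw.com.au/_vti_bin/TeamFoundationIntegrationService.asmx';

http.get(url, function (res) {
  // A 200 with an XML content type suggests the extensions are installed and reachable;
  // a 500 (InternalServerError, as in the log) or an HTML error page points at the SharePoint side.
  console.log('Status:', res.statusCode);
  console.log('Content-Type:', res.headers['content-type']);
}).on('error', function (err) {
  // Connection-level failures usually mean DNS or a firewall rather than the extensions themselves.
  console.log('Request failed:', err.message);
});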

    Read the article

  • Azure Mobile Services: what files does it consist of?

    - by svdoever
    Azure Mobile Services is a platform that provides a small set of functionality consisting of authentication, custom data tables, custom APIs, scheduling scripts and push notifications, to be used as the back-end of a mobile application or, if you want, any application or web site. As described in my previous post Azure Mobile Services: lessons learned, the documentation on what can be used in the custom scripts is a bit minimalistic. The list below of all files the complete Azure Mobile Services platform consists of can shed some light on what is available in the platform. In following posts I will provide more detailed information on what we can conclude from this list of files. Below are the files available in the Azure Mobile Services platform. The bold files are the files that describe your data model, API scripts, scheduler scripts and table scripts. Those are the files you configure/construct to provide the “configuration”/implementation of your mobile service. The files are located in a folder like C:\DWASFiles\Sites\youreservice\VirtualDirectory0\site\wwwroot. One file is missing in the list below, and that is the event log file C:\DWASFiles\Sites\youreservice\VirtualDirectory0\site\LogFiles\eventlog.xml, where messages written with, for example, console.log() and exceptions caught by the system are written. NOTA BENE: the Azure Mobile Services system is under full development; new releases may change the list of files.

./app.js ./App_Data/config/datamodel.json ./App_Data/config/scripts/api/youreapi.js ./App_Data/config/scripts/api/youreapi.json ./App_Data/config/scripts/scheduler/placeholder ./App_Data/config/scripts/scheduler/youresheduler.js ./App_Data/config/scripts/shared/placeholder ./App_Data/config/scripts/table/placeholder ./App_Data/config/scripts/table/yourtable.insert.js ./App_Data/config/scripts/table/yourtable.update.js ./App_Data/config/scripts/table/yourtable.delete.js ./App_Data/config/scripts/table/yourtable.read.js ./node_modules/apn/index.js ./node_modules/apn/lib/connection.js ./node_modules/apn/lib/device.js ./node_modules/apn/lib/errors.js ./node_modules/apn/lib/feedback.js ./node_modules/apn/lib/notification.js ./node_modules/apn/lib/util.js ./node_modules/apn/node_modules/q/package.json ./node_modules/apn/node_modules/q/q.js ./node_modules/apn/package.json ./node_modules/azure/lib/azure.js ./node_modules/azure/lib/cli/blobUtils.js ./node_modules/azure/lib/cli/cacheUtils.js ./node_modules/azure/lib/cli/callbackAggregator.js ./node_modules/azure/lib/cli/cert.js ./node_modules/azure/lib/cli/channel.js ./node_modules/azure/lib/cli/cli.js ./node_modules/azure/lib/cli/commands/account.js ./node_modules/azure/lib/cli/commands/config.js ./node_modules/azure/lib/cli/commands/deployment.js ./node_modules/azure/lib/cli/commands/deployment_.js ./node_modules/azure/lib/cli/commands/help.js ./node_modules/azure/lib/cli/commands/log.js ./node_modules/azure/lib/cli/commands/log_.js ./node_modules/azure/lib/cli/commands/repository.js ./node_modules/azure/lib/cli/commands/repository_.js ./node_modules/azure/lib/cli/commands/service.js ./node_modules/azure/lib/cli/commands/site.js ./node_modules/azure/lib/cli/commands/site_.js ./node_modules/azure/lib/cli/commands/vm.js ./node_modules/azure/lib/cli/common.js ./node_modules/azure/lib/cli/constants.js ./node_modules/azure/lib/cli/generate-psm1-utils.js ./node_modules/azure/lib/cli/generate-psm1.js ./node_modules/azure/lib/cli/iaas/blobserviceex.js ./node_modules/azure/lib/cli/iaas/deleteImage.js
./node_modules/azure/lib/cli/iaas/image.js ./node_modules/azure/lib/cli/iaas/upload/blobInfo.js ./node_modules/azure/lib/cli/iaas/upload/bufferStream.js ./node_modules/azure/lib/cli/iaas/upload/intSet.js ./node_modules/azure/lib/cli/iaas/upload/jobTracker.js ./node_modules/azure/lib/cli/iaas/upload/pageBlob.js ./node_modules/azure/lib/cli/iaas/upload/streamMerger.js ./node_modules/azure/lib/cli/iaas/upload/uploadVMImage.js ./node_modules/azure/lib/cli/iaas/upload/vhdTools.js ./node_modules/azure/lib/cli/keyFiles.js ./node_modules/azure/lib/cli/patch-winston.js ./node_modules/azure/lib/cli/templates/node/iisnode.yml ./node_modules/azure/lib/cli/utils.js ./node_modules/azure/lib/diagnostics/logger.js ./node_modules/azure/lib/http/webresource.js ./node_modules/azure/lib/serviceruntime/fileinputchannel.js ./node_modules/azure/lib/serviceruntime/goalstatedeserializer.js ./node_modules/azure/lib/serviceruntime/namedpipeinputchannel.js ./node_modules/azure/lib/serviceruntime/namedpipeoutputchannel.js ./node_modules/azure/lib/serviceruntime/protocol1runtimeclient.js ./node_modules/azure/lib/serviceruntime/protocol1runtimecurrentstateclient.js ./node_modules/azure/lib/serviceruntime/protocol1runtimegoalstateclient.js ./node_modules/azure/lib/serviceruntime/roleenvironment.js ./node_modules/azure/lib/serviceruntime/runtimekernel.js ./node_modules/azure/lib/serviceruntime/runtimeversionmanager.js ./node_modules/azure/lib/serviceruntime/runtimeversionprotocolclient.js ./node_modules/azure/lib/serviceruntime/xmlcurrentstateserializer.js ./node_modules/azure/lib/serviceruntime/xmlgoalstatedeserializer.js ./node_modules/azure/lib/serviceruntime/xmlroleenvironmentdatadeserializer.js ./node_modules/azure/lib/services/blob/blobservice.js ./node_modules/azure/lib/services/blob/hmacsha256sign.js ./node_modules/azure/lib/services/blob/models/blobresult.js ./node_modules/azure/lib/services/blob/models/blocklistresult.js ./node_modules/azure/lib/services/blob/models/containeraclresult.js ./node_modules/azure/lib/services/blob/models/containerresult.js ./node_modules/azure/lib/services/blob/models/leaseresult.js ./node_modules/azure/lib/services/blob/models/listblobsresultcontinuation.js ./node_modules/azure/lib/services/blob/models/listcontainersresultcontinuation.js ./node_modules/azure/lib/services/blob/models/servicepropertiesresult.js ./node_modules/azure/lib/services/blob/sharedaccesssignature.js ./node_modules/azure/lib/services/blob/sharedkey.js ./node_modules/azure/lib/services/blob/sharedkeylite.js ./node_modules/azure/lib/services/core/connectionstringparser.js ./node_modules/azure/lib/services/core/exponentialretrypolicyfilter.js ./node_modules/azure/lib/services/core/linearretrypolicyfilter.js ./node_modules/azure/lib/services/core/servicebusserviceclient.js ./node_modules/azure/lib/services/core/servicebussettings.js ./node_modules/azure/lib/services/core/serviceclient.js ./node_modules/azure/lib/services/core/servicemanagementclient.js ./node_modules/azure/lib/services/core/servicemanagementsettings.js ./node_modules/azure/lib/services/core/servicesettings.js ./node_modules/azure/lib/services/core/storageserviceclient.js ./node_modules/azure/lib/services/core/storageservicesettings.js ./node_modules/azure/lib/services/queue/models/listqueuesresultcontinuation.js ./node_modules/azure/lib/services/queue/models/queuemessageresult.js ./node_modules/azure/lib/services/queue/models/queueresult.js ./node_modules/azure/lib/services/queue/models/servicepropertiesresult.js 
./node_modules/azure/lib/services/queue/queueservice.js ./node_modules/azure/lib/services/serviceBus/models/acstokenresult.js ./node_modules/azure/lib/services/serviceBus/models/queuemessageresult.js ./node_modules/azure/lib/services/serviceBus/models/queueresult.js ./node_modules/azure/lib/services/serviceBus/models/ruleresult.js ./node_modules/azure/lib/services/serviceBus/models/subscriptionresult.js ./node_modules/azure/lib/services/serviceBus/models/topicresult.js ./node_modules/azure/lib/services/serviceBus/servicebusservice.js ./node_modules/azure/lib/services/serviceBus/wrap.js ./node_modules/azure/lib/services/serviceBus/wrapservice.js ./node_modules/azure/lib/services/serviceBus/wraptokenmanager.js ./node_modules/azure/lib/services/serviceManagement/models/roleparser.js ./node_modules/azure/lib/services/serviceManagement/models/roleschema.json ./node_modules/azure/lib/services/serviceManagement/models/servicemanagementserialize.js ./node_modules/azure/lib/services/serviceManagement/servicemanagementservice.js ./node_modules/azure/lib/services/table/batchserviceclient.js ./node_modules/azure/lib/services/table/models/entityresult.js ./node_modules/azure/lib/services/table/models/queryentitiesresultcontinuation.js ./node_modules/azure/lib/services/table/models/querytablesresultcontinuation.js ./node_modules/azure/lib/services/table/models/servicepropertiesresult.js ./node_modules/azure/lib/services/table/models/tableresult.js ./node_modules/azure/lib/services/table/sharedkeylitetable.js ./node_modules/azure/lib/services/table/sharedkeytable.js ./node_modules/azure/lib/services/table/tablequery.js ./node_modules/azure/lib/services/table/tableservice.js ./node_modules/azure/lib/util/atomhandler.js ./node_modules/azure/lib/util/certificates/der.js ./node_modules/azure/lib/util/certificates/pkcs.js ./node_modules/azure/lib/util/constants.js ./node_modules/azure/lib/util/iso8061date.js ./node_modules/azure/lib/util/js2xml.js ./node_modules/azure/lib/util/rfc1123date.js ./node_modules/azure/lib/util/util.js ./node_modules/azure/lib/util/validate.js ./node_modules/azure/LICENSE.txt ./node_modules/azure/node_modules/async/index.js ./node_modules/azure/node_modules/async/lib/async.js ./node_modules/azure/node_modules/async/LICENSE ./node_modules/azure/node_modules/async/package.json ./node_modules/azure/node_modules/azure/lib/azure.js ./node_modules/azure/node_modules/azure/lib/diagnostics/logger.js ./node_modules/azure/node_modules/azure/lib/http/webresource.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/fileinputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/goalstatedeserializer.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/namedpipeinputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/namedpipeoutputchannel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimeclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimecurrentstateclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/protocol1runtimegoalstateclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/roleenvironment.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimekernel.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimeversionmanager.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/runtimeversionprotocolclient.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlcurrentstateserializer.js 
./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlgoalstatedeserializer.js ./node_modules/azure/node_modules/azure/lib/serviceruntime/xmlroleenvironmentdatadeserializer.js ./node_modules/azure/node_modules/azure/lib/services/blob/blobservice.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedaccesssignature.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedkey.js ./node_modules/azure/node_modules/azure/lib/services/blob/internal/sharedkeylite.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/blobresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/blocklistresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/containeraclresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/containerresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/leaseresult.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/listblobsresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/listcontainersresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/blob/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/core/connectionstringparser.js ./node_modules/azure/node_modules/azure/lib/services/core/exponentialretrypolicyfilter.js ./node_modules/azure/node_modules/azure/lib/services/core/hmacsha256sign.js ./node_modules/azure/node_modules/azure/lib/services/core/linearretrypolicyfilter.js ./node_modules/azure/node_modules/azure/lib/services/core/servicebusserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicebussettings.js ./node_modules/azure/node_modules/azure/lib/services/core/serviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicemanagementclient.js ./node_modules/azure/node_modules/azure/lib/services/core/servicemanagementsettings.js ./node_modules/azure/node_modules/azure/lib/services/core/servicesettings.js ./node_modules/azure/node_modules/azure/lib/services/core/sqlserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/storageserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/core/storageservicesettings.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/listqueuesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/queuemessageresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/queueresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/queue/queueservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/apnsservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/gcmservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/sharedaccesssignature.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/wrap.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/internal/wraptokenmanager.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/acstokenresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/notificationhubresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/queuemessageresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/queueresult.js 
./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/registrationresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/resourceresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/ruleresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/subscriptionresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/models/topicresult.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/notificationhubservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/servicebusservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/servicebusservicebase.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/wnsservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceBus/wrapservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/hdinsightservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/roleparser.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/roleschema.json ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/models/servicemanagementserialize.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/servicebusmanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/servicemanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/serviceManagement/sqlmanagementservice.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/models/databaseresult.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/sqlserveracs.js ./node_modules/azure/node_modules/azure/lib/services/sqlAzure/sqlservice.js ./node_modules/azure/node_modules/azure/lib/services/table/batchserviceclient.js ./node_modules/azure/node_modules/azure/lib/services/table/internal/sharedkeylitetable.js ./node_modules/azure/node_modules/azure/lib/services/table/internal/sharedkeytable.js ./node_modules/azure/node_modules/azure/lib/services/table/models/entityresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/listresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/queryentitiesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/table/models/querytablesresultcontinuation.js ./node_modules/azure/node_modules/azure/lib/services/table/models/servicepropertiesresult.js ./node_modules/azure/node_modules/azure/lib/services/table/models/tableresult.js ./node_modules/azure/node_modules/azure/lib/services/table/tablequery.js ./node_modules/azure/node_modules/azure/lib/services/table/tableservice.js ./node_modules/azure/node_modules/azure/lib/util/atomhandler.js ./node_modules/azure/node_modules/azure/lib/util/constants.js ./node_modules/azure/node_modules/azure/lib/util/date.js ./node_modules/azure/node_modules/azure/lib/util/edmtype.js ./node_modules/azure/node_modules/azure/lib/util/iso8061date.js ./node_modules/azure/node_modules/azure/lib/util/js2xml.js ./node_modules/azure/node_modules/azure/lib/util/odatahandler.js ./node_modules/azure/node_modules/azure/lib/util/rfc1123date.js ./node_modules/azure/node_modules/azure/lib/util/util.js ./node_modules/azure/node_modules/azure/lib/util/validate.js ./node_modules/azure/node_modules/azure/LICENSE.txt ./node_modules/azure/node_modules/azure/node_modules/wns/lib/wns.js ./node_modules/azure/node_modules/azure/node_modules/wns/LICENSE.txt 
./node_modules/azure/node_modules/azure/node_modules/wns/package.json ./node_modules/azure/node_modules/azure/node_modules/xml2js/lib/xml2js.js ./node_modules/azure/node_modules/azure/node_modules/xml2js/LICENSE ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/lib/sax.js ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/LICENSE ./node_modules/azure/node_modules/azure/node_modules/xml2js/node_modules/sax/package.json ./node_modules/azure/node_modules/azure/node_modules/xml2js/package.json ./node_modules/azure/node_modules/azure/package.json ./node_modules/azure/node_modules/colors/colors.js ./node_modules/azure/node_modules/colors/MIT-LICENSE.txt ./node_modules/azure/node_modules/colors/package.json ./node_modules/azure/node_modules/commander/index.js ./node_modules/azure/node_modules/commander/lib/commander.js ./node_modules/azure/node_modules/commander/node_modules/keypress/index.js ./node_modules/azure/node_modules/commander/node_modules/keypress/package.json ./node_modules/azure/node_modules/commander/package.json ./node_modules/azure/node_modules/dateformat/lib/dateformat.js ./node_modules/azure/node_modules/dateformat/package.json ./node_modules/azure/node_modules/easy-table/lib/table.js ./node_modules/azure/node_modules/easy-table/package.json ./node_modules/azure/node_modules/eyes/lib/eyes.js ./node_modules/azure/node_modules/eyes/LICENSE ./node_modules/azure/node_modules/eyes/package.json ./node_modules/azure/node_modules/log/index.js ./node_modules/azure/node_modules/log/lib/log.js ./node_modules/azure/node_modules/log/package.json ./node_modules/azure/node_modules/mime/LICENSE ./node_modules/azure/node_modules/mime/mime.js ./node_modules/azure/node_modules/mime/package.json ./node_modules/azure/node_modules/mime/types/mime.types ./node_modules/azure/node_modules/mime/types/node.types ./node_modules/azure/node_modules/node-uuid/LICENSE.md ./node_modules/azure/node_modules/node-uuid/package.json ./node_modules/azure/node_modules/node-uuid/uuid.js ./node_modules/azure/node_modules/qs/component.json ./node_modules/azure/node_modules/qs/index.js ./node_modules/azure/node_modules/qs/lib/head.js ./node_modules/azure/node_modules/qs/lib/querystring.js ./node_modules/azure/node_modules/qs/lib/tail.js ./node_modules/azure/node_modules/qs/package.json ./node_modules/azure/node_modules/qs/querystring.js ./node_modules/azure/node_modules/request/aws.js ./node_modules/azure/node_modules/request/forever.js ./node_modules/azure/node_modules/request/LICENSE ./node_modules/azure/node_modules/request/main.js ./node_modules/azure/node_modules/request/node_modules/form-data/lib/form_data.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/index.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/lib/async.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/LICENSE ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/async/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/lib/combined_stream.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/License ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/lib/delayed_stream.js ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/License 
./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/node_modules/delayed-stream/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/node_modules/combined-stream/package.json ./node_modules/azure/node_modules/request/node_modules/form-data/package.json ./node_modules/azure/node_modules/request/node_modules/mime/LICENSE ./node_modules/azure/node_modules/request/node_modules/mime/mime.js ./node_modules/azure/node_modules/request/node_modules/mime/package.json ./node_modules/azure/node_modules/request/node_modules/mime/types/mime.types ./node_modules/azure/node_modules/request/node_modules/mime/types/node.types ./node_modules/azure/node_modules/request/oauth.js ./node_modules/azure/node_modules/request/package.json ./node_modules/azure/node_modules/request/tunnel.js ./node_modules/azure/node_modules/request/uuid.js ./node_modules/azure/node_modules/request/vendor/cookie/index.js ./node_modules/azure/node_modules/request/vendor/cookie/jar.js ./node_modules/azure/node_modules/sax/lib/sax.js ./node_modules/azure/node_modules/sax/LICENSE ./node_modules/azure/node_modules/sax/package.json ./node_modules/azure/node_modules/streamline/AUTHORS ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/decompiler.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/definitions.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsbrowser.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsdecomp.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsdefs.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsexec.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jslex.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/jsparse.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/lexer.js ./node_modules/azure/node_modules/streamline/deps/narcissus/lib/parser.js ./node_modules/azure/node_modules/streamline/deps/narcissus/LICENSE ./node_modules/azure/node_modules/streamline/deps/narcissus/main.js ./node_modules/azure/node_modules/streamline/deps/narcissus/package.json ./node_modules/azure/node_modules/streamline/deps/narcissus/xfail/narcissus-failures.txt ./node_modules/azure/node_modules/streamline/deps/narcissus/xfail/narcissus-slow.txt ./node_modules/azure/node_modules/streamline/lib/callbacks/format.js ./node_modules/azure/node_modules/streamline/lib/callbacks/require-stub.js ./node_modules/azure/node_modules/streamline/lib/callbacks/runtime.js ./node_modules/azure/node_modules/streamline/lib/callbacks/transform.js ./node_modules/azure/node_modules/streamline/lib/compile.js ./node_modules/azure/node_modules/streamline/lib/compiler/command.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile--fibers.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile.js ./node_modules/azure/node_modules/streamline/lib/compiler/compile_.js ./node_modules/azure/node_modules/streamline/lib/compiler/index.js ./node_modules/azure/node_modules/streamline/lib/compiler/register.js ./node_modules/azure/node_modules/streamline/lib/fibers/runtime.js ./node_modules/azure/node_modules/streamline/lib/fibers/transform.js ./node_modules/azure/node_modules/streamline/lib/fibers/walker.js ./node_modules/azure/node_modules/streamline/lib/globals.js ./node_modules/azure/node_modules/streamline/lib/index.js ./node_modules/azure/node_modules/streamline/lib/register.js 
./node_modules/azure/node_modules/streamline/lib/require/client/require.js ./node_modules/azure/node_modules/streamline/lib/require/server/depend.js ./node_modules/azure/node_modules/streamline/lib/require/server/require.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams--fibers.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams.js ./node_modules/azure/node_modules/streamline/lib/streams/client/streams_.js ./node_modules/azure/node_modules/streamline/lib/streams/jsonRequest.js ./node_modules/azure/node_modules/streamline/lib/streams/readers.js ./node_modules/azure/node_modules/streamline/lib/streams/server/httpHelper.js ./node_modules/azure/node_modules/streamline/lib/streams/server/streams.js ./node_modules/azure/node_modules/streamline/lib/streams/streams.js ./node_modules/azure/node_modules/streamline/lib/tools/docTool.js ./node_modules/azure/node_modules/streamline/lib/transform.js ./node_modules/azure/node_modules/streamline/lib/util/flows--fibers.js ./node_modules/azure/node_modules/streamline/lib/util/flows.js ./node_modules/azure/node_modules/streamline/lib/util/flows_.js ./node_modules/azure/node_modules/streamline/lib/util/future.js ./node_modules/azure/node_modules/streamline/lib/util/index.js ./node_modules/azure/node_modules/streamline/lib/util/url.js ./node_modules/azure/node_modules/streamline/lib/util/uuid.js ./node_modules/azure/node_modules/streamline/module.js ./node_modules/azure/node_modules/streamline/package.json ./node_modules/azure/node_modules/tunnel/index.js ./node_modules/azure/node_modules/tunnel/lib/tunnel.js ./node_modules/azure/node_modules/tunnel/package.json ./node_modules/azure/node_modules/underscore/index.js ./node_modules/azure/node_modules/underscore/LICENSE ./node_modules/azure/node_modules/underscore/package.json ./node_modules/azure/node_modules/underscore/underscore.js ./node_modules/azure/node_modules/underscore.string/lib/underscore.string.js ./node_modules/azure/node_modules/underscore.string/package.json ./node_modules/azure/node_modules/validator/index.js ./node_modules/azure/node_modules/validator/lib/defaultError.js ./node_modules/azure/node_modules/validator/lib/entities.js ./node_modules/azure/node_modules/validator/lib/filter.js ./node_modules/azure/node_modules/validator/lib/index.js ./node_modules/azure/node_modules/validator/lib/validator.js ./node_modules/azure/node_modules/validator/lib/validators.js ./node_modules/azure/node_modules/validator/lib/xss.js ./node_modules/azure/node_modules/validator/LICENSE ./node_modules/azure/node_modules/validator/package.json ./node_modules/azure/node_modules/validator/validator.js ./node_modules/azure/node_modules/winston/lib/winston/common.js ./node_modules/azure/node_modules/winston/lib/winston/config/cli-config.js ./node_modules/azure/node_modules/winston/lib/winston/config/npm-config.js ./node_modules/azure/node_modules/winston/lib/winston/config/syslog-config.js ./node_modules/azure/node_modules/winston/lib/winston/config.js ./node_modules/azure/node_modules/winston/lib/winston/container.js ./node_modules/azure/node_modules/winston/lib/winston/exception.js ./node_modules/azure/node_modules/winston/lib/winston/logger.js ./node_modules/azure/node_modules/winston/lib/winston/transports/console.js ./node_modules/azure/node_modules/winston/lib/winston/transports/file.js ./node_modules/azure/node_modules/winston/lib/winston/transports/http.js ./node_modules/azure/node_modules/winston/lib/winston/transports/transport.js 
./node_modules/azure/node_modules/winston/lib/winston/transports/webhook.js ./node_modules/azure/node_modules/winston/lib/winston/transports.js ./node_modules/azure/node_modules/winston/lib/winston.js ./node_modules/azure/node_modules/winston/LICENSE ./node_modules/azure/node_modules/winston/node_modules/cycle/cycle.js ./node_modules/azure/node_modules/winston/node_modules/cycle/package.json ./node_modules/azure/node_modules/winston/node_modules/pkginfo/lib/pkginfo.js ./node_modules/azure/node_modules/winston/node_modules/pkginfo/package.json ./node_modules/azure/node_modules/winston/node_modules/request/aws.js ./node_modules/azure/node_modules/winston/node_modules/request/aws2.js ./node_modules/azure/node_modules/winston/node_modules/request/forever.js ./node_modules/azure/node_modules/winston/node_modules/request/LICENSE ./node_modules/azure/node_modules/winston/node_modules/request/main.js ./node_modules/azure/node_modules/winston/node_modules/request/mimetypes.js ./node_modules/azure/node_modules/winston/node_modules/request/oauth.js ./node_modules/azure/node_modules/winston/node_modules/request/package.json ./node_modules/azure/node_modules/winston/node_modules/request/tunnel.js ./node_modules/azure/node_modules/winston/node_modules/request/uuid.js ./node_modules/azure/node_modules/winston/node_modules/request/vendor/cookie/index.js ./node_modules/azure/node_modules/winston/node_modules/request/vendor/cookie/jar.js ./node_modules/azure/node_modules/winston/node_modules/stack-trace/lib/stack-trace.js ./node_modules/azure/node_modules/winston/node_modules/stack-trace/License ./node_modules/azure/node_modules/winston/node_modules/stack-trace/package.json ./node_modules/azure/node_modules/winston/package.json ./node_modules/azure/node_modules/xml2js/lib/xml2js.js ./node_modules/azure/node_modules/xml2js/LICENSE ./node_modules/azure/node_modules/xml2js/package.json ./node_modules/azure/node_modules/xmlbuilder/lib/index.js ./node_modules/azure/node_modules/xmlbuilder/lib/XMLBuilder.js ./node_modules/azure/node_modules/xmlbuilder/lib/XMLFragment.js ./node_modules/azure/node_modules/xmlbuilder/package.json ./node_modules/azure/package.json ./node_modules/dpush/lib/dpush.js ./node_modules/dpush/LICENSE.txt ./node_modules/dpush/package.json ./node_modules/express/.npmignore ./node_modules/express/.travis.yml ./node_modules/express/bin/express ./node_modules/express/History.md ./node_modules/express/index.js ./node_modules/express/lib/application.js ./node_modules/express/lib/express.js ./node_modules/express/lib/middleware.js ./node_modules/express/lib/request.js ./node_modules/express/lib/response.js ./node_modules/express/lib/router/index.js ./node_modules/express/lib/router/route.js ./node_modules/express/lib/utils.js ./node_modules/express/lib/view.js ./node_modules/express/LICENSE ./node_modules/express/Makefile ./node_modules/express/node_modules/buffer-crc32/.npmignore ./node_modules/express/node_modules/buffer-crc32/.travis.yml ./node_modules/express/node_modules/buffer-crc32/index.js ./node_modules/express/node_modules/buffer-crc32/package.json ./node_modules/express/node_modules/buffer-crc32/README.md ./node_modules/express/node_modules/buffer-crc32/tests/crc.test.js ./node_modules/express/node_modules/commander/.npmignore ./node_modules/express/node_modules/commander/.travis.yml ./node_modules/express/node_modules/commander/History.md ./node_modules/express/node_modules/commander/index.js ./node_modules/express/node_modules/commander/lib/commander.js 
./node_modules/express/node_modules/commander/Makefile ./node_modules/express/node_modules/commander/package.json ./node_modules/express/node_modules/commander/Readme.md ./node_modules/express/node_modules/connect/.npmignore ./node_modules/express/node_modules/connect/.travis.yml ./node_modules/express/node_modules/connect/index.js ./node_modules/express/node_modules/connect/lib/cache.js ./node_modules/express/node_modules/connect/lib/connect.js ./node_modules/express/node_modules/connect/lib/index.js ./node_modules/express/node_modules/connect/lib/middleware/basicAuth.js ./node_modules/express/node_modules/connect/lib/middleware/bodyParser.js ./node_modules/express/node_modules/connect/lib/middleware/compress.js ./node_modules/express/node_modules/connect/lib/middleware/cookieParser.js ./node_modules/express/node_modules/connect/lib/middleware/cookieSession.js ./node_modules/express/node_modules/connect/lib/middleware/csrf.js ./node_modules/express/node_modules/connect/lib/middleware/directory.js ./node_modules/express/node_modules/connect/lib/middleware/errorHandler.js ./node_modules/express/node_modules/connect/lib/middleware/favicon.js ./node_modules/express/node_modules/connect/lib/middleware/json.js ./node_modules/express/node_modules/connect/lib/middleware/limit.js ./node_modules/express/node_modules/connect/lib/middleware/logger.js ./node_modules/express/node_modules/connect/lib/middleware/methodOverride.js ./node_modules/express/node_modules/connect/lib/middleware/multipart.js ./node_modules/express/node_modules/connect/lib/middleware/query.js ./node_modules/express/node_modules/connect/lib/middleware/responseTime.js ./node_modules/express/node_modules/connect/lib/middleware/session/cookie.js ./node_modules/express/node_modules/connect/lib/middleware/session/memory.js ./node_modules/express/node_modules/connect/lib/middleware/session/session.js ./node_modules/express/node_modules/connect/lib/middleware/session/store.js ./node_modules/express/node_modules/connect/lib/middleware/session.js ./node_modules/express/node_modules/connect/lib/middleware/static.js ./node_modules/express/node_modules/connect/lib/middleware/staticCache.js ./node_modules/express/node_modules/connect/lib/middleware/timeout.js ./node_modules/express/node_modules/connect/lib/middleware/urlencoded.js ./node_modules/express/node_modules/connect/lib/middleware/vhost.js ./node_modules/express/node_modules/connect/lib/patch.js ./node_modules/express/node_modules/connect/lib/proto.js ./node_modules/express/node_modules/connect/lib/public/directory.html ./node_modules/express/node_modules/connect/lib/public/error.html ./node_modules/express/node_modules/connect/lib/public/favicon.ico ./node_modules/express/node_modules/connect/lib/public/icons/page.png ./node_modules/express/node_modules/connect/lib/public/icons/page_add.png ./node_modules/express/node_modules/connect/lib/public/icons/page_attach.png ./node_modules/express/node_modules/connect/lib/public/icons/page_code.png ./node_modules/express/node_modules/connect/lib/public/icons/page_copy.png ./node_modules/express/node_modules/connect/lib/public/icons/page_delete.png ./node_modules/express/node_modules/connect/lib/public/icons/page_edit.png ./node_modules/express/node_modules/connect/lib/public/icons/page_error.png ./node_modules/express/node_modules/connect/lib/public/icons/page_excel.png ./node_modules/express/node_modules/connect/lib/public/icons/page_find.png ./node_modules/express/node_modules/connect/lib/public/icons/page_gear.png 
./node_modules/express/node_modules/connect/lib/public/icons/page_go.png ./node_modules/express/node_modules/connect/lib/public/icons/page_green.png ./node_modules/express/node_modules/connect/lib/public/icons/page_key.png ./node_modules/express/node_modules/connect/lib/public/icons/page_lightning.png ./node_modules/express/node_modules/connect/lib/public/icons/page_link.png ./node_modules/express/node_modules/connect/lib/public/icons/page_paintbrush.png ./node_modules/express/node_modules/connect/lib/public/icons/page_paste.png ./node_modules/express/node_modules/connect/lib/public/icons/page_red.png ./node_modules/express/node_modules/connect/lib/public/icons/page_refresh.png ./node_modules/express/node_modules/connect/lib/public/icons/page_save.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_acrobat.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_actionscript.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_add.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_c.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_camera.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cd.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_code.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_code_red.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_coldfusion.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_compressed.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_copy.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cplusplus.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_csharp.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_cup.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_database.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_delete.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_dvd.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_edit.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_error.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_excel.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_find.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_flash.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_freehand.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_gear.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_get.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_go.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_h.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_horizontal.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_key.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_lightning.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_link.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_magnify.png 
./node_modules/express/node_modules/connect/lib/public/icons/page_white_medal.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_office.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paint.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paintbrush.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_paste.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_php.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_picture.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_powerpoint.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_put.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_ruby.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_stack.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_star.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_swoosh.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_text.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_text_width.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_tux.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_vector.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_visualstudio.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_width.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_word.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_world.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_wrench.png ./node_modules/express/node_modules/connect/lib/public/icons/page_white_zip.png ./node_modules/express/node_modules/connect/lib/public/icons/page_word.png ./node_modules/express/node_modules/connect/lib/public/icons/page_world.png ./node_modules/express/node_modules/connect/lib/public/style.css ./node_modules/express/node_modules/connect/lib/utils.js ./node_modules/express/node_modules/connect/LICENSE ./node_modules/express/node_modules/connect/node_modules/bytes/.npmignore ./node_modules/express/node_modules/connect/node_modules/bytes/component.json ./node_modules/express/node_modules/connect/node_modules/bytes/History.md ./node_modules/express/node_modules/connect/node_modules/bytes/index.js ./node_modules/express/node_modules/connect/node_modules/bytes/Makefile ./node_modules/express/node_modules/connect/node_modules/bytes/package.json ./node_modules/express/node_modules/connect/node_modules/bytes/Readme.md ./node_modules/express/node_modules/connect/node_modules/formidable/.npmignore ./node_modules/express/node_modules/connect/node_modules/formidable/.travis.yml ./node_modules/express/node_modules/connect/node_modules/formidable/benchmark/bench-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/json.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/post.js ./node_modules/express/node_modules/connect/node_modules/formidable/example/upload.js ./node_modules/express/node_modules/connect/node_modules/formidable/index.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/file.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/incoming_form.js 
./node_modules/express/node_modules/connect/node_modules/formidable/lib/index.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/json_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/multipart_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/octet_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/lib/querystring_parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/LICENSE ./node_modules/express/node_modules/connect/node_modules/formidable/package.json ./node_modules/express/node_modules/connect/node_modules/formidable/Readme.md ./node_modules/express/node_modules/connect/node_modules/formidable/test/common.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/beta-sticker-1.png ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/binaryfile.tar.gz ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/blank.gif ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/funkyfilename.txt ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/menu_separator.png ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/file/plain.txt ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/http/special-chars-in-filename/info.md ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/encoding.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/misc.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/no-filename.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/preamble.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/special-chars-in-filename.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/js/workarounds.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/fixture/multipart.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-fixtures.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-json.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/integration/test-octet-stream.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/common.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/integration/test-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-file.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-incoming-form.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-multipart-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/simple/test-querystring-parser.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/legacy/system/test-multi-video-upload.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/run.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-connection-aborted.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-content-transfer-encoding.js 
./node_modules/express/node_modules/connect/node_modules/formidable/test/standalone/test-issue-46.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/tools/base64.html ./node_modules/express/node_modules/connect/node_modules/formidable/test/unit/test-file.js ./node_modules/express/node_modules/connect/node_modules/formidable/test/unit/test-incoming-form.js ./node_modules/express/node_modules/connect/node_modules/formidable/tool/record.js ./node_modules/express/node_modules/connect/node_modules/pause/.npmignore ./node_modules/express/node_modules/connect/node_modules/pause/History.md ./node_modules/express/node_modules/connect/node_modules/pause/index.js ./node_modules/express/node_modules/connect/node_modules/pause/Makefile ./node_modules/express/node_modules/connect/node_modules/pause/package.json ./node_modules/express/node_modules/connect/node_modules/pause/Readme.md ./node_modules/express/node_modules/connect/node_modules/qs/.gitmodules ./node_modules/express/node_modules/connect/node_modules/qs/.npmignore ./node_modules/express/node_modules/connect/node_modules/qs/index.js ./node_modules/express/node_modules/connect/node_modules/qs/package.json ./node_modules/express/node_modules/connect/node_modules/qs/Readme.md ./node_modules/express/node_modules/connect/package.json ./node_modules/express/node_modules/connect/test.js ./node_modules/express/node_modules/cookie/.npmignore ./node_modules/express/node_modules/cookie/.travis.yml ./node_modules/express/node_modules/cookie/index.js ./node_modules/express/node_modules/cookie/package.json ./node_modules/express/node_modules/cookie/README.md ./node_modules/express/node_modules/cookie/test/mocha.opts ./node_modules/express/node_modules/cookie/test/parse.js ./node_modules/express/node_modules/cookie/test/serialize.js ./node_modules/express/node_modules/cookie-signature/.npmignore ./node_modules/express/node_modules/cookie-signature/History.md ./node_modules/express/node_modules/cookie-signature/index.js ./node_modules/express/node_modules/cookie-signature/Makefile ./node_modules/express/node_modules/cookie-signature/package.json ./node_modules/express/node_modules/cookie-signature/Readme.md ./node_modules/express/node_modules/debug/.npmignore ./node_modules/express/node_modules/debug/component.json ./node_modules/express/node_modules/debug/debug.js ./node_modules/express/node_modules/debug/example/app.js ./node_modules/express/node_modules/debug/example/browser.html ./node_modules/express/node_modules/debug/example/wildcards.js ./node_modules/express/node_modules/debug/example/worker.js ./node_modules/express/node_modules/debug/History.md ./node_modules/express/node_modules/debug/index.js ./node_modules/express/node_modules/debug/lib/debug.js ./node_modules/express/node_modules/debug/package.json ./node_modules/express/node_modules/debug/Readme.md ./node_modules/express/node_modules/fresh/.npmignore ./node_modules/express/node_modules/fresh/index.js ./node_modules/express/node_modules/fresh/Makefile ./node_modules/express/node_modules/fresh/package.json ./node_modules/express/node_modules/fresh/Readme.md ./node_modules/express/node_modules/methods/index.js ./node_modules/express/node_modules/methods/package.json ./node_modules/express/node_modules/mkdirp/.npmignore ./node_modules/express/node_modules/mkdirp/.travis.yml ./node_modules/express/node_modules/mkdirp/examples/pow.js ./node_modules/express/node_modules/mkdirp/index.js ./node_modules/express/node_modules/mkdirp/LICENSE 
./node_modules/express/node_modules/mkdirp/package.json ./node_modules/express/node_modules/mkdirp/README.markdown ./node_modules/express/node_modules/mkdirp/test/chmod.js ./node_modules/express/node_modules/mkdirp/test/clobber.js ./node_modules/express/node_modules/mkdirp/test/mkdirp.js ./node_modules/express/node_modules/mkdirp/test/perm.js ./node_modules/express/node_modules/mkdirp/test/perm_sync.js ./node_modules/express/node_modules/mkdirp/test/race.js ./node_modules/express/node_modules/mkdirp/test/rel.js ./node_modules/express/node_modules/mkdirp/test/return.js ./node_modules/express/node_modules/mkdirp/test/return_sync.js ./node_modules/express/node_modules/mkdirp/test/root.js ./node_modules/express/node_modules/mkdirp/test/sync.js ./node_modules/express/node_modules/mkdirp/test/umask.js ./node_modules/express/node_modules/mkdirp/test/umask_sync.js ./node_modules/express/node_modules/range-parser/.npmignore ./node_modules/express/node_modules/range-parser/History.md ./node_modules/express/node_modules/range-parser/index.js ./node_modules/express/node_modules/range-parser/Makefile ./node_modules/express/node_modules/range-parser/package.json ./node_modules/express/node_modules/range-parser/Readme.md ./node_modules/express/node_modules/send/.npmignore ./node_modules/express/node_modules/send/History.md ./node_modules/express/node_modules/send/index.js ./node_modules/express/node_modules/send/lib/send.js ./node_modules/express/node_modules/send/lib/utils.js ./node_modules/express/node_modules/send/Makefile ./node_modules/express/node_modules/send/node_modules/mime/LICENSE ./node_modules/express/node_modules/send/node_modules/mime/mime.js ./node_modules/express/node_modules/send/node_modules/mime/package.json ./node_modules/express/node_modules/send/node_modules/mime/README.md ./node_modules/express/node_modules/send/node_modules/mime/test.js ./node_modules/express/node_modules/send/node_modules/mime/types/mime.types ./node_modules/express/node_modules/send/node_modules/mime/types/node.types ./node_modules/express/node_modules/send/package.json ./node_modules/express/node_modules/send/Readme.md ./node_modules/express/package.json ./node_modules/express/Readme.md ./node_modules/mpns/lib/mpns.js ./node_modules/mpns/package.json ./node_modules/oauth/index.js ./node_modules/oauth/lib/oauth.js ./node_modules/oauth/lib/oauth2.js ./node_modules/oauth/lib/sha1.js ./node_modules/oauth/lib/_utils.js ./node_modules/oauth/LICENSE ./node_modules/oauth/package.json ./node_modules/pusher/index.js ./node_modules/pusher/lib/pusher.js ./node_modules/pusher/node_modules/request/aws.js ./node_modules/pusher/node_modules/request/aws2.js ./node_modules/pusher/node_modules/request/forever.js ./node_modules/pusher/node_modules/request/LICENSE ./node_modules/pusher/node_modules/request/main.js ./node_modules/pusher/node_modules/request/mimetypes.js ./node_modules/pusher/node_modules/request/oauth.js ./node_modules/pusher/node_modules/request/package.json ./node_modules/pusher/node_modules/request/tunnel.js ./node_modules/pusher/node_modules/request/uuid.js ./node_modules/pusher/node_modules/request/vendor/cookie/index.js ./node_modules/pusher/node_modules/request/vendor/cookie/jar.js ./node_modules/pusher/package.json ./node_modules/request/forever.js ./node_modules/request/LICENSE ./node_modules/request/main.js ./node_modules/request/mimetypes.js ./node_modules/request/oauth.js ./node_modules/request/package.json ./node_modules/request/uuid.js ./node_modules/request/vendor/cookie/index.js 
./node_modules/request/vendor/cookie/jar.js ./node_modules/sax/lib/sax.js ./node_modules/sax/LICENSE ./node_modules/sax/package.json ./node_modules/sendgrid/index.js ./node_modules/sendgrid/lib/email.js ./node_modules/sendgrid/lib/file_handler.js ./node_modules/sendgrid/lib/sendgrid.js ./node_modules/sendgrid/lib/smtpapi_headers.js ./node_modules/sendgrid/lib/validation.js ./node_modules/sendgrid/MIT.LICENSE ./node_modules/sendgrid/node_modules/mime/LICENSE ./node_modules/sendgrid/node_modules/mime/mime.js ./node_modules/sendgrid/node_modules/mime/package.json ./node_modules/sendgrid/node_modules/mime/types/mime.types ./node_modules/sendgrid/node_modules/mime/types/node.types ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/sendmail.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/ses.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/smtp.js ./node_modules/sendgrid/node_modules/nodemailer/lib/engines/stub.js ./node_modules/sendgrid/node_modules/nodemailer/lib/helpers.js ./node_modules/sendgrid/node_modules/nodemailer/lib/nodemailer.js ./node_modules/sendgrid/node_modules/nodemailer/lib/transport.js ./node_modules/sendgrid/node_modules/nodemailer/lib/wellknown.js ./node_modules/sendgrid/node_modules/nodemailer/lib/xoauth.js ./node_modules/sendgrid/node_modules/nodemailer/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/dkim.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/mailcomposer.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/punycode.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/lib/urlfetch.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/content-types.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/index.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/mime-functions.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/node_modules/mimelib-noiconv/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/mailcomposer/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/index.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/client.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/pool.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/server.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/lib/starttls.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/cert/cert.pem ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/cert/key.pem ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/mockup.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/rai.js ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/lib/starttls.js 
./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/LICENSE ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/node_modules/rai/package.json ./node_modules/sendgrid/node_modules/nodemailer/node_modules/simplesmtp/package.json ./node_modules/sendgrid/node_modules/nodemailer/package.json ./node_modules/sendgrid/node_modules/step/lib/step.js ./node_modules/sendgrid/node_modules/step/package.json ./node_modules/sendgrid/node_modules/underscore/index.js ./node_modules/sendgrid/node_modules/underscore/LICENSE ./node_modules/sendgrid/node_modules/underscore/package.json ./node_modules/sendgrid/node_modules/underscore/underscore.js ./node_modules/sendgrid/package.json ./node_modules/sqlserver/lib/sql.js ./node_modules/sqlserver/lib/sqlserver.native.js ./node_modules/sqlserver/lib/sqlserver.node ./node_modules/sqlserver/package.json ./node_modules/tripwire/lib/native/windows/x86/tripwire.node ./node_modules/tripwire/lib/tripwire.js ./node_modules/tripwire/LICENSE.txt ./node_modules/tripwire/package.json ./node_modules/underscore/LICENSE ./node_modules/underscore/package.json ./node_modules/underscore/underscore.js ./node_modules/underscore.string/lib/underscore.string.js ./node_modules/underscore.string/package.json ./node_modules/wns/lib/wns.js ./node_modules/wns/LICENSE.txt ./node_modules/wns/package.json ./node_modules/xml2js/lib/xml2js.js ./node_modules/xml2js/LICENSE ./node_modules/xml2js/node_modules/sax/lib/sax.js ./node_modules/xml2js/node_modules/sax/LICENSE ./node_modules/xml2js/node_modules/sax/package.json ./node_modules/xml2js/package.json ./node_modules/xmlbuilder/lib/index.js ./node_modules/xmlbuilder/lib/XMLBuilder.js ./node_modules/xmlbuilder/lib/XMLFragment.js ./node_modules/xmlbuilder/package.json ./runtime/core.js ./runtime/filehelpers.js ./runtime/jsonwebtoken.js ./runtime/logger.js ./runtime/logwriter.js ./runtime/metrics.js ./runtime/query/expressions.js ./runtime/query/expressionvisitor.js ./runtime/query/queryparser.js ./runtime/request/authentication/facebook.js ./runtime/request/authentication/google.js ./runtime/request/authentication/microsoftaccount.js ./runtime/request/authentication/twitter.js ./runtime/request/dataoperation.js ./runtime/request/datapipeline.js ./runtime/request/html/corshelper.js ./runtime/request/html/crossdomainhandler.js ./runtime/request/html/templates/crossdomainbridge.html ./runtime/request/html/templates/loginviaiframe.html ./runtime/request/html/templates/loginviaiframereceiver.html ./runtime/request/html/templates/loginviapostmessage.html ./runtime/request/html/templating.js ./runtime/request/loginhandler.js ./runtime/request/middleware/allowHandler.js ./runtime/request/middleware/authenticate.js ./runtime/request/middleware/authorize.js ./runtime/request/middleware/bodyParser.js ./runtime/request/middleware/errorHandler.js ./runtime/request/middleware/requestLimit.js ./runtime/request/request.js ./runtime/request/requesthandler.js ./runtime/request/schedulerhandler.js ./runtime/request/statushandler.js ./runtime/request/tablehandler.js ./runtime/resources.js ./runtime/script/apibuilder.js ./runtime/script/metadata.js ./runtime/script/push/notify-apns.js ./runtime/script/push/notify-gcm.js ./runtime/script/push/notify-mpns.js ./runtime/script/push/notify-wns.js ./runtime/script/push/notify.js ./runtime/script/scriptcache.js ./runtime/script/scripterror.js ./runtime/script/scriptloader.js ./runtime/script/scriptmanager.js ./runtime/script/scriptstate.js ./runtime/script/sqladapter.js 
./runtime/script/table.js ./runtime/server.js ./runtime/statuscodes.js ./runtime/storage/sqlbooleanizer.js ./runtime/storage/sqlformatter.js ./runtime/storage/sqlhelpers.js ./runtime/storage/storage.js ./runtime/Zumo.Node.js ./static/client/MobileServices.Web-1.0.0.js ./static/client/MobileServices.Web-1.0.0.min.js ./static/default.htm ./static/robots.txt ./Web.config

    Read the article

  • When should I use Areas in TFS instead of Team Projects

    - by Martin Hinshelwood
    Well, it depends…. If you are a small company that creates a finite number of internal projects then you will find it easier to create a single project for each of your products and have TFS do the heavy lifting with reporting, SharePoint sites and Version Control. But what if you are not… Update 9th March 2010 Michael Fourie gave me some feedback which I have integrated. Ed Blankenship via @edblankenship offered encouragement and a nice quote. Ewald Hofman gave me a couple of Cons, and maybe a few more soon. Ewald’s company, Avanade, currently uses Areas, but it looks like the manual management is getting too much and the project is getting cluttered. What if you are likely to have hundreds of projects, possibly with a multitude of internal and external projects? You might have 1 project for a customer or 10. This is the situation that most consultancies find themselves in and thus they need a more sustainable and maintainable option. What I am advocating is that we should have 1 “Team Project” per customer, and use areas to create “sub projects” within that single “Team Project”. "What you describe is what we generally do internally and what we recommend. We make very heavy use of area path to categorize the work within a larger project." - Brian Harry, Microsoft Technical Fellow & Product Unit Manager for Team Foundation Server   "We tend to use areas to segregate multiple projects in the same team project and it works well." - Tiago Pascoal, Visual Studio ALM MVP   "In general, I believe this approach provides consistency [to multi-product engagements] and lowers the administration and maintenance costs. All good." - Michael Fourie, Visual Studio ALM MVP   “@MrHinsh BTW, I'm very much a fan of very large, if not huge, team projects in TFS. Just FYI :) Use Areas & Iterations.” Ed Blankenship, Visual Studio ALM MVP   This would mean that SSW would have a single Team Project called “SSW” that contains all of our internal projects and consequently all of the Areas and Iteration move down one hierarchy to accommodate this. Where we would have had “\SSW\Sprint 1” we now have “\SSW\SqlDeploy\Sprint1” with “SqlDeploy” being our internal project. At the moment SSW has over 70 internal projects and more than 170 total projects in TFS. This method has long term benefits that help to simplify the support model for companies that often have limited internal support time and many projects. But, there are implications as TFS does not provide this model “out-of-the-box”. These implications stretch across Areas, Iterations, Queries, Project Portal and Version Control. Michael made a good comment, he said: I agree with your approach, assuming that in a multi-product engagement with a client, they are happy to adopt the same process template across all products. If they are not, then it’ll either be easy to convince them or there is a valid reason for having a different template - Michael Fourie, Visual Studio ALM MVP   At SSW we have a standard template that we use and this is applied across the board, to all of our projects. We even apply any changes to the core process template to all of our existing projects as well. If you have multiple projects for the same clients on multiple templates and you want to keep it that way, then this approach will not work for you. However, if you want to standardise as we have at SSW then this approach may benefit you as well. Implications around Areas Areas should be used for topological classification/isolation of work items. 
You can think of this as architecture areas, organisational areas or even the main features of your application. In our scenario there is an additional top-level item that represents the Project / Product that we want to chop our Team Project into. Figure: Creating a sub area to represent a product/project is easy.
<teamproject>
<teamproject>\<Functional Area/module whatever>
Becomes:
<teamproject>
<teamproject>\<ProjectName>\
<teamproject>\<ProjectName>\<Functional Area/module whatever>
Implications around Iterations Iterations should be used for chronological classification/isolation of work items. This could include isolated time boxes, milestones or release timelines and really depends on the logical flow of your project or projects. Due to the new level in Area we need to add the same level to Iteration. This is primarily because it is unlikely that the sprints in each of your projects/products will start and end at the same time. This is just a reality of managing multiple projects. Figure: Adding the same value to Iteration as the top-level Area item adds flexibility to Iteration.
<teamproject>\Sprint 1
Or
<teamproject>\Release 1\Sprint 1
Becomes:
<teamproject>\<ProjectName>\Sprint 1
Or
<teamproject>\<ProjectName>\Release 1\Sprint 1
Implications around Queries Queries are used to filter your work items based on a specified level of granularity. There are a number of queries built into a project created using the MSF Agile 5.0 template, but we now have multiple projects, and it would be a pain to have to edit all of the queries every time we changed project; that would also only allow one team to work on one project at a time. Figure: The Queries that are created in a normal MSF Agile 5.0 project do not quite suit our new needs. In order for project contributors to be able to query based on their project we need a couple of things. The first thing I did was to create an "_Area Template" folder that has a copy of the project layout, with all the queries set up to filter based on the "_Area Template" Area and the "_Sprint template" you can see in the Area and Iteration views. Figure: The template is currently easy to drag and drop, but you then need to edit the queries to point at the right Area and Iteration. This needs a tool. I then created an "Areas" folder to hold all of the area-specific queries. So, when you go to create a new TFS Sub-Project you just drag "_Area Template" while holding "Ctrl" and drop it onto "Areas". There is a little setup here. That said, I managed it in around 10 minutes, which is not so bad, and I can imagine it being quite easy to build a tool to create these queries. Figure: These new queries can be configured in around 10 minutes, which includes setting up the Area and Iteration as well. Version Control What about your source code? Well, that is the easiest of the lot. Just create a sub folder for each of your projects/products. Figure: Creating sub folders in source control is as easy as "Right click | Create new folder".
<teamproject>\DEV\Main\
Becomes:
<teamproject>\<ProjectName>\DEV\Main\
Conclusion I think it is up to each company to make a call on how to configure your Team Projects, and it depends completely on how many projects/products you are going to have for each customer, including yourself. If we decide to utilise this route it will require some configuration to get our 170+ projects into this format, and I will probably be writing some tools to help.
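To make the "tool" idea above a little more concrete, here is a minimal sketch of how such a tool could add the extra product level to both the Areas and Iterations of an existing Team Project, using the TFS 2010 client object model. This is not the tool Martin describes (as he says, it does not exist yet); the collection URL, Team Project name and product name are placeholders, and permissions, SharePoint and query setup are not covered:

using System;
using System.Linq;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Server;

class CreateSubProject
{
    static void Main()
    {
        // Placeholders: collection URL, Team Project and product names are assumptions.
        var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        var css = tfs.GetService<ICommonStructureService>();
        ProjectInfo project = css.GetProjectFromName("SSW");
        NodeInfo[] roots = css.ListStructures(project.Uri);

        // "ProjectModelHierarchy" is the Area tree, "ProjectLifecycle" the Iteration tree.
        string areaRoot = roots.First(n => n.StructureType == "ProjectModelHierarchy").Uri;
        string iterationRoot = roots.First(n => n.StructureType == "ProjectLifecycle").Uri;

        // Create the product level in both trees, then a sprint under the product iteration.
        css.CreateNode("SqlDeploy", areaRoot);                                // \SSW\SqlDeploy (Area)
        string productIteration = css.CreateNode("SqlDeploy", iterationRoot); // \SSW\SqlDeploy (Iteration)
        css.CreateNode("Sprint 1", productIteration);                         // \SSW\SqlDeploy\Sprint 1
    }
}

The same ICommonStructureService calls could be looped over a list of products to bulk-configure the 170+ projects mentioned in the conclusion.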
Pros
You only have one project to upgrade when a process template changes – After going through an upgrade of over 170 projects prior to the changes in the RC, I can tell you that that many projects is no fun.
Standardises your Process Template – You will always have the same process implementation across projects/products, without exception.
You get tighter control over the permissions – Yes, you can do this on a standard Team Project, but it gets a lot easier with practice.
You can "move" work items from one "product" to another – Have we not always wanted to do that? (See the sketch below.)
You can rename your projects – Wahoo: everyone wants to do this, now you can.
One set of Reporting Services reports to manage – You set an area and iteration to run reports anyway, so you may as well set both.
Simplified Check-In Policies – There is only one set of check-in policies per client, which simplifies administration of policies.
Simplified Alerts – As alerts are applied across multiple projects, this simplifies your alert rules per client.
Cons
All of these cons could be mitigated by a custom tool that helps automate creation of "Sub-projects" within Team Projects. This custom tool could create areas, Iterations, permissions, SharePoint sites and queries. It just does not exist yet :)
You need to configure the Areas and Iterations.
You need to configure the permissions.
You may need to configure sub sites for SharePoint (depends on your requirements) – If you have two projects/products in the same Team Project then you will not see the burndown for each one out-of-the-box, but rather a cumulative one for the Team Project. This is not really that much of a problem, as you would have to configure your burndown graphs for your current iteration anyway. note: When you create a sub site to a TFS-linked portal it will inherit the settings of its parent site :) This is fantastic as it means that you can easily create sub sites and then set the Area and Iteration path in each of the reports to the correct one.
Every team wants their own customization (via Ewald Hofman) – Small teams of 2 persons and teams of 30, or even outsourcing, need their own process; you cannot allow that, because everybody gets the same work item types. note: Luckily at SSW this is not a problem as our template is standardised across all projects and customers.
Large list of builds (via Ewald Hofman) – As the build list in Team Explorer is just a flat list it can get very cluttered. note: I would mitigate this by removing any build that has not been run in over 30 days. The build template and workflow will still be available in version control, but it will clean up the list.
Feedback Now that I have explained this method, what do you think? What other pros and cons can you see? What do you think of this approach? Will you be using it? What tools would you like to support you? Technorati Tags: Visual Studio ALM,TFS Administration,TFS,Team Foundation Server,Project Planning,TFS Customisation
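Regarding the "move work items from one product to another" pro above (the sketch referenced in that list item): under this model a "product" is just an Area Path, so a move is a field edit that the work item object model can script. This is only a hedged sketch, not part of the original post; the work item id and the target paths are placeholders:

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class MoveWorkItem
{
    static void Main()
    {
        // Placeholders: collection URL, work item id and the target product paths are assumptions.
        var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        var store = tfs.GetService<WorkItemStore>();

        WorkItem workItem = store.GetWorkItem(1234);
        workItem.AreaPath = @"SSW\TimePro";                // move to another "product" area
        workItem.IterationPath = @"SSW\TimePro\Sprint 1";  // and into that product's current sprint

        if (workItem.IsValid())
        {
            workItem.Save();
        }
    }
}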

    Read the article

  • top tweets WebLogic Partner Community – November 2011

    - by JuergenKress
    Send us your tweets @wlscommunity #WebLogicCommunity and follow us on twitter http://twitter.com/wlscommunity glassfish GlassFish Marek’s JAX-RS 2.0 content from Devoxx 2011 – bit.ly/sp2NJO chriscmuir chriscmuir New blog post: ADF bug: missing af:column borders in af:table for IE7 – t.co/81np2jug chriscmuir chriscmuir Reading: Oracle’s ADF Rich Client User Interface (RCUI) Guidelines – oracle.com/webfolder/ux/m… netbeans NetBeans Team Bottlenecks be gone! #Java Performance Tuning workshop in Munich w Kirk Pepperdine, Nov 29-Dec 2: ow.ly/7Akh5 OracleBlogs OracleBlogs Creating ADF Faces Comamnd Button at Runtime ow.ly/1fM9dE alexismp Alexis MP blogged "GlassFish Back from Devoxx 2011, Mature Java EE 6 and EE 7 well on its way" – bit.ly/rP8LV0 JDeveloper JDeveloper & ADF Usage of jQuery in ADF dlvr.it/x3t84 20 hours ago Favorite Retweet Reply OTNArchBeat OTNArchBeat Webcast: Introducing Oracle WebLogic Server 12c: Developer Deep Dive – Dec 1 – 11am PT / 2pm ET bit.ly/t61W4G oraclepartners ORCL PartnerNetwork Brand new Oracle WebLogic 12c will launch on December 1, 10AM PT with a global Webcast highlighting salient… t.co/aflQQ3IX OracleBlogs OracleBlogs JDeveloper and ADF at UKOUG t.co/2CQTiB9n fnimphiu Frank Nimphius Attending UKOUG? All ADF sessions at a glance: t.co/TcMNTMXp 21 Nov Favorite Retweet Reply JDeveloper JDeveloper & ADF Free Webinar ‘ADF Task Flows for Beginners’, information and registration t.co/66jXnGgo via javafx4you javafx4you Java Developer Workshop #2 – Dec 1, 2011 @ Oracle Aoyama center in Tokyo t.co/8p9q3W2B AMIS_Services AMIS Services #vacature #Oracle #ADF ontwikkelaars. bit.ly/AMISADF Gun jezelf een nieuwe uitdaging? Meer op: dld.bz/azZ5N OracleBlogs OracleBlogs Launch Invitation: Introducing Oracle WebLogic Server 12c t.co/bRxCKwAk fnimphiu Frank Nimphius The brand new WebLogic 12c will be released on December 1st 2011 !!! Register for online launch event t.co/pPScg4Xh glassfish GlassFish Announcing Oracle WebLogic 12c – t.co/qh8TdFEl AdamBien Adam Bien Sun Coding Conventions–The Only Standard (Stop Inventing): Code written according to the Sun Coding Conventions… t.co/qaUWp5Mz wlscommunity WebLogic Community Launch Invitation: Introducing Oracle WebLogic Server 12c wp.me/p1LMIb-4y andrejusb Andrejus Baranovskis Andrejus Baranovskis’s Blog: Custom Exception Registration for ADF BC EO Attribute fb.me/1m6nXQD52 MNEMONIC01 Michel Schildmeijer Blog by Michel Schildmeijer: "Oracle WebLogic 12c has been announced" bit.ly/vk6WQL glassfish GlassFish Tab Sweep – Coherence, SBT for GlassFish, OSGi in question, Java EE plugins, … t.co/tVIL95lj OracleBlogs OracleBlogs JavaFX 2.0 at Devoxx 2011 ow.ly/1fJ5iT JDeveloper JDeveloper & ADF Experimenting with ADF BC Application Module Pool Tuning dlvr.it/wjLC1 OracleWebLogic Oracle WebLogic Brand New #WebLogic 12c Launch Event, Dec 1 10am PT. Hasan Rizvi, SVP Fusion Middleware. Developer session. bit.ly/weblogic12clau… JDeveloper JDeveloper & ADF PopUp and Esc/Cancel operations. ADF 11g dlvr.it/whrmC JDeveloper JDeveloper & ADF BPM Workspace: issue loading ADF task flows t.co/vk1gKPx5 OpenJDK OpenJDK Kelly O’Hair — OpenJDK B24 Available : t.co/1bFws6Nw JDeveloper JDeveloper & ADF Oracle ADF setting Task flow to use same page definition file of caller page t.co/9k6UIoYZ JDeveloper JDeveloper & ADF Master Detail Data presentation and CRUD Operations. Detail records in an Editable Popup. 
ADF 11g t.co/H8uudR0Y JDeveloper JDeveloper & ADF Entity Attribute Validation Rule (Business Rule) based on Master View Object Attribute Example ADF 11g t.co/1agxEQcZ oracletechnet Justin Kestelyn Webcast: Oracle WebLogic Server 12c Launch/Developer Deep-Dive (Dec. 1) t.co/OVBdGKzC JDeveloper JDeveloper & ADF How to render different node icons for different tree levels dlvr.it/wY2jL JDeveloper JDeveloper & ADF Query Component with ‘dynamic’ view criteria dlvr.it/wXlF1 JDeveloper JDeveloper & ADF How to play Flash .swf file in Oracle ADF application t.co/zaSONWAH Devoxx Devoxx Duke at the #Devoxx 2011 Noxx Party! pic.twitter.com/bVJWyu1Z brhubart Bob Rhubart Adam Leftik: JavaEE adoption continues to increase, reaching 40+ million downloads this year. #qconsf11 JDeveloper JDeveloper & ADF Free #ODTUG Seminar – #ADF Task Flows for Beginners – sign up today. www3.gotomeeting.com/register/13372… java Java New Project: OpenJFX j.mp/tI4k3s #javafx #openjdk #devoxx << JavaFX is open source! /via frankmunz Frank Munz WebLogic 12c launch event Dec 1st. t.co/jQKinBqN brhubart Bob Rhubart Spring to Java EE Migration | David Heffelfinger feedly.com/k/td8ccG odtug ODTUG Mark your calendars and register for our upcoming webinars: bit.ly/dWKG1C ADF Task Flows & Measuring Scalability & Performance w/TCP myfear Markus Eisele Anybody willing to take this question? Using #JavaMail with #Weblogic Server bit.ly/stJOET AMIS_Services AMIS Services 20-22 december #training #Oracle JHeadstart #11g, productief ontwikkelen met ADF. Schrijf je in op: amis.nl/trainingen/ora… AdamBien Adam Bien Stress Testing Java EE 6 Applications – Free Article In Free Java Magazine: In the November / December 2011 issu… bit.ly/vmzKkc java Java New Tech Article: Spring to #JavaEE Migration t.co/0EvdHNxb OracleBlogs OracleBlogs WebLogic Java record SPARC T4-4 Servers Set World Record on SPECjEnterprise2010 t.co/Eu1b6ZE0 OracleBlogs OracleBlogs What Is JavaFX? ow.ly/1frb6I OTNArchBeat OTNArchBeat The openJDK Windows Binary Download | Adam Bien ow.ly/7fRiG wlscommunity WebLogic Community WebLogic – Java record – SPARC T4-4 Servers Set World Record on SPECjEnterprise2010 glassfish GlassFish "youtube.com/java" blogs.oracle.com/theaquarium/en… OTNArchBeat OTNArchBeat Beta Testing Concludes: 1Z1-102 – "Oracle WebLogic Server 11g: System Administration I" (Oracle Certification) ow.ly/7fJCl wlscommunity WebLogic Community A deep dive in Oracle WebLogic! @ Contribute – November 29th, 2011 Kontich Belgium wp.me/p1LMIb-4u glassfish GlassFish Gartner’s Latest Enterprise Application Server Magic Quadrant – Oracle’s leadership t.co/aYDqipD8 OpenJDK OpenJDK Terrence Barr – Open sourcing of JavaFX: OpenJFX Project proposed – bit.ly/uKVnEl OpenJDK OpenJDK Maurizio Cimadamore – Testing overload resolution: bit.ly/vgXAbQ java Java Java User Groups Roundup, November 2011 : t.co/hea6vVnk /via @robilad << in German JavaSpotlight The Java Spotlight Java Spotlight Episode 54: Stuart Marks on the Coinification of JDK7 goo.gl/fb/3UXoM OTNArchBeat OTNArchBeat Article Series: Migrating Spring to Java EE 6 | Arun Gupta bit.ly/twUJtz glassfish GlassFish New Java EE 6 Hands-On lab, Devoxx-approved! bit.ly/vup5uE java Java Brian Goetz’s enthusiasm for Java is palpable! #devoxx interview adf_emg ADF EMG "ADF testing with a mock framework" – what is a mock framework? Visit the forum and see: groups.google.com/forum/#!topic/… java Java Taping a bunch of interviews today with Java experts at #devoxx. View on Parleys.com tomorrow. 
glassfish GlassFish New screencast to configure and run a cross-machine cluster using GlassFish 3.1.1 in < 7 mins faissalb.blogspot.com/2011/11/glassf… (via @bfaissal) glassfish GlassFish Oracle Contributor Agreements – New Home! bit.ly/tD2eLo OTNArchBeat OTNArchBeat Java Magazine – by and for the Java Community- inaugural issue bit.ly/tTv8UD OTNArchBeat OTNArchBeat The Heroes of Java: Michael Hüttermann | @MyFear bit.ly/rYYOFe javafx4you javafx4you Development with #JavaFX on #Linux j.mp/uOpe69 #not_for_the_faint_of_heart java Java Contribute Technical Questions for Java Experts at #devoxx bit.ly/up2cN0 netbeans NetBeans Team A simple REST service using #NetBeans 7, #Java Servlet, and #JAXB: t.co/pKkufsD8 AdamBien Adam Bien The most beautiful, and portable slide of the whole #jaxcon for "Die Hard Java EE 6"session checked-in: kenai.com/projects/javae… jaxlondon JAX London Mark Little’s (@nmcl) excellent keynote from #jaxlondon ‘Middleware Everywhere…’ is available in full – t.co/8vBmtDJ1 AdamBien Adam Bien Calculator sample from "Die Hard Java EE 6" #jaxcon session checked-in: t.co/0UqaULfg OTNArchBeat OTNArchBeat ADF Faces – a logic bomb in the order of bean instantiations | @ChrisCMuir bit.ly/vjqRaZ OracleBlogs OracleBlogs ODI 11g y JMS Queue de Weblogic ow.ly/1fzfQJ frankmunz Frank Munz Which WebLogic book do you recommend? Review of S. Alapati’s WebLogic 11g Administration Handbook. bit.ly/rP0RtW JDeveloper JDeveloper & ADF PageFlowScope with Unbounded Task Flows: the magic sauce for multi-browser-tab support in JDeveloper ADF applications dlvr.it/vNFgn OracleBlogs OracleBlogs 3 New ADF Insider Essential training videos published. ow.ly/1fz94q OracleBlogs OracleBlogs Weblogic Server 11gR1 PS2: Administration Essentials book and eBook t.co/ykzwIaqs OracleBlogs OracleBlogs Specialized Partners Only! New Service to Promote Your Events t.co/qTgyEpY4 wlscommunity WebLogic Community Oracle Weblogic Server 11gR1 PS2: Administration Essentials book and eBook andrejusb Andrejus Baranovskis Andrejus Baranovskis’s Blog: Stress Testing Oracle ADF BC Applications – Intern… andrejusb.blogspot.com/2011/11/stress… OracleBlogs OracleBlogs Frank Nimphius presenting a full day of Oracle ADF in Switzerland ow.ly/1fxU78 java Java #JavaEE and #GlassFish: #JavaOne11 Slides, Demos, Replays, Hands-on Labs t.co/tLM0ehrD OracleBlogs OracleBlogs weblogic.security.SecurityInitializationException: Authentication for user weblogic denied ow.ly/1fxmiu glassfish GlassFish The Last Migration – GlassFish Wiki : t.co/Dc5FT1SJ OTNArchBeat OTNArchBeat A Successful Year of @MiddlewareMagic t.co/amcGGTTk OracleWebLogic Oracle WebLogic Unbeatable Performance for your Cloud Applications with Exalogic, #OracleCoherence and #WebLogic. ow.ly/7lYKm OTNArchBeat OTNArchBeat Stress Testing Oracle ADF BC Applications – Passivation and Activation | @AndrejusB bit.ly/sASssL OTNArchBeat OTNArchBeat Review: "Oracle Weblogic Server 11gR1 PS2: Administration Essentials" by Michel Schildmeijer | @MyFear t.co/ll6ra0J9 OTNArchBeat OTNArchBeat GlassFish 3.1.2 themes and features | The Aquarium bit.ly/vVqr9r Andre_van_Dalen Andre van Dalen Masterclass: Advanced Oracle ADF 11g lnkd.in/M_45Pi AdamBien Adam Bien The "lunch" edition of RentACar is pushed into: kenai.com/projects/javae… #wjax AdamBien Adam Bien In munich, room munich at #wjax. Welcome to #javaee workshop. Gather your questions. 
15 minutes to go lucasjellema Lucas Jellema Review by Markus of Michel’s book: t.co/41U9wvOb In short: valuable for novice WLS users, maybe not so much for die-hard WLS admin. biemond Edwin Biemond “@myfear: [blog] #Review: "#Weblogic Server 11gR1 PS2: Administration Essentials" t.co/LsODcb3e” got the same conclusion on amazon glassfish GlassFish Practical advice for deploying Lift apps to GlassFish: bit.ly/t3KUml glassfish GlassFish The unbearable lightness of GlassFish t.co/v9307SEJ javafx4you javafx4you Building Java EE applications in JavaFX: JavaFX 2.0, FXML and Spring j.mp/tiMDUh andrejusb Andrejus Baranovskis Andrejus Baranovskis’s Blog: Stress Testing Oracle ADF BC Applications – Passiv… andrejusb.blogspot.com/2011/11/stress… wlscommunity WebLogic Community “@AMIS_Services: Follow @amis_services To Win a copy of SOA Suite 11g Handbook by @lucasjellema dld.bz/axD22 pls RT” excellent book! glassfish GlassFish GlassFish 3.1.2 themes and features bit.ly/uEc6uZ biemond Edwin Biemond Weblogic pre-sales exam was hard, you really need to know the versions , upgrade path and have a score above 80% monkchips James Governor The Rise and Fall and Rise of Java. JAX 2011 london keynote. how big data and the web are floating the boat. slidesha.re/u3Kzlo glassfish GlassFish Tab Sweep – Jersey, Hudson, GlassFish Hosting, GC’s compared, Spring to JavaEE, Modularity, … bit.ly/u9Cc30 oracletechnet Justin Kestelyn Oracle Tuxedo: A renewed acquaintance t.co/gp0mmf20 OTNArchBeat OTNArchBeat Oracle Enterprise Pack for Eclipse, OEPE 11.1.1.8 bit.ly/tC3eKp OracleBlogs OracleBlogs NetBeans HTML Editor and Groovy Editor in a Multiview Component (Part 2) ow.ly/1ftCeI myfear Markus Eisele [blog] #Oracle 2008 – 2011 in Gartners Magic Quadrant for Enterprise Application Servers t.co/2Bs1vgMZ myfear Markus Eisele [blog] #EclipseCon Europe – Java 7 in the Enterprise goo.gl/fb/r80df #ece2011 #java7 javafx4you javafx4you JavaFX 2.0 for Mac build b07 (developer preview) is available for download j.mp/vSwmBP Enjoy! #JavaFX #Mac OracleBlogs OracleBlogs A deep dive in Oracle WebLogic! @ Contribute November 29th, 2011 Kontich Belgium ow.ly/1fsEZs arungupta Arun Gupta #JavaEE7 slides from #jaxlondon and #jfall11 now available: slidesha.re/sh4iFq AdamBien Adam Bien Just checked-in the results of the #jaxlondon community night (somehow beer related): kenai.com/projects/javae… glassfish GlassFish GlassFish Podcast Episode #080 – User Stories, Part 3: Adam Bien and Sean Comerford (ESPN) blogs.oracle.com/glassfishpodca… glassfish GlassFish Story: t.co/jQPqihJb using GlassFish blogs.oracle.com/stories/entry/… "3000+ requests/sec" and more enterprisejava Java EE Mentions New blog post WebLogic deployment status checks for CI wp.me/pOOSs-F #weblogic #continuousintegration /vi… bit.ly/uZz0fk The become a member in the WebLogic Partner Community please first login at http://partner.oracle.com and then visit: http://www.oracle.com/partners/goto/wls-emea Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: twitter,WebLogic,WebLogic Community,OPN,Oracle,Jürgen Kress

    Read the article

  • Uploading and Importing CSV file to SQL Server in ASP.NET WebForms

    - by Vincent Maverick Durano
A few weeks ago I was working on a small internal project that involved importing a CSV file into a SQL Server database, and I thought I'd share the simple implementation that I did on the project. In this post I will demonstrate how to upload and import a CSV file to a SQL Server database. As some may already know, importing a CSV file to SQL Server is easy and simple, but difficulties arise when the CSV file contains many columns with different data types. Basically, the provider cannot differentiate data types between the columns or the rows; it blindly infers a data type from the first few rows and drops any data that does not match that type. To overcome this problem, I used a schema.ini file to define the data types of the CSV file and allow the provider to read it and recognize the exact data type of each column. Now what is schema.ini? Taken from the documentation: Schema.ini is an information file, used to define the data structure and format of each column that contains data in the CSV file. If a schema.ini file exists in the directory, the Microsoft.Jet.OLEDB provider automatically reads it and recognizes the data type information of each column in the CSV file. Thus, the provider avoids misinterpreting the data types before inserting the data into the database. For more information see: http://msdn.microsoft.com/en-us/library/ms709353%28VS.85%29.aspx
Points to remember before creating schema.ini:
1. The schema information file must always be named 'schema.ini'.
2. The schema.ini file must be kept in the same directory where the CSV file exists.
3. The schema.ini file must be created before reading the CSV file.
4. The first line of the schema.ini must be the name of the CSV file, followed by the properties of the CSV file, and then the properties of each column in the CSV file.
Here's an example of what the schema looks like:
[Employee.csv]
ColNameHeader=False
Format=CSVDelimited
DateTimeFormat=dd-MMM-yyyy
Col1=EmployeeID Long
Col2=EmployeeFirstName Text Width 100
Col3=EmployeeLastName Text Width 50
Col4=EmployeeEmailAddress Text Width 50
To get started, let's go ahead and create a simple blank database. Just for the purpose of this demo I created a database called TestDB. After creating the database, fire up Visual Studio and create a new WebApplication project. Under the application root create a folder called UploadedCSVFiles and then place the schema.ini in that folder. The uploaded CSV files will be stored in this folder after the user imports the file. Now add a WebForm to the project, set up the HTML markup, and add one (1) FileUpload control, one (1) Button and three (3) Label controls. After that we can proceed with the code for uploading and importing the CSV file to SQL Server database.
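One quick aside before the main listing: since schema.ini has to sit in UploadedCSVFiles next to the uploaded file before the provider reads it, it can be convenient to generate it at upload time rather than hand-editing it for every file. The helper below is only a minimal sketch of that idea and is not part of the original sample; WriteSchemaIni is a hypothetical name and the column list is hard-coded for the Employee.csv example above.

using System.IO;

// Minimal sketch: write a schema.ini for the uploaded CSV into the upload folder.
// The column definitions are hard-coded for the Employee.csv example; a real tool
// would build them from known metadata for the file being imported.
private void WriteSchemaIni(string uploadFolder, string csvFileName)
{
    string schemaPath = Path.Combine(uploadFolder, "schema.ini");
    string[] lines =
    {
        "[" + csvFileName + "]",
        "ColNameHeader=False",
        "Format=CSVDelimited",
        "DateTimeFormat=dd-MMM-yyyy",
        "Col1=EmployeeID Long",
        "Col2=EmployeeFirstName Text Width 100",
        "Col3=EmployeeLastName Text Width 50",
        "Col4=EmployeeEmailAddress Text Width 50"
    };
    File.WriteAllLines(schemaPath, lines);
}

If used, it could be called right after FileUpload1.SaveAs(csvFilePath) in the import handler shown next.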
Here are the full code blocks below:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.OleDb;
using System.IO;
using System.Text;

namespace WebApplication1
{
    public partial class CSVToSQLImporting : System.Web.UI.Page
    {
        private string GetConnectionString()
        {
            return System.Configuration.ConfigurationManager.ConnectionStrings["DBConnectionString"].ConnectionString;
        }
        private void CreateDatabaseTable(DataTable dt, string tableName)
        {
            string sqlQuery = string.Empty;
            string sqlDBType = string.Empty;
            string dataType = string.Empty;
            int maxLength = 0;
            StringBuilder sb = new StringBuilder();

            sb.AppendFormat(string.Format("CREATE TABLE {0} (", tableName));

            for (int i = 0; i < dt.Columns.Count; i++)
            {
                dataType = dt.Columns[i].DataType.ToString();
                if (dataType == "System.Int32")
                {
                    sqlDBType = "INT";
                }
                else if (dataType == "System.String")
                {
                    sqlDBType = "NVARCHAR";
                    maxLength = dt.Columns[i].MaxLength;
                }

                if (maxLength > 0)
                {
                    sb.AppendFormat(string.Format(" {0} {1} ({2}), ", dt.Columns[i].ColumnName, sqlDBType, maxLength));
                }
                else
                {
                    sb.AppendFormat(string.Format(" {0} {1}, ", dt.Columns[i].ColumnName, sqlDBType));
                }
            }

            sqlQuery = sb.ToString();
            sqlQuery = sqlQuery.Trim().TrimEnd(',');
            sqlQuery = sqlQuery + " )";

            using (SqlConnection sqlConn = new SqlConnection(GetConnectionString()))
            {
                sqlConn.Open();
                SqlCommand sqlCmd = new SqlCommand(sqlQuery, sqlConn);
                sqlCmd.ExecuteNonQuery();
                sqlConn.Close();
            }
        }
        private void LoadDataToDatabase(string tableName, string fileFullPath, string delimeter)
        {
            string sqlQuery = string.Empty;
            StringBuilder sb = new StringBuilder();

            sb.AppendFormat(string.Format("BULK INSERT {0} ", tableName));
            sb.AppendFormat(string.Format(" FROM '{0}'", fileFullPath));
            sb.AppendFormat(string.Format(" WITH ( FIELDTERMINATOR = '{0}' , ROWTERMINATOR = '\n' )", delimeter));

            sqlQuery = sb.ToString();

            using (SqlConnection sqlConn = new SqlConnection(GetConnectionString()))
            {
                sqlConn.Open();
                SqlCommand sqlCmd = new SqlCommand(sqlQuery, sqlConn);
                sqlCmd.ExecuteNonQuery();
                sqlConn.Close();
            }
        }
        protected void Page_Load(object sender, EventArgs e)
        {
        }
        protected void BTNImport_Click(object sender, EventArgs e)
        {
            if (FileUpload1.HasFile)
            {
                FileInfo fileInfo = new FileInfo(FileUpload1.PostedFile.FileName);
                if (fileInfo.Name.Contains(".csv"))
                {
                    string fileName = fileInfo.Name.Replace(".csv", "").ToString();
                    string csvFilePath = Server.MapPath("UploadedCSVFiles") + "\\" + fileInfo.Name;

                    //Save the CSV file on the server inside 'UploadedCSVFiles'
                    FileUpload1.SaveAs(csvFilePath);

                    //Fetch the location of the CSV file
                    string filePath = Server.MapPath("UploadedCSVFiles") + "\\";
                    string strSql = "SELECT * FROM [" + fileInfo.Name + "]";
                    string strCSVConnString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + filePath + ";" + "Extended Properties='text;HDR=YES;'";

                    // load the data from CSV to DataTable
                    OleDbDataAdapter adapter = new OleDbDataAdapter(strSql, strCSVConnString);
                    DataTable dtCSV = new DataTable();
                    DataTable dtSchema = new DataTable();

                    adapter.FillSchema(dtCSV, SchemaType.Mapped);
                    adapter.Fill(dtCSV);

                    if (dtCSV.Rows.Count > 0)
                    {
                        CreateDatabaseTable(dtCSV, fileName);
                        Label2.Text = string.Format("The table ({0}) has been successfully created to the database.", fileName);

                        string fileFullPath = filePath + fileInfo.Name;
                        LoadDataToDatabase(fileName, fileFullPath, ",");

                        Label1.Text = string.Format("({0}) records has been loaded to the table {1}.", dtCSV.Rows.Count, fileName);
                    }
                    else
                    {
                        LBLError.Text = "File is empty.";
                    }
                }
                else
                {
                    LBLError.Text = "Unable to recognize file.";
                }
            }
        }
    }
}

The code above consists of three (3) private methods: GetConnectionString(), CreateDatabaseTable() and LoadDataToDatabase(). GetConnectionString() returns a string; it simply gets the connection string that is configured in the web.config file. CreateDatabaseTable() is a method that accepts two (2) parameters, the DataTable and the filename. As the method name suggests, it automatically creates a table in the database based on the source DataTable and the filename of the CSV file. LoadDataToDatabase() is a method that accepts three (3) parameters: the tableName, fileFullPath and delimeter value. This method is where the actual saving or importing of data from CSV to SQL Server happens. The code in the BTNImport_Click event handles the uploading of the CSV file to the specified location, and at the same time this is where CreateDatabaseTable() and LoadDataToDatabase() are called. If you notice, I also added some basic error trapping and validation within that event. Now, to test the import utility, let's create some simple data in CSV format. Just for the simplicity of this demo, let's create a CSV file, name it "Employee" and add some data to it. Here's an example below:
1,VMS,Durano,[email protected]
2,Jennifer,Cortes,[email protected]
3,Xhaiden,Durano,[email protected]
4,Angel,Santos,[email protected]
5,Kier,Binks,[email protected]
6,Erika,Bird,[email protected]
7,Vianne,Durano,[email protected]
8,Lilibeth,Tree,[email protected]
9,Bon,Bolger,[email protected]
10,Brian,Jones,[email protected]
Now save the newly created CSV file somewhere on your hard drive. Okay, let's run the application and browse to the CSV file that we have just created. Take a look at the sample screenshots below: After browsing the CSV file. After clicking the Import button. Now if we look at the database that we created earlier, you'll notice that the Employee table has been created with the imported data in it. See the screenshot below. That's it! I hope someone finds this post useful! Technorati Tags: ASP.NET,CSV,SQL,C#,ADO.NET

    Read the article

  • CodePlex Daily Summary for Wednesday, April 28, 2010

    CodePlex Daily Summary for Wednesday, April 28, 2010New ProjectsArgument Handler: This project aims to help the handling of arguments in command line programs.Bing for BlackBerry: Bing for BlackBerry is a SDK that allows intergration and searching Bing in there applications.C4F - GeoWallpaper: This is an extension for my MEF Utility Runner to change desktop wallpaper based on Flickr images geotagged with your current location. Uses Windo...CRM 4.0 Contract Utilities: List of Contract Utilities (i.e. custom workflow actions) 1. Change contract status from Active to Draft 2. Copy Contract (with custom start/end da...ELIS: Multimedia player based on WPFEnterprise Administration with Powershell: The EnterpriseShell aims to produce a powershell code library that will enable Enterprise Administrators to quickly reconfigure their IT infrastruc...ExposedObject: ExposedObject uses dynamic typing in C# to provide convenient access to private fields and methods from code outside of the class – for testing, ex...F# Project Extender: Installing F# Project Extender provides tools to better organize files in F# projects by allowing project subdirectories and separating file manage...Hack Framework: Code bundle with the internets brains connected into one piece of .Net frameworkKrypton XNA: Krypton allows users of the XNA framework to easily add 2D lighting to their games. Krypton is fast, as it utilizes the GPU and uses a vertex shade...Net Darts: Provides an easy way to calculate the score left to throw. Users can easily click on the score.PlayerSharp: PlayerSharp is a library written entirely in C# that allows you to communicate your C# programs with the Player Server by Brian Gerkey et al (http:...Ratpoid: RatpoidRedeemer Tower Defense: 2d tower defense game. It is developed with XNA technology, using .Net Visual Studio 2008 or .Net Visual Studio 2010SelfService: Simple self service projectSharePoint Exchange Calendar: a jQuery based calendar web part for displaying Exchange calendars within SharePoint.SharpORM, easy use Object & Relation Database mapping Library: AIM on: Object easy storeage, Create,Retrieve,Update,Delete User .NET Attribute marking or Xml Standalone Mapping eg. public class Somethin...Silverlight Calculator: Silverlight Calculator makes it easier for people to do simple math calculations. It's developed in C# Silverlight 2. Silverlight Calendar: Silverlight Calendar makes it easier for people to view a calendar in silverlight way. It's developed in C# Silverlight 2.SPSocialUtil for SharePoint 2010: SPSocialUtil makes it easier for delevoper to use Social Tag, Tag Cloud, BookMark, Colleague in SharePoint 2010. It's developed in C# with Visual S...Sublight: Sublight open source project is a simple metadata utility for view models in ASP.NET MVC 2Veda Auto Dealer Report: Create work item tasks for the Veda Auto Dealer ReportWPF Meta-Effects: WPF Meta-Effects makes it easier for shader effect developpers to develop and maintain shader code. You'll no longer have to write any HLSL, instea...New ReleasesBing for BlackBerry: Bing SDK for BlackBerry: There are four downloadable components: The library, in source code format, in its latest stable release. A "getting started" doc that will gui...DotNetNuke® Store: 02.01.34: What's New in this release? Bugs corrected: - Fixed a bug related to encryption cookie when DNN is used in Medium Trust environment. 
New Features:...Encrypted Notes: Encrypted Notes 1.6.4: This is the latest version of Encrypted Notes, with general improvements and bug fixes for 'Batch Encryption'. It has an installer that will create...EPiServer CMS Page Type Builder: Page Type Builder 1.2 Beta 2: For more information about this release check out this blog post.ExposedObject: ExposedObject 0.1: This is an initial release of the ExposedObject library that lets you conveniently call private methods and access private fields of a class.Extended SSIS Package Execute: Ver 0.01: Version 0.01 - 2008 Compatible OnlyF# Project Extender: V0.9.0.0 (VS2008): F# project extender for Visual Studio 2008. Initial ReleaseFileExplorer.NET: FileExplorer.NET 1.0: This is the first release of this project. ---------------------------------------------------------------- Please report any bugs that you may en...Fluent ViewModel Configuration for WPF (MVVM): FluentViewModel Alpha3: Overhaul of the configuration system Separation of the configuratior and service locator Added support for general services - using Castle Wind...Home Access Plus+: v4.1: v4.1 Change Log: booking system fixes/additions Uploader fixes Added excluded extensions in my computer Updated Config Tool Fixed an issue ...ILMerge-GUI, merge .NET assemblies: 1.9.0 BETA: Compatible with .NET 4.0 This is the first version of ILMerge-GUI working with .NET 4.0. The final version will be 2.0.0, to be released by mid May...iTuner - The iTunes Companion: iTuner 1.2.3769 Beta 3c: Given the high download rate and awesome feedback for Beta 3b, I decided to release this interim build with the following enhancements: Added new h...Jet Login Tool (JetLoginTool): In then Out - 1.5.3770.18310: Fixed: Will only attempt to logout if currently logged in Fixed: UI no longer blocks while waiting for log outKooboo CMS: Kooboo CMS 2.1.1.0: New features Add new API RssUrl to generate RSS link, this is an extension to UrlHelper. Add possibility to index and search attachment content ...Krypton XNA: Krypton v1.0: First release of Krypton. I've tried to throw together a small testbed, but just getting tortoisehq to work the first time around was a huge pain. ...LinkedIn® for Windows Mobile: LinkedIn for Windows Mobile v0.5: Added missing files to installer packageNito.KitchenSink: Version 7: New features (since Version 5) Fixed null reference bug in ExceptionExtensions. Added DynamicStaticTypeMembers and RefOutArg for dynamically (lat...Numina Application/Security Framework: Numina.Framework Core 51341: Added multiple methods to API and classic ASP libraryOpenIdPortableArea: 0.1.0.3 OpenIdPortableArea: OpenIdPortableArea.Release: DotNetOpenAuth.dll DotNetOpenAuth.xml MvcContrib.dll MvcContrib.xml OpenIdPortableArea.dll OpenIdPortableAre...Play-kanaler (Windows Media Center Plug-in): Playkanaler 1.0.5 Alpha: Playkanaler version 1.0.5 Alpha Skärmsläckar-fix. helt otestad!PokeIn Comet Ajax Library: PokeIn v0.81 Library with ServerWatch Sample: Release Notes Functionality improved. Possible bugs fixed. Realtime server time sample addedSharpORM, easy use Object & Relation Database mapping Library: Vbyte.SharpOrm 1.0.2010.427: Vbyte.SharpOrm for Access & SQLServer.7z 1.0 alpha release for Access oledb provider and Sql Server 2000+.Silverlight 4.0 Popup Menu: Context Menu for Silverlight 4.0: - Markup items can now be added seperately using the AddItem method. 
Alternatingly all items can be placed inside a listbox which can then be added...Silverlight Calculator: SilverCalculator: SilverCalculator version 1.0 Live demoSilverlight Calendar: Silverlight Calendar: Silverlight Calendar version 1.0 Live demoSpeakup Desktop Frontend: Speakup Desktop Frontend v0.2a: This is new version of Speakup Desktop Frontend. It requires .net 4.0 to be installed before using it. In this release next changes were done: - ...TwitterVB - A .NET Twitter Library: Twitter-2.5: Adds xAuth support and increases TwitterLocation information (html help file is not up to date, will correct in a later version.)VCC: Latest build, v2.1.30427.5: Automatic drop of latest buildXP-More: 1.1: Added a parent VHD edit feature. Moved VM settings from double-click to a button, and rearranged the buttons' layout.Yasbg: It's Static 1.0: Many changes have been made from the previous release. Read the README! This release adds settings tab and fixes bugs. To run, first unzip, then...Most Popular ProjectsRawrWBFS Managerpatterns & practices – Enterprise LibraryAJAX Control ToolkitSilverlight ToolkitMicrosoft SQL Server Product Samples: DatabaseWindows Presentation Foundation (WPF)ASP.NETMicrosoft SQL Server Community & SamplesPHPExcelMost Active ProjectsRawrpatterns & practices – Enterprise LibraryGMap.NET - Great Maps for Windows Forms & PresentationNB_Store - Free DotNetNuke Ecommerce Catalog ModuleIonics Isapi Rewrite FilterParticle Plot PivotFarseer Physics EngineBlogEngine.NETDotNetZip LibrarySqlDiffFramework-A Visual Differencing Engine for Dissimilar Data Sources

    Read the article

  • CodePlex Daily Summary for Sunday, December 05, 2010

    CodePlex Daily Summary for Sunday, December 05, 2010Popular ReleasesSubtitleTools: SubtitleTools 1.0: First public releaseMiniTwitter: 1.62: MiniTwitter 1.62 ???? ?? ??????????????????????????????????????? 140 ?????????????????????????? ???????????????????????????????? ?? ??????????????????????????????????Phalanger - The PHP Language Compiler for the .NET Framework: 2.0 (December 2010): The release is targetted for stable daily use. With improved performance and enhanced compatibility with several latest PHP open source applications; it makes this release perfect replacement of your old PHP runtime. Changes made within this release include following and much more: Performance improvements based on real-world applications experience. We determined biggest bottlenecks and we found and removed overheads causing performance problems in many PHP applications. Reimplemented nat...Chronos WPF: Chronos v2.0 Beta 3: Release notes: Updated introduction document. Updated Visual Studio 2010 Extension (vsix) package. Added horizontal scrolling to the main window TaskBar. Added new styles for ListView, ListViewItem, GridViewColumnHeader, ... Added a new WindowViewModel class (allowing to fetch data). Added a new Navigate method (with several overloads) to the NavigationViewModel class (protected). Reimplemented Task usage for the WorkspaceViewModel.OnDelete method. Removed the reflection effect...MDownloader: MDownloader-0.15.26.7024: Fixed updater; Fixed MegauploadDJ - jQuery WebControls for ASP.NET: DJ 1.2: What is new? Update to support jQuery 1.4.2 Update to support jQuery ui 1.8.6 Update to Visual Studio 2010 New WebControls with samples added Autocomplete WebControl Button WebControl ToggleButt WebControl The example web site is including in source code project.LateBindingApi.Excel: LateBindingApi.Excel Release 0.7g: Unterschiede zur Vorgängerversion: - Zusätzliche Interior Properties - Group / Ungroup Methoden für Range - Bugfix COM Reference Handling für Application Objekt in einigen Klassen Release+Samples V0.7g: - Enthält Laufzeit DLL und Beispielprojekte Beispielprojekte: COMAddinExample - Demonstriert ein versionslos angebundenes COMAddin Example01 - Background Colors und Borders für Cells Example02 - Font Attributes undAlignment für Cells Example03 - Numberformats Example04 - Shapes, WordArts, P...ESRI ArcGIS Silverlight Toolkit: November 2010 - v2.1: ESRI ArcGIS Silverlight Toolkit v2.1 Added Windows Phone 7 build. New controls added: InfoWindow ChildPage (Windows Phone 7 only) See what's new here full details for : http://help.arcgis.com/en/webapi/silverlight/help/#/What_s_new_in_2_1/016600000025000000/ Note: Requires Visual Studio 2010, .NET 4.0 and Silverlight 4.0.ASP .NET MVC CMS (Content Management System): Atomic CMS 2.1.1: Atomic CMS 2.1.1 release notes Atomic CMS installation guide Winware: Winware 3.0 (.Net 4.0): Winware 3.0 is base on .Net 4.0 with C#. Please open it with Visual Studio 2010. This release contains a lab web application.UltimateJB: UltimateJB 2.02 PL3 KAKAROTO + CE-X-3.41 EvilSperm: Voici une version attendu avec impatience pour beaucoup : - La Version CEX341 pour pouvoir jouer avec des jeux demandant le firmware 3.50 ( certain ne fonctionne tous simplement pas ). - Pour l'instant le CEX341 n'est disponible qu'avec les PS3 en firmwares 3.41 !!! - La version PL3 KAKAROTO intégre ses dernières modification et intégre maintenant le firmware 3.30 !!! 
Conclusion : - UltimateJB CEX341 => Spoof le Firmware 3.41 en 3.50 ( facilite l'utilisation de certain jeux avec openManage...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.5 beta Released: Hi, Today we are releasing Visifire 3.6.5 beta with the following new feature: New property AutoFitToPlotArea has been introduced in DataSeries. AutoFitToPlotArea will bring bubbles inside the PlotArea in order to avoid clipping of bubbles in bubble chart. Also this release includes few bug fixes: AxisXLabel label were getting clipped if angle was set for AxisLabels and ScrollingEnabled was not set in Chart. If LabelStyle property was set as 'Inside', size of the Pie was not proper. Yo...EnhSim: EnhSim 2.1.1: 2.1.1This release adds in the changes for 4.03a. To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Switched Searing Flames bac...AI: Initial 0.0.1: It’s simply just one code file; it simulates AI and machine in a simulated world. The AI has a little understanding of its body machine and parts, and able to use its feet to do actions just start and stop walking. The world is all of white with nothing but just the machine on a white planet. Colors, odors and position information make no sense. I’m previous C# programmer and I’m learning F# during this project, although I’m still not a good F# programmer, in this project I learning to prog...NKinect: NKinect Preview: Build features: Accelerometer reading Motor serial number property Realtime image update Realtime depth calculation Export to PLY (On demand) Control motor LED Control Kinect tiltMicrosoft - Domain Oriented N-Layered .NET 4.0 App Sample (Microsoft Spain): V1.0 - N-Layer DDD Sample App .NET 4.0: Required Software (Microsoft Base Software needed for Development environment) Visual Studio 2010 RTM & .NET 4.0 RTM (Final Versions) Expression Blend 4 SQL Server 2008 R2 Express/Standard/Enterprise Unity Application Block 2.0 - Published May 5th 2010 http://www.microsoft.com/downloads/en/details.aspx?FamilyID=2D24F179-E0A6-49D7-89C4-5B67D939F91B&displaylang=en http://unity.codeplex.com/releases/view/31277 PEX & MOLES 0.94.51023.0, 29/Oct/2010 - Visual Studio 2010 Power Tools http://re...Sense/Net Enterprise Portal & ECMS: SenseNet 6.0.1 Community Edition: Sense/Net 6.0.1 Community Edition This half year we have been working quite fiercely to bring you the long-awaited release of Sense/Net 6.0. Download this Community Edition to see what we have been up to. These months we have worked on getting the WebCMS capabilities of Sense/Net 6.0 up to par. New features include: New, powerful page and portlet editing experience. HTML and CSS cleanup, new, powerful site skinning system. Upgraded, lightning-fast indexing and query via Lucene. Limita...Minecraft GPS: Minecraft GPS 1.1.1: New Features Compass! New style. Set opacity on main window to allow overlay of Minecraft. Open World in any folder. Fixes Fixed style so listbox won't grow the window size. Fixed open file dialog issue on non-vista kernel machines.DotSpatial: DotSpatial 11-28-2001: This release introduces some exciting improvements. Support for big raster, both in display and changing the scheme. 
Faster raster scheme creation for all rasters. Caching of the "sample" values so once obtained the raster symbolizer dialog loads faster. Reprojection supported for raster and image classes. Affine transform fully supported for images and rasters, so skewed images are now possible. Projection uses better checks when loading unprojected layers. GDAL raster support f...SuperWebSocket: SuperWebSocket(60438): It is the first release of SuperWebSocket. Because it is base on SuperSocket, most features of SuperSocket are supported in SuperWebSocket. The source code include a LiveChat demo.New ProjectsBambook???: ????Bambook???????。Beespot: Beespot is an easy to use, secure, robust and powerful Honeypot for the SSH Service written in Python. caitanzhangDemo: this is my demoColorPicker [SA:MP]: ColorPicker [SA:MP] is a simple tool that generates: - PAWN Hex Color Codes (useful for SAMP Scripts); - ARGB Color Codes; - HTML Color Codes; It's developed in C#.Conversions-n-Stuff: Conversions-n-Stuff (CNS) is a program focused on making it easier for anyone to convert from one measurement to another. There is no need to know the calculations and formulas! Just fill in the forms and click, and you have your answer! CNS leverages C#, WPF, and Silverlight.dotP2P: dotP2P would consist of servers running caches to keep track of domain and nameserver records. Cache servers can be created with any server that supports XML-RPC or SOAP. MySQL is used to store the the cache data. EmailMasterTemplate: This user master and child user control based email template engine.Ezekiel: Ezekiel is a Windows application that leverages a user's existing BusinessObjects reports to provide a custom read-only front end for a database. It's developed using Visual C# 2010 Express.F# Colorizer Editor: Standalone Colorizer Editor for Brian's Fsharp Deep Colorizer VS ExtensionGameCore: Core engine for game services for mobile and RIA clientsGestão de contas bancárias: Trabalho final de Matematica Aplicada da UATLAGPP: GPPkmean: Kmeans ClusteringPHP ORM: ??????orm???,??PHP????,??????????????!Steampunk Odyssey: Steampunk Odyssey is a side-scrolling action game based on the XNA platformSubtitleTools: SubtitleTools is a small utility that helps modifying existing subtitles or downloading new ones based on the digital signatures of your movie files from opensubtitles.org site.Windows Phone 7 Accelerometer: Accelerometer for Windows Phone 7???: ????????????????????????

    Read the article

  • Jframe using multiple classes?

    - by user2945880
    and im trying to make it so it can show multiple classes at once Jframe: import javax.swing.JFrame; import java.awt.BorderLayout; public class Concert { public static void main(String[] args) { JFrame frame = new JFrame(); frame.setSize(1000, 800); frame.setTitle("Concert!"); frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); Concertbackground component = new Concertbackground(); BandComponent component1 = new BandComponent(); frame.add(component, BorderLayout.NORTH); frame.add(component1, BorderLayout.CENTER); frame.setVisible(true); } } These are the two classes mentioned in the Jframe: import java.awt.Color; import java.awt.Graphics; import java.awt.Graphics2D; import java.awt.Rectangle; import java.awt.geom.Ellipse2D; import java.awt.geom.Line2D; import javax.swing.JComponent; import java.awt.Polygon; /* BandComponent.java Justin Walker 10/27/13 */ public class BandComponent extends JComponent { public void paintComponent(Graphics g) { // Recover Graphics2D Graphics2D g2 = (Graphics2D) g; int xScale = 250; int yScale = 100; int x = 343; int y = 343; //singer Polygon sing = new Polygon(); sing.addPoint(667 ,208 + xScale); sing.addPoint(676,213 + xScale); sing.addPoint(678,217 + xScale); sing.addPoint(682,221 + xScale); sing.addPoint(681,224 + xScale); sing.addPoint(680,231 + xScale); sing.addPoint(676,242 + xScale); sing.addPoint(672,244 + xScale); sing.addPoint(672,250 + xScale); sing.addPoint(682,248 + xScale); sing.addPoint(713,244 + xScale); sing.addPoint(734,247 + xScale); sing.addPoint(750,247 + xScale); sing.addPoint(794,232 + xScale); sing.addPoint(800,231 + xScale); sing.addPoint(801,223 + xScale); sing.addPoint(807,219 + xScale); sing.addPoint(806,221 + xScale); sing.addPoint(806,229 + xScale); sing.addPoint(818,222 + xScale); sing.addPoint(820,223 + xScale); sing.addPoint(825,227 + xScale); sing.addPoint(825,240 + xScale); sing.addPoint(817,243 + xScale); sing.addPoint(807,245 + xScale); sing.addPoint(803,247 + xScale); sing.addPoint(801,252 + xScale); sing.addPoint(781,257 + xScale); sing.addPoint(762,264 + xScale); sing.addPoint(734,271 + xScale); sing.addPoint(701,286 + xScale); sing.addPoint(691,296 + xScale); sing.addPoint(693,311 + xScale); sing.addPoint(690,317 + xScale); sing.addPoint(690,335 + xScale); sing.addPoint(691,339 + xScale); sing.addPoint(689,343 + xScale); sing.addPoint(712,382 + xScale); sing.addPoint(725,400 + xScale); sing.addPoint(731,418 + xScale); sing.addPoint(731,428 + xScale); sing.addPoint(738,454 + xScale); sing.addPoint(741,460 + xScale); sing.addPoint(746,468 + xScale); sing.addPoint(766,468 + xScale); sing.addPoint(771,481 + xScale);// sing.addPoint(723,482 + xScale); sing.addPoint(720,462 + xScale); sing.addPoint(718,454 + xScale); sing.addPoint(709,436 + xScale); sing.addPoint(703,436 + xScale); sing.addPoint(699,417 + xScale); sing.addPoint(686,396 + xScale); sing.addPoint(678,395 + xScale); sing.addPoint(676,437 + xScale); sing.addPoint(673,439 + xScale); sing.addPoint(638,435 + xScale); sing.addPoint(640,398 + xScale); sing.addPoint(634,410 + xScale); sing.addPoint(625,416 + xScale); sing.addPoint(622,436 + xScale); sing.addPoint(622,443 + xScale); sing.addPoint(615,447 + xScale); sing.addPoint(609,456 + xScale); sing.addPoint(606,481 + xScale);// sing.addPoint(557,481 + xScale); sing.addPoint(560,467 + xScale); sing.addPoint(579,467 + xScale); sing.addPoint(587,464 + xScale); sing.addPoint(593,452 + xScale); sing.addPoint(594,441 + xScale); sing.addPoint(592,434 + xScale); sing.addPoint(600,416 + xScale); sing.addPoint(608,405 + 
xScale); sing.addPoint(609,394 + xScale); sing.addPoint(617,376 + xScale); sing.addPoint(619,363 + xScale); sing.addPoint(632,334 + xScale); sing.addPoint(637,324 + xScale); sing.addPoint(635,314 + xScale); sing.addPoint(639,296 + xScale); sing.addPoint(627,285 + xScale); sing.addPoint(600,279 + xScale); sing.addPoint(582,278 + xScale); sing.addPoint(575,275 + xScale); sing.addPoint(546,256 + xScale); sing.addPoint(536,252 + xScale); sing.addPoint(533,350 + xScale); sing.addPoint(534,361 + xScale); sing.addPoint(532,367 + xScale); sing.addPoint(529,369 + xScale); sing.addPoint(524,363 + xScale); sing.addPoint(525,355 + xScale); sing.addPoint(531,254 + xScale); sing.addPoint(527,249 + xScale); sing.addPoint(527,242 + xScale); sing.addPoint(529,237 + xScale); sing.addPoint(532,237 + xScale); sing.addPoint(536,178 + xScale); sing.addPoint(534,129 + xScale); sing.addPoint(535,123 + xScale); sing.addPoint(541,120 + xScale); sing.addPoint(545,123 + xScale); sing.addPoint(547,131 + xScale); sing.addPoint(545,173 + xScale); sing.addPoint(538,233 + xScale); sing.addPoint(549,239 + xScale); sing.addPoint(558,241 + xScale); sing.addPoint(585,257 + xScale); sing.addPoint(599,257 + xScale); sing.addPoint(627,254 + xScale); sing.addPoint(647,251 + xScale); sing.addPoint(653,248 + xScale); sing.addPoint(652,235 + xScale); sing.addPoint(648,226 + xScale); sing.addPoint(652,218 + xScale); sing.addPoint(661,212 + xScale); g2.setColor(Color.black); g2.fill(sing); g2.draw(sing); //guitar Polygon guitar = new Polygon(); guitar.addPoint(148,28); guitar.addPoint(158,32); guitar.addPoint(164,38); guitar.addPoint(168,46); guitar.addPoint(169,52); guitar.addPoint(167,60); guitar.addPoint(164,65); guitar.addPoint(165,70); guitar.addPoint(161,76); guitar.addPoint(158,92); guitar.addPoint(162,97); guitar.addPoint(161,102); guitar.addPoint(158,106); guitar.addPoint(155,108); guitar.addPoint(151,127); guitar.addPoint(152,133); guitar.addPoint(155,137); guitar.addPoint(151,146); guitar.addPoint(153,147); guitar.addPoint(160,142); guitar.addPoint(162,133); guitar.addPoint(162,123); guitar.addPoint(161,113); guitar.addPoint(162,110); guitar.addPoint(164,117); guitar.addPoint(169,131); guitar.addPoint(171,144); guitar.addPoint(170,159); guitar.addPoint(166,167); guitar.addPoint(166,171); guitar.addPoint(174,174); guitar.addPoint(183,184); guitar.addPoint(191,195); guitar.addPoint(196,198); guitar.addPoint(198,200); guitar.addPoint(199,210); guitar.addPoint(211,225); guitar.addPoint(212,233); guitar.addPoint(220,248); guitar.addPoint(233,260); guitar.addPoint(245,266); guitar.addPoint(248,268); guitar.addPoint(249,277); guitar.addPoint(205,275); guitar.addPoint(204,262); guitar.addPoint(187,238); guitar.addPoint(178,224); guitar.addPoint(177,216); guitar.addPoint(156,201); guitar.addPoint(146,197); guitar.addPoint(134,211); guitar.addPoint(128,229); guitar.addPoint(125,244);// guitar.addPoint(121,246); guitar.addPoint(107,248); guitar.addPoint(100,252); guitar.addPoint(97,258); guitar.addPoint(96,253); guitar.addPoint(89,258); guitar.addPoint(65,267); guitar.addPoint(63,274); guitar.addPoint(64,283); guitar.addPoint(41,282); guitar.addPoint(44,270); guitar.addPoint(47,264); guitar.addPoint(51,255); guitar.addPoint(73,238); guitar.addPoint(79,228); guitar.addPoint(97,222); guitar.addPoint(101,204); guitar.addPoint(102,181); guitar.addPoint(100,170); guitar.addPoint(95,161); guitar.addPoint(97,154); guitar.addPoint(91,152); guitar.addPoint(77,131); guitar.addPoint(65,123); guitar.addPoint(61,105); guitar.addPoint(64,94); 
guitar.addPoint(72,91); guitar.addPoint(78,82); guitar.addPoint(78,76); guitar.addPoint(70,73); guitar.addPoint(70,67); guitar.addPoint(93,51); guitar.addPoint(101,48); guitar.addPoint(111,52); guitar.addPoint(118,59); guitar.addPoint(119,70); guitar.addPoint(117,78); guitar.addPoint(113,79); guitar.addPoint(112,86); guitar.addPoint(111,88); guitar.addPoint(109,89); guitar.addPoint(109,92); guitar.addPoint(122,99);// guitar.addPoint(124,99); guitar.addPoint(133,96); guitar.addPoint(145,93); //guitar.addPoint(138,124); guitar.addPoint(150,69); guitar.addPoint(150,62); guitar.addPoint(155,58); guitar.addPoint(154,53); guitar.addPoint(149,50); guitar.addPoint(154,46); guitar.addPoint(153,38); guitar.addPoint(147,28); g2.setColor(Color.black); g2.fill(guitar); g2.draw(guitar); Polygon guitar2 = new Polygon (); guitar2.addPoint(141,108); guitar2.addPoint(139,126); guitar2.addPoint(135,122); guitar2.addPoint(128,122); guitar2.addPoint(129,116); guitar2.addPoint(143,108); g2.setColor(Color.white); g2.fill(guitar2); g2.draw(guitar2); //bass guitar Polygon bassgt = new Polygon (); bassgt.addPoint(871,21); bassgt.addPoint(879,24); bassgt.addPoint(885,32); bassgt.addPoint(886,42); bassgt.addPoint(895,47); bassgt.addPoint(904,56); bassgt.addPoint(907,69); bassgt.addPoint(909,83); bassgt.addPoint(910,91); bassgt.addPoint(941,81); bassgt.addPoint(946,75); bassgt.addPoint(945,67); bassgt.addPoint(950,67); bassgt.addPoint(955,75); bassgt.addPoint(960,68); bassgt.addPoint(963,74); bassgt.addPoint(967,72); bassgt.addPoint(971,66); bassgt.addPoint(973,70); bassgt.addPoint(981,67); bassgt.addPoint(984,71); bassgt.addPoint(982,76); bassgt.addPoint(987,80); bassgt.addPoint(986,82); bassgt.addPoint(980,83); bassgt.addPoint(979,90); bassgt.addPoint(974,85); bassgt.addPoint(970,86); bassgt.addPoint(973,91); bassgt.addPoint(965,86); bassgt.addPoint(960,90); bassgt.addPoint(961,100); bassgt.addPoint(955,92); bassgt.addPoint(944,91); bassgt.addPoint(907,103); bassgt.addPoint(906,109); bassgt.addPoint(893,114); bassgt.addPoint(895,123); bassgt.addPoint(900,131); bassgt.addPoint(904,134); bassgt.addPoint(908,145); bassgt.addPoint(911,159); bassgt.addPoint(918,171); bassgt.addPoint(919,190); bassgt.addPoint(923,198); bassgt.addPoint(919,201); bassgt.addPoint(919,210); bassgt.addPoint(927,220); bassgt.addPoint(942,226); bassgt.addPoint(944,234); bassgt.addPoint(909,230); bassgt.addPoint(905,214); bassgt.addPoint(899,204); bassgt.addPoint(893,203); bassgt.addPoint(889,171); bassgt.addPoint(877,151); bassgt.addPoint(861,152); bassgt.addPoint(852,169); bassgt.addPoint(849,203); bassgt.addPoint(841,210); bassgt.addPoint(840,228); bassgt.addPoint(828,233); bassgt.addPoint(806,235); bassgt.addPoint(805,228); bassgt.addPoint(822,219); bassgt.addPoint(824,204); bassgt.addPoint(817,201); bassgt.addPoint(822,196); bassgt.addPoint(822,184); bassgt.addPoint(828,162); bassgt.addPoint(829,152); bassgt.addPoint(820,149); bassgt.addPoint(811,144); bassgt.addPoint(806,134); bassgt.addPoint(805,117); bassgt.addPoint(820,107); bassgt.addPoint(819,89); bassgt.addPoint(811,83); bassgt.addPoint(811,77); bassgt.addPoint(824,66); bassgt.addPoint(825,61); bassgt.addPoint(842,53); bassgt.addPoint(852,43); bassgt.addPoint(853,29); bassgt.addPoint(870,20); g2.setColor(Color.black); g2.fill(bassgt); g2.draw(bassgt); Polygon bassgt2 = new Polygon(); bassgt2.addPoint(845,78); bassgt2.addPoint(845,98); bassgt2.addPoint(843,98); bassgt2.addPoint(842,105); bassgt2.addPoint(839,109); bassgt2.addPoint(834,103); bassgt2.addPoint(832,85); 
bassgt2.addPoint(845,78); g2.setColor(Color.white); g2.fill(bassgt2); g2.draw(bassgt2); Polygon drums = new Polygon (); drums.addPoint(713,104); drums.addPoint(706,121); drums.addPoint(721,377); drums.addPoint(248,380); drums.addPoint(253,228); drums.addPoint(250,206); drums.addPoint(237,178); drums.addPoint(206,166); drums.addPoint(201,154); drums.addPoint(198,152); drums.addPoint(208,148); drums.addPoint(236,150); drums.addPoint(247,130); drums.addPoint(227,119); drums.addPoint(219,105); drums.addPoint(222,96); drums.addPoint(233,88); drums.addPoint(251,84); drums.addPoint(272,83); drums.addPoint(300,91); drums.addPoint(285,72); drums.addPoint(294,57); drums.addPoint(319,46); drums.addPoint(372,45); drums.addPoint(406,50); drums.addPoint(428,65); drums.addPoint(433,74); drums.addPoint(450,58); drums.addPoint(478,48); drums.addPoint(514,48); drums.addPoint(544,51); drums.addPoint(566,52); drums.addPoint(577,67); drums.addPoint(575,79); drums.addPoint(561,95); drums.addPoint(545,98); drums.addPoint(525,105); drums.addPoint(524,147); drums.addPoint(524,183); drums.addPoint(645,175); drums.addPoint(662,143); drums.addPoint(617,152); drums.addPoint(608,148); drums.addPoint(614,139); drums.addPoint(633,128); drums.addPoint(661,116); drums.addPoint(659,107); drums.addPoint(625,114); drums.addPoint(592,113); drums.addPoint(571,111); drums.addPoint(565,102); drums.addPoint(576,86); drums.addPoint(616,70); drums.addPoint(647,66); drums.addPoint(679,67); drums.addPoint(695,72); drums.addPoint(699,90); drums.addPoint(678,100); drums.addPoint(667,103); drums.addPoint(672,113); drums.addPoint(689,105); drums.addPoint(709,106); g2.setColor(Color.black); g2.fill(drums); g2.draw(drums); } } The second class: import java.awt.Color; import java.awt.Graphics; import java.awt.Graphics2D; import java.awt.Rectangle; import java.awt.geom.Ellipse2D; import java.awt.geom.Line2D; import javax.swing.JComponent; import java.awt.GradientPaint; /* component that draws the concert background */ public class Concertbackground extends JComponent { public void paintComponent(Graphics g) { super.paintComponent(g); // Recover Graphics2D Graphics2D g2 = (Graphics2D) g; //Background Top g2.setColor(Color.BLUE); Rectangle backgroundTop = new Rectangle (0, 0, getWidth(), getHeight() / 4); g2.fill(backgroundTop); // Background bottom g2.setColor(Color.GREEN); Rectangle backgroundBottom = new Rectangle (0, getHeight() / 2, getWidth(), getHeight() / 2); g2.fill(backgroundBottom); // Speaker base g2.setColor(Color.BLACK); Rectangle base = new Rectangle (0, 0, 50, 100); g2.fill(base); // Speakers circles gray top g2.setColor(Color.DARK_GRAY); Ellipse2D.Double speakerTop = new Ellipse2D.Double(10, 10, 30, 30); g2.fill(speakerTop); //speakers circles black top g2.setColor(Color.BLACK); Ellipse2D.Double speakerTop1 = new Ellipse2D.Double(15, 15, 20, 20); g2.fill(speakerTop1); // Speakers circles gray bottom g2.setColor(Color.DARK_GRAY); Ellipse2D.Double speakerBottom = new Ellipse2D.Double(10, 50, 30, 30); g2.fill(speakerBottom); //speakers circles black bottom g2.setColor(Color.BLACK); Ellipse2D.Double speakerBottom1 = new Ellipse2D.Double(15, 55, 20, 20); g2.fill(speakerBottom1); } } My main question is how do I change my Jframe so it can use as many classes as I want, It cant be the size of my classes because they were used with the same 1000, 800 Jframe to make the classes. I also need to be able to add more than just these two classes to my Jframe.
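    One way to get any number of drawing classes onto the same frame (a minimal sketch; ConcertDemo and CrowdComponent are made-up names, while Concertbackground and BandComponent are the classes from the question) is to stack them on a JLayeredPane instead of giving each one its own BorderLayout region, which caps you at five components. Custom JComponents are non-opaque by default, so each layer lets the layers beneath it show through:

    import java.awt.BorderLayout;
    import javax.swing.JComponent;
    import javax.swing.JFrame;
    import javax.swing.JLayeredPane;

    public class ConcertDemo {

        // Helper: give every drawing class the same full-size bounds and stack it on the layered pane.
        private static void addLayer(JLayeredPane pane, JComponent drawing, int layer) {
            drawing.setBounds(0, 0, 1000, 800);        // fixed size, matching the frame in the question
            pane.add(drawing, Integer.valueOf(layer)); // higher layer numbers are painted on top
        }

        public static void main(String[] args) {
            JFrame frame = new JFrame("Concert!");
            frame.setSize(1000, 800);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

            JLayeredPane layers = new JLayeredPane();
            addLayer(layers, new Concertbackground(), 0); // background at the bottom
            addLayer(layers, new BandComponent(), 1);     // band drawn on top of it
            // addLayer(layers, new CrowdComponent(), 2); // add as many drawing classes as you like

            frame.add(layers, BorderLayout.CENTER);
            frame.setVisible(true);
        }
    }

    With this arrangement, adding a new drawing class is just one more addLayer call; the bounds are hard-coded here for brevity, so a resizable version would update the children's bounds in a ComponentListener.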

    Read the article

  • Strategy and AI for the game 'Proximity'

    - by smci
    'Proximity' is a strategy game of territorial domination similar to Othello, Go and Risk. Two players, uses a 10x12 hex grid. Game invented by Brian Cable in 2007. Seems to be a worthy game for discussing a) optimal strategy then b) how to build an AI Strategies are going to be probabilistic or heuristic-based, due to the randomness factor, and the high branching factor (starts out at 120). So it will be kind of hard to compare objectively. A compute time limit of 5s per turn seems reasonable. Game: Flash version here and many copies elsewhere on the web Rules: here Object: to have control of the most armies after all tiles have been placed. Each turn you received a randomly numbered tile (value between 1 and 20 armies) to place on any vacant board space. If this tile is adjacent to any ally tiles, it will strengthen each tile's defenses +1 (up to a max value of 20). If it is adjacent to any enemy tiles, it will take control over them if its number is higher than the number on the enemy tile. Thoughts on strategy: Here are some initial thoughts; setting the computer AI to Expert will probably teach a lot: minimizing your perimeter seems to be a good strategy, to prevent flips and minimize worst-case damage like in Go, leaving holes inside your formation is lethal, only more so with the hex grid because you can lose armies on up to 6 squares in one move low-numbered tiles are a liability, so place them away from your main territory, near the board edges and scattered. You can also use low-numbered tiles to plug holes in your formation, or make small gains along the perimeter which the opponent will not tend to bother attacking. a triangle formation of three pieces is strong since they mutually reinforce, and also reduce the perimeter Each tile can be flipped at most 6 times, i.e. when its neighbor tiles are occupied. Control of a formation can flow back and forth. Sometimes you lose part of a formation and plug any holes to render that part of the board 'dead' and lock in your territory/ prevent further losses. Low-numbered tiles are obvious-but-low-valued liabilities, but high-numbered tiles can be bigger liabilities if they get flipped (which is harder). One lucky play with a 20-army tile can cause a swing of 200 (from +100 to -100 armies). So tile placement will have both offensive and defensive considerations. Comment 1,2,4 seem to resemble a minimax strategy where we minimize the maximum expected possible loss (modified by some probabilistic consideration of the value ß the opponent can get from 1..20 i.e. a structure which can only be flipped by a ß=20 tile is 'nearly impregnable'.) I'm not clear what the implications of comments 3,5,6 are for optimal strategy. Interested in comments from Go, Chess or Othello players. (The sequel ProximityHD for XBox Live, allows 4-player -cooperative or -competitive local multiplayer increases the branching factor since you now have 5 tiles in your hand at any given time, of which you can only play one. Reinforcement of ally tiles is increased to +2 per ally.)
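    Since the thrust of the question is heuristic evaluation rather than full search, here is a minimal sketch (hypothetical code, names and weights are my own guesses, and the 10x12 board construction and full flip bookkeeping are left out) of how comments 1, 2 and 4 above could be turned into a greedy placement evaluator on an axial-coordinate hex board: score each vacant cell for the current tile by flips gained, allies reinforced and perimeter exposed, then play the highest-scoring cell.

    import java.util.Arrays;
    import java.util.List;
    import java.util.Map;

    public class ProximityHeuristic {

        record Cell(int q, int r) {}          // axial hex coordinate
        record Tile(int owner, int armies) {} // owner: 0 or 1

        static final int[][] HEX_DIRS = { {1,0}, {1,-1}, {0,-1}, {-1,0}, {-1,1}, {0,1} };

        static List<Cell> neighbors(Cell c) {
            return Arrays.stream(HEX_DIRS)
                    .map(d -> new Cell(c.q() + d[0], c.r() + d[1]))
                    .toList();
        }

        // Higher is better. The weights are guesses meant to encode the ideas above:
        // prefer flipping enemies, reinforce allies, and avoid exposing big tiles on a long perimeter.
        static double score(Map<Cell, Tile> board, Cell target, Tile tile, int me) {
            if (board.containsKey(target)) return Double.NEGATIVE_INFINITY; // occupied
            double flipsGained = 0, alliesReinforced = 0, exposedSides = 0;
            for (Cell n : neighbors(target)) {
                Tile t = board.get(n);
                if (t == null) {
                    exposedSides++;                 // vacant side: future flip risk
                } else if (t.owner() == me) {
                    alliesReinforced++;             // each ally gets +1 defense
                } else if (tile.armies() > t.armies()) {
                    flipsGained += t.armies();      // we would capture those armies
                }
            }
            return 3.0 * flipsGained + 1.0 * alliesReinforced - 0.5 * exposedSides * tile.armies();
        }
    }

    A real AI would go further (expectation over the 1..20 tile distribution, opponent reply modelling), but even this one-ply greedy score captures the perimeter and liability arguments made above.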

    Read the article

  • Find optimal strategy and AI for the game 'Proximity'?

    - by smci
    'Proximity' is a strategy game of territorial domination similar to Othello, Go and Risk. Two players, uses a 10x12 hex grid. Game invented by Brian Cable in 2007. Seems to be a worthy game for discussing a) optimal algorithm then b) how to build an AI. Strategies are going to be probabilistic or heuristic-based, due to the randomness factor, and the insane branching factor (20^120). So it will be kind of hard to compare objectively. A compute time limit of 5s per turn seems reasonable. Game: Flash version here and many copies elsewhere on the web Rules: here Object: to have control of the most armies after all tiles have been placed. Each turn you received a randomly numbered tile (value between 1 and 20 armies) to place on any vacant board space. If this tile is adjacent to any ally tiles, it will strengthen each tile's defenses +1 (up to a max value of 20). If it is adjacent to any enemy tiles, it will take control over them if its number is higher than the number on the enemy tile. Thoughts on strategy: Here are some initial thoughts; setting the computer AI to Expert will probably teach a lot: minimizing your perimeter seems to be a good strategy, to prevent flips and minimize worst-case damage like in Go, leaving holes inside your formation is lethal, only more so with the hex grid because you can lose armies on up to 6 squares in one move low-numbered tiles are a liability, so place them away from your main territory, near the board edges and scattered. You can also use low-numbered tiles to plug holes in your formation, or make small gains along the perimeter which the opponent will not tend to bother attacking. a triangle formation of three pieces is strong since they mutually reinforce, and also reduce the perimeter Each tile can be flipped at most 6 times, i.e. when its neighbor tiles are occupied. Control of a formation can flow back and forth. Sometimes you lose part of a formation and plug any holes to render that part of the board 'dead' and lock in your territory/ prevent further losses. Low-numbered tiles are obvious-but-low-valued liabilities, but high-numbered tiles can be bigger liabilities if they get flipped (which is harder). One lucky play with a 20-army tile can cause a swing of 200 (from +100 to -100 armies). So tile placement will have both offensive and defensive considerations. Comment 1,2,4 seem to resemble a minimax strategy where we minimize the maximum expected possible loss (modified by some probabilistic consideration of the value ß the opponent can get from 1..20 i.e. a structure which can only be flipped by a ß=20 tile is 'nearly impregnable'.) I'm not clear what the implications of comments 3,5,6 are for optimal strategy. Interested in comments from Go, Chess or Othello players. (The sequel ProximityHD for XBox Live, allows 4-player -cooperative or -competitive local multiplayer increases the branching factor since you now have 5 tiles in your hand at any given time, of which you can only play one. Reinforcement of ally tiles is increased to +2 per ally.)

    Read the article

  • Find optimal/good-enough strategy and AI for the game 'Proximity'?

    - by smci
    'Proximity' is a strategy game of territorial domination similar to Othello, Go and Risk. Two players, uses a 10x12 hex grid. Game invented by Brian Cable in 2007. Seems to be a worthy game for discussing a) optimal algorithm then b) how to build an AI. Strategies are going to be probabilistic or heuristic-based, due to the randomness factor, and the insane branching factor (20^120). So it will be kind of hard to compare objectively. A compute time limit of 5s per turn seems reasonable. Game: Flash version here and many copies elsewhere on the web Rules: here Object: to have control of the most armies after all tiles have been placed. Each turn you received a randomly numbered tile (value between 1 and 20 armies) to place on any vacant board space. If this tile is adjacent to any ally tiles, it will strengthen each tile's defenses +1 (up to a max value of 20). If it is adjacent to any enemy tiles, it will take control over them if its number is higher than the number on the enemy tile. Thoughts on strategy: Here are some initial thoughts; setting the computer AI to Expert will probably teach a lot: minimizing your perimeter seems to be a good strategy, to prevent flips and minimize worst-case damage like in Go, leaving holes inside your formation is lethal, only more so with the hex grid because you can lose armies on up to 6 squares in one move low-numbered tiles are a liability, so place them away from your main territory, near the board edges and scattered. You can also use low-numbered tiles to plug holes in your formation, or make small gains along the perimeter which the opponent will not tend to bother attacking. a triangle formation of three pieces is strong since they mutually reinforce, and also reduce the perimeter Each tile can be flipped at most 6 times, i.e. when its neighbor tiles are occupied. Control of a formation can flow back and forth. Sometimes you lose part of a formation and plug any holes to render that part of the board 'dead' and lock in your territory/ prevent further losses. Low-numbered tiles are obvious-but-low-valued liabilities, but high-numbered tiles can be bigger liabilities if they get flipped (which is harder). One lucky play with a 20-army tile can cause a swing of 200 (from +100 to -100 armies). So tile placement will have both offensive and defensive considerations. Comment 1,2,4 seem to resemble a minimax strategy where we minimize the maximum expected possible loss (modified by some probabilistic consideration of the value ß the opponent can get from 1..20 i.e. a structure which can only be flipped by a ß=20 tile is 'nearly impregnable'.) I'm not clear what the implications of comments 3,5,6 are for optimal strategy. Interested in comments from Go, Chess or Othello players. (The sequel ProximityHD for XBox Live, allows 4-player -cooperative or -competitive local multiplayer increases the branching factor since you now have 5 tiles in your hand at any given time, of which you can only play one. Reinforcement of ally tiles is increased to +2 per ally.)

    Read the article

  • Guide to reduce TFS database growth using the Test Attachment Cleaner

    - by terje
    Recently there have been several reports of TFS databases growing too fast and too big. Notably, this has been observed when one starts to use more features of the testing system. Also, TFS 2010 handles test results differently from TFS 2008, and this leads to more data being stored in the TFS databases. As a consequence, some tools have been released to remove unneeded data from the database, along with some fixes to correct bugs found during this process. Further, some preventive practices and maintenance rules should be adopted. A lot of people have blogged about this, among them:
    - Anu's very important blog post here describes both the problem and solutions to handle it. She describes both the Test Attachment Cleaner tool and some QFE/CU releases that fix underlying bugs which prevented the tool from being fully effective.
    - Brian Harry's blog post here describes the problem too.
    - This forum thread describes the problem with some solution hints.
    - Ravi Shanker's blog post here describes best practices on solving this (TBP).
    - Grant Holliday's blog post here describes strategies for using the Test Attachment Cleaner both to detect space problems and to rectify them.
    The problem can be divided into the following areas:
    - Publishing of test results from builds
    - Publishing of manual test results and their attachments in particular
    - Publishing of deployment binaries for use during a test run
    - Bugs in SQL Server preventing total cleanup of data
    (All the published data above is published into the TFS database as attachments.) The test results include all data collected during the run. Some of this data can grow rather large, like IntelliTrace logs and video recordings. Also, the pushing of binaries, which happens for automated test runs (including tests run during a build using code coverage, which will include all the files in the deployment folder), contributes a lot to the size of the attached data.
    In order to handle this systematically, I have set up a 3-stage process:
    1. Find out if you have a database space issue
    2. Set up your TFS server to minimize potential database issues
    3. If you have the "problem", clean up the database and otherwise keep it clean
    Analyze the data
    Are your databases growing? Are unused test results growing out of proportion? To find out, you need to query your TFS database for some of the information, and use the Test Attachment Cleaner (TAC) to obtain some more detailed information. If you don't have too many databases you can use the SQL Server reports from within Management Studio to analyze the database and table sizes. Or you can use a set of queries. I find queries often faster to use because I can tweak them the way I want. But be aware that these queries are undocumented and unsupported and may change when the product team wants to change them. If you have multiple Project Collections, find out which might have problems: (Disclaimer: The queries below work on TFS 2010. They will not work on Dev-11, since the table structure has been changed. I will try to update them for Dev-11 when it is released.) Open a SQL Management Studio session onto the SQL Server where you have your TFS databases. Use the query below to find the Project Collection databases and their sizes, in descending size order.
use master select DB_NAME(database_id) AS DBName, (size/128) SizeInMB FROM sys.master_files where type=0 and substring(db_name(database_id),1,4)='Tfs_' and DB_NAME(database_id)<>'Tfs_Configuration' order by size desc Doing this on one of our SQL servers gives the following results: It is pretty easy to see on which collection to start the work   Find out which tables are possibly too large Keep a special watch out for the Tfs_Attachment table. Use the script at the bottom of Grant’s blog to find the table sizes in descending size order. In our case we got this result: From Grant’s blog we learnt that the tbl_Content is in the Version Control category, so the major only big issue we have here is the tbl_AttachmentContent.   Find out which team projects have possibly too large attachments In order to use the TAC to find and eventually delete attachment data we need to find out which team projects have these attachments. The team project is a required parameter to the TAC. Use the following query to find this, replace the collection database name with whatever applies in your case:   use Tfs_DefaultCollection select p.projectname, sum(a.compressedlength)/1024/1024 as sizeInMB from dbo.tbl_Attachment as a inner join tbl_testrun as tr on a.testrunid=tr.testrunid inner join tbl_project as p on p.projectid=tr.projectid group by p.projectname order by sum(a.compressedlength) desc In our case we got this result (had to remove some names), out of more than 100 team projects accumulated over quite some years: As can be seen here it is pretty obvious the “Byggtjeneste – Projects” are the main team project to take care of, with the ones on lines 2-4 as the next ones.  Check which attachment types takes up the most space It can be nice to know which attachment types takes up the space, so run the following query: use Tfs_DefaultCollection select a.attachmenttype, sum(a.compressedlength)/1024/1024 as sizeInMB from dbo.tbl_Attachment as a inner join tbl_testrun as tr on a.testrunid=tr.testrunid inner join tbl_project as p on p.projectid=tr.projectid group by a.attachmenttype order by sum(a.compressedlength) desc We then got this result: From this it is pretty obvious that the problem here is the binary files, as also mentioned in Anu’s blog. Check which file types, by their extension, takes up the most space Run the following query use Tfs_DefaultCollection select SUBSTRING(filename,len(filename)-CHARINDEX('.',REVERSE(filename))+2,999)as Extension, sum(compressedlength)/1024 as SizeInKB from tbl_Attachment group by SUBSTRING(filename,len(filename)-CHARINDEX('.',REVERSE(filename))+2,999) order by sum(compressedlength) desc This gives a result like this:   Now you should have collected enough information to tell you what to do – if you got to do something, and some of the information you need in order to set up your TAC settings file, both for a cleanup and for scheduled maintenance later.    Get your TFS server and environment properly set up Even if you have got the problem or if have yet not got the problem, you should ensure the TFS server is set up so that the risk of getting into this problem is minimized.  To ensure this you should install the following set of updates and components. The assumption is that your TFS Server is at SP1 level. Install the QFE for KB2608743 – which also contains detailed instructions on its use, download from here. The QFE changes the default settings to not upload deployed binaries, which are used in automated test runs. 
    Binaries will still be uploaded if:
    - Code coverage is enabled in the test settings.
    - You change UploadDeploymentItem to true in the testsettings file. Be aware that this might be reset back to false by another user who hasn't installed this QFE.
    The hotfix should be installed on:
    - The build servers (the build agents)
    - The machine hosting the Test Controller
    - Local development computers (Visual Studio)
    - Local test computers (MTM)
    It is not required to install it on the TFS Server, test agents or the build controller – it has no effect on these programs. If you use SQL Server 2008 R2 you should also install CU 10 (or later). This CU fixes a potential problem of hanging "ghost" files. This seems to happen only in certain trigger situations, but to ensure it doesn't bite you, it is better to make sure this CU is installed. There is no such CU for SQL Server 2008 pre-R2.
    Workaround: If you suspect hanging ghost files, they can – with some mental effort – be deduced from the ghost counters using the following SQL query:
    use master
    SELECT DB_NAME(database_id) as 'database', OBJECT_NAME(object_id) as 'objectname',
           index_type_desc, ghost_record_count, version_ghost_record_count, record_count, avg_record_size_in_bytes
    FROM sys.dm_db_index_physical_stats (DB_ID(N'<DatabaseName>'), OBJECT_ID(N'<TableName>'), NULL, NULL, 'DETAILED')
    The problem is a stalled ghost cleanup process. To fix it, stop all components that depend on the SQL Server, like the TFS Server and SPS services – that is, all applications that connect to the SQL Server. Then restart the SQL Server, and finally start up all dependent processes again. (I would guess a complete server reboot would do the trick too.) After this the ghost cleanup process will run properly again. The fix will come in the next CU cycle for SQL Server R2 SP1. The R2 pre-SP1 and R2 SP1 have separate maintenance cycles and are maintained individually; each has its own set of CUs. When it comes I will add the link to that CU here. The "hanging ghost file" issue came up after one had run the TAC and deleted enormous amounts of data. The SQL Server can get into this hanging state (without the QFE) in certain cases due to this.
    And of course, install and set up the Test Attachment Cleaner command line power tool. This should be done following some guidelines from Ravi Shanker: "When you run TAC, ensure that you are deleting small chunks of data at regular intervals (say run TAC every night at 3AM to delete data that is between age 730 to 731 days) – this will ensure that small amounts of data are being deleted and SQL ghosted record cleanup can catch up with the number of deletes performed." This rule minimizes the risk of the ghosted hang problem occurring, and further makes it easier for the SQL Server ghosting process to work smoothly. "Run DBCC SHRINKDB post the ghosted records are cleaned up to physically reclaim the space on the file system." This is the last step in a 3-step process of removing SQL Server data: first the records are logically deleted, then they are cleaned out by the ghosting process, and finally the space is reclaimed using the shrinkdb command.
    Cleaning out the attachments
    The TAC is run from the command line using a set of parameters and controlled by a settings file. The parameters point out a server URI including the team project collection and also point at a specific team project. So in order to run this for multiple team projects regularly, one has to set up a script to run the TAC multiple times, once for each team project.
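    A rough sketch of such a script (not part of the original article): a Windows batch file that loops over the team projects and issues the TAC command line shown further below, once per project. The collection URL, project names and paths are placeholders, and it assumes it is run from the folder where the TAC is installed:

    @echo off
    rem Hypothetical nightly cleanup: run the Test Attachment Cleaner once per team project.
    rem Replace the collection URL, the settings file path and the project names with your own.
    set COLLECTION=http://your-tfs-server:8080/tfs/DefaultCollection
    set SETTINGS=C:\TfsMaintenance\nightly-cleanup.xml

    for %%P in ("TeamProjectA" "TeamProjectB" "TeamProjectC") do (
      tcmpt attachmentcleanup /collection:%COLLECTION% /teamproject:"%%~P" /settingsfile:"%SETTINGS%" /outputfile:"%TEMP%\%%~P.tcmpt.log" /mode:delete
    )

    Scheduling that file with the Windows Task Scheduler (for example at 3AM, as Ravi suggests above) keeps each night's delete batch small.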
    When you install the TAC there is a very useful readme file in the same directory. When the deployment binaries are published to the TFS server, ALL items are published up from the deployment folder. That often means many more files than you would assume are necessary. This is a brute force technique; it works, but you need to take care when cleaning up. Grant has shown what their settings file looks like in his blog post, removing all attachments older than 180 days, as long as there are no active work items connected to them. This setting can be useful to clean out all items, both in a one-off clean-up operation and in a general maintenance schedule. There are two scenarios we need to consider:
    1. Cleaning up an existing overgrown database
    2. Maintaining a server to avoid an overgrown database, using scheduled TAC runs
    1. Cleaning up a database which has grown too big due to these attachments
    This job is a "once" job. We do this once and then move on to make sure it won't happen again, by taking the actions in 2) below. In this scenario you should only consider the large files. Your goal should be simply to reduce the size; don't bother about the smaller stuff. That can be left to a scheduled TAC cleanup (2 below). Here you can use a very general settings file and just remove the large attachments, or you can choose to remove any old items. Grant's settings file is an example of the latter. A settings file to remove only large attachments could look like this:
    <!-- Scenario : Remove large files -->
    <DeletionCriteria>
      <TestRun />
      <Attachment>
        <SizeInMB GreaterThan="10" />
      </Attachment>
    </DeletionCriteria>
    Or like this: if you want to remove only dll's and pdb's above that size, add an Extensions section. Without that section, all extensions will be deleted.
    <!-- Scenario : Remove large files of type dll's and pdb's -->
    <DeletionCriteria>
      <TestRun />
      <Attachment>
        <SizeInMB GreaterThan="10" />
        <Extensions>
          <Include value="dll" />
          <Include value="pdb" />
        </Extensions>
      </Attachment>
    </DeletionCriteria>
    Before you start up your scheduled maintenance, you should clear out all older items.
    2. Scheduled maintenance using the TAC
    Run a schedule every night, remove old items, and remove them in small batches. It is important to run this often, like every night, in order to keep the number of deleted items low; that way the SQL ghost process works better. One approach could be to delete all items older than some number of days, let's say 180 days. This could be combined with restricting it to keep attachments with active or resolved bugs. Doing this every night ensures that only small amounts of data are deleted.
    <!-- Scenario : Remove old items except if they have active or resolved bugs -->
    <DeletionCriteria>
      <TestRun>
        <AgeInDays OlderThan="180" />
      </TestRun>
      <Attachment />
      <LinkedBugs>
        <Exclude state="Active" />
        <Exclude state="Resolved"/>
      </LinkedBugs>
    </DeletionCriteria>
    In my experience there are projects which are left with active or resolved work items although no further work is done. It can be wise to have a cleanup process with no restrictions on linked bugs at all. Note that you then have to remove the whole LinkedBugs section. An approach which could work better here is a two-step approach: use the schedule above with no LinkedBugs as a sweeper task, cleaning away all data older than you could possibly care about. Then have another scheduled TAC task to take out more specifically the attachments that you are not likely to use.
    This task could be much more specific and, based on your analysis, clean out what you know is troublesome data.
    <!-- Scenario : Remove specific files early -->
    <DeletionCriteria>
      <TestRun>
        <AgeInDays OlderThan="30" />
      </TestRun>
      <Attachment>
        <SizeInMB GreaterThan="10" />
        <Extensions>
          <Include value="iTrace"/>
          <Include value="dll"/>
          <Include value="pdb"/>
          <Include value="wmv"/>
        </Extensions>
      </Attachment>
      <LinkedBugs>
        <Exclude state="Active" />
        <Exclude state="Resolved" />
      </LinkedBugs>
    </DeletionCriteria>
    The readme document for the TAC says that it recognizes "internal" extensions, but in practice it recognizes any extension. To run the tool, use the following command:
    tcmpt attachmentcleanup /collection:your_tfs_collection_url /teamproject:your_team_project /settingsfile:path_to_settingsfile /outputfile:%temp%/teamproject.tcmpt.log /mode:delete
    Shrinking the database
    You could run a shrink database command after the TAC has run in cases where a lot of data has been deleted. In that case you SHOULD do it, to free up all that space. But after the shrink operation you should rebuild the indexes, since the shrink operation will leave the database in a very fragmented state, which will reduce performance. Note that you need to rebuild the indexes; reorganizing is not enough. For smaller amounts of data you should NOT shrink the database, since the space will be reused by the SQL Server when it needs to add more records. In fact, it is regarded as bad practice to shrink the database regularly, so on a daily maintenance schedule you should NOT shrink the database. To shrink the database you run a DBCC SHRINKDATABASE command and then follow up with an index rebuild afterwards; DBCC INDEXDEFRAG only reorganizes, so it is not sufficient here. I find the easiest way to do this is to create a SQL maintenance plan including the Shrink Database Task and the Rebuild Index Task, and just execute it when you need to.
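    As a rough illustration of that occasional shrink-then-rebuild step, here is a minimal T-SQL sketch (not from the article; the collection database name matches the earlier examples, and the dynamic rebuild loop is a generic pattern, so a maintenance plan with the Shrink Database and Rebuild Index tasks achieves the same result):

    -- Hypothetical post-cleanup maintenance for a collection database (run only after large TAC deletes).
    USE Tfs_DefaultCollection;
    GO
    DBCC SHRINKDATABASE (Tfs_DefaultCollection);
    GO
    -- Shrinking fragments the indexes, so rebuild them afterwards (reorganizing is not enough).
    DECLARE @sql nvarchar(max) = N'';
    SELECT @sql = @sql + N'ALTER INDEX ALL ON ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N' REBUILD;' + CHAR(10)
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON t.schema_id = s.schema_id
    WHERE EXISTS (SELECT 1 FROM sys.indexes AS i WHERE i.object_id = t.object_id AND i.index_id > 0);
    EXEC sp_executesql @sql;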

    Read the article

< Previous Page | 61 62 63 64 65 66  | Next Page >