Search Results

Search found 7318 results on 293 pages for 'team ferrari22'.


  • Upcoming EMEA, APAC & US Events with MySQL in 2014

    - by Lenka Kasparova
    As an update to the previous announcement from Mar 25, 2014, please find below the updated list of events that the MySQL Community team is attending and/or supporting. This time you can find not only the EMEA & APAC ones but also conferences & events we are covering in the US & Canada. You are invited to meet our engineers at the events below.

    EMEA

    - NEW!! BGOUG, Sandanski, Bulgaria, June 13, 2014 - Georgi Kodinov will attend and speak at this local Oracle User Group event. Feel free to come.
    - PHP Tour Lyon, Lyon, France, June 23-24, 2014 - The MySQL team is going to be part of this show as well. We are not going to have a booth here, but expect very active networking by our French MySQL team around the event. Come to meet us and talk to us!
    - NEW!! Converge Conference, Glasgow, Scotland, August 15-16, 2014 - MySQL Community Manager David Stokes attends with a MySQL talk.
    - NEW!! CakeFest, Madrid, Spain, August 21-24, 2014 - A talk on "Scaling Your MySQL instances AND keeping your Sanity" will be given by MySQL Community Manager David Stokes.
    - Froscon 2014, St. Augustin, Germany, August 23-24, 2014 - Please visit our booth, and watch the Froscon website for schedule updates.
    - NEW!! SymfonyLive, London, UK, September 25-26, 2014 - MySQL Community Managers David Stokes & Morgan Tocker submitted MySQL talks for this show. The schedule will be announced later on.
    - DrupalCon Amsterdam, The Netherlands, September 29-October 3, 2014 - Meet us at our booth at DrupalCon Amsterdam. For the schedule please watch the DrupalCon website.
    - All Your Base, Oxford, UK, October 17, 2014 - Come visit our MySQL booth and talk to our MySQL experts.
    - NEW!! WebTechCon / IPC, Munich, Germany, October 26-29, 2014
    - NEW!! DOAG, Nuremberg, Germany, November 18-20, 2014 - There will be a full day of MySQL talks and one full day of MySQL workshops & sessions with live demos. This event is simply hard to miss!
    - NEW!! Forum PHP, Paris, France, November 21-22, 2014 - More details: TBD
    - NEW!! UK OUG, Liverpool, UK, December 8-10, 2014 - MySQL will be part of the Oracle booth, and we hope to get more space for MySQL talks.

    USA

    - NEW!! Texas Linux Fest, Austin, TX, US, June 13-14, 2014
    - NEW!! SouthEast Linux Fest, Charlotte, US, June 20-22, 2014
    - NEW!! Debian Conference 2014, Portland, OR, US, August 23-31, 2014
    - NEW!! FossetCon, Orlando, US, September 11-13, 2014
    - NEW!! Oracle OpenWorld, San Francisco, US, September 29-October 3, 2014
    - NEW!! MySQL Central @ OpenWorld, San Francisco, US, September 29-October 3, 2014
    - NEW!! PyTexas 2014, Dallas, TX, US, October 3-5, 2014
    - NEW!! All Things Open (replacing POSSCON), Raleigh, NC, October 23-24, 2014
    - NEW!! Ohio LinuxFest 2014, Columbus, OH, US, October 24-25, 2014
    - NEW!! ZendCon PHP, Santa Clara, US, October 27-30, 2014
    - NEW!! Kuali Days 2014, Indianapolis, US, November 10-13, 2014
    - NEW!! Live 360, Orlando, FL, US, November 17-20, 2014

    APAC

    - OpenSourceConference Japan, Hokkaido, June 13-14, 2014 - MySQL is represented by Ryusuke Kajiyama with a talk on "MySQL Technology Updates".
    - NEW!! db tech showcase, Osaka, Japan, June 18-20, 2014 - Three MySQL talks are scheduled for this show: "MySQL for Oracle DBA" & "MySQL Technology Updates" by Ryusuke Kajiyama, and a talk on MySQL Fabric by Yoshiaki Yamasaki.
    - NEW!! PyCon Singapore, Singapore, June 18-20, 2014 - Ryusuke Kajiyama will be talking about "Sharding and scale-out using Python-based MySQL Fabric".
    - NEW!! COSCUP, Taipei, Taiwan, July 19-20, 2014 - We are going to run a technical session on MySQL Workbench and one talk on how to make MySQL better.
    - NEW!! PyCon New Zealand, Wellington, New Zealand, September 13-14, 2014 - MySQL talks were submitted, as well as one talk by the Solaris Modernization team on Python & Solaris; watch the website for schedule updates.
    - NEW!! PyCon Japan, Tokyo, Japan, September 13-15, 2014 - MySQL will have a session speaker; no schedule is announced yet.
    - Ruby Kaigi, Tokyo, Japan, September 18-20, 2014 - Another event MySQL supports and attends in the APAC region. Ruby Kaigi is the international Ruby conference in Tokyo, Japan. Ruby started in Japan, so Ruby Kaigi has excellent speakers and developers! The MySQL team is going to be present at this conference with MySQL talks and active networking around the venue.
    - NEW!! PyCon India, Bangalore, India, September 26-28, 2014 - A MySQL talk on "MySQL Utilities scaling MySQL with Python" has been submitted; please watch the PyCon website for schedule updates.
    - NEW!! OpenSourceConference Japan, Tokyo, October 18-19, 2014
    - NEW!! OpenSource India, Bengaluru, India, November 7-8, 2014
    - NEW!! OpenSourceConference Japan, Fukuoka, November 14-15, 2014

    You can check the MySQL wikis for updates on the conferences we are attending. Next time I hope to have more details for each event above (especially for the US ones).

    Read the article

  • Pella Increases Online Appointment Scheduling and Rapidly Personalizes and Updates Marketing Initiatives

    - by Michael Snow
    Originally posted on the Oracle Customers page.

    Oracle Customer: Pella Corporation
    Location: Pella, Iowa
    Industry: Industrial Manufacturing
    Employees: 7,100

    Pella Corporation is an innovative leader in creating a better view for homes and businesses by designing, testing, manufacturing, and installing quality windows and doors for new construction, remodeling, and replacement applications. A family-owned company, Pella has an 88-year history of innovation and, today, is the second-largest manufacturer of windows and doors in the country, including patio, entry, and storm doors. The company has 10 manufacturing facilities in the United States and window and door showrooms across the United States and Canada.

    In-home consultations are an important part of Pella's sales process. Several years ago, the company launched an online appointment scheduling tool to improve customer convenience. While the functionality worked well, the company wanted to increase online conversion rates and decrease the number of incomplete online appointment schedules. It also wanted to give its business analysts and other line-of-business personnel the ability to update the scheduling tool and interface quickly, without needing IT team intervention and recoding, to better capitalize on opportunities and personalize the interface for specific markets. Pella also looked to reduce IT complexity by selecting a system that integrated easily with its Oracle E-Business Suite Release 12.1 enterprise applications.

    Pella, which has a large Oracle footprint, selected Oracle WebCenter Sites as the foundation for its new, real-time appointment scheduling application. It used the solution to re-engineer the scheduling process and the information required to set up an appointment. Just a few months after launch, it is seeing improvement in the number of appointments booked online and experiencing fewer abandoned appointments during the scheduling process. As important, Pella can now quickly and easily make changes to images, video, and content displayed on the scheduling tool interface, delivering greater business agility. Previously, such changes required a developer and weeks of coding and testing. Today, a member of Pella's business analyst team can complete the changes in hours. This capability enables Pella to personalize the Web experience for customers. For example, it can display different products or images for clients in different regions.

    The solution is also highly scalable. Pella is using Oracle WebCenter Sites for appointment scheduling now and plans to migrate Pella.com, its configurator tool, and dealer microsites onto the platform. Further, Pella plans to leverage the solution to optimize the experience on mobile devices. "Moving ahead, we expect to extensively leverage Oracle WebCenter Sites to gain greater flexibility in updating the Web experience, thanks to the ability to make updates quickly without developer resources. Segmentation and targeting capabilities will allow us to create a more personalized experience across both traditional and mobile platforms," said Teri Lancaster, IT manager, customer experience applications, Pella Corporation.

    A word from Pella Corporation

    "Oracle WebCenter Sites, from the start, delivered important benefits. We've redesigned the online scheduling process and are seeing more potential customers completing consultation bookings online. More important, the solution opens a world of other possibilities as we plan to migrate Pella.com and our dealer microsites to the platform, and leverage it to optimize the Web experience for our mobile devices." - Teri Lancaster, IT Manager, Customer Experience Applications, Pella Corporation

    Oracle Product and Services

    Oracle WebCenter Sites

    Why Oracle

    Pella has a long-standing relationship with Oracle. "We look to Oracle first for a solution. Our Oracle account team came to us with several solutions, and Oracle WebCenter Sites delivered the scalability, ease of use, and flexibility that we required for the appointment scheduling initiative and other Web projects on the horizon, including migrating Pella.com and optimizing our site for mobile platforms," said Teri Lancaster, IT manager, customer experience applications, Pella Corporation.

    Implementation Process

    The Pella implementation team, working with Oracle partner Element Solutions, LLC, integrated the appointment-setting application with Pella.com as well as the company's Oracle E-Business Suite customer relationship management applications. Using Oracle WebCenter Sites' development tools and Subversion capabilities to develop the application, the Element Solutions and Pella teams could work remotely and collaboratively, accelerating deployment. Pella went live with the new scheduling tool in just six months.

    Partner

    Oracle Partner: Element Solutions, LLC

    Element Solutions was instrumental at every major stage of the project, including design creation and approval, development, training, and rollout. "Element Solutions was a vital partner for our Oracle WebCenter Sites initiative. The team provided guidance and, more important, critical knowledge transfer at every stage, which equipped us to get the most out of this powerful and versatile solution. We were definitely collaboration partners," Lancaster said.

    Resources

    Pella Corporation Upgrades Enterprise Applications to Continue to Improve Manufacturing Efficiency
    Thousands of Customers Successfully and Smoothly Upgrade to Oracle E-Business Suite 12.1 for New Functionality, Lower Operating Costs and Improved Shared Operations
    Managing the Virtual World

    Read the article

  • SQL SERVER – SSMS: Top Object and Batch Execution Statistics Reports

    - by Pinal Dave
    The month of June till the middle of July has been a fever of sports. First it was Wimbledon tennis, and then soccer fever was all over. There is a huge number of fan followers, and it is great to see the level at which people sometimes worship these sports. Being an Indian, I cannot forget to mention the India tour of England in the later part of July. Following these sports as the events unfold to the finals, there are a number of ways the statisticians can slice and dice the numbers. Taking a cue from soccer, I can surely say there is team performance against another team, and then there is how an individual member fares against a particular opponent. Such statistics give us a fair idea of how teams have fared against each other in the past or the recent past: head-to-head stats during the World Cup and during other neutral-venue games. All these statistics are just pointers. In reality, they don't reflect the calibre of the current team, because the individuals who performed in each of these games are totally different (a typical example being the Brazil vs. Germany semi-final match in FIFA 2014). So at times these numbers are misleading. It is worth investigating and getting the next level of information.

    Similar to these statistics, SQL Server Management Studio is also equipped with a number of reports, like a) the Object Execution Statistics report and b) the Batch Execution Statistics report. As discussed in the example, the team scorecard is like the Batch Execution Statistics and the individual stats are like the Object Level Statistics. The analogy can be taken only this far; trust me, there is no correlation between SQL Server functioning and playing sports. It is like I think about diet all the time except while I am eating.

    Performance – Batch Execution Statistics

    Let us view the first report, which can be invoked from Server Node -> Reports -> Standard Reports -> Performance – Batch Execution Statistics. Most of the values that are displayed in this report come from the DMVs sys.dm_exec_query_stats and sys.dm_exec_sql_text(sql_handle). This report contains 3 distinct sections, as outlined below.

    Section 1: This is a graphical bar-graph representation of Average CPU Time, Average Logical Reads, and Average Logical Writes for individual batches. The batch numbers are indicative, and the details of each batch are available in Section 3 (detailed below).

    Section 2: This represents a pie chart of all the batches by Total CPU Time (%) and Total Logical IO (%). This graphical representation tells us which batch consumed the highest CPU and IO since the server started, provided the plan is available in the cache.

    Section 3: This is the section where we can find the SQL statements associated with each of the batch numbers. It also gives us the Average CPU, Average Logical Reads, and Average Logical Writes in the system for the given batch, with object details. Expanding the rows, I will also get the # Executions and # Plans Generated for each of the queries.

    Performance – Object Execution Statistics

    The second report worth a look is Object Execution Statistics. This is a similar report to the previous one, but turned on its head by SQL Server objects. The report has 3 areas to look at, as above. Section 1 gives the Average CPU and Average IO bar charts for specific objects. Section 2 is a graphical representation of Total CPU by objects and Total Logical IO by objects. The final section details the various objects with the Avg. CPU, IO, and other details, which are self-explanatory.

    At a high level, both reports are based on queries over two DMVs (sys.dm_exec_query_stats and sys.dm_exec_sql_text) and build their values using calculations over columns in them:

        SELECT *
        FROM sys.dm_exec_query_stats s1
        CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS s2
        WHERE s2.objectid IS NOT NULL
          AND DB_NAME(s2.dbid) IS NOT NULL
        ORDER BY s1.sql_handle;

    This is one of the simplest forms of report, and in future blogs we will look at more complex ones. I truly hope that these reports can give DBAs and developers a hint about the possible performance tuning areas. As a closing point I must emphasize that all the above reports pick up data from the plan cache. If a particular query consumed a lot of resources earlier, but its plan is not available in the cache, none of the above reports will show that bad query.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL Tagged: SQL Reports
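    If you want the same numbers without the report UI, a query in the spirit of Section 3 can be run directly against those DMVs. The following is a rough sketch of what the Object Execution Statistics report aggregates (my own approximation of the grouping and columns, not the report's actual query):

        -- Per-object averages from the plan cache (cache-resident plans only).
        SELECT  DB_NAME(st.dbid)                  AS database_name,
                OBJECT_NAME(st.objectid, st.dbid) AS object_name,
                SUM(qs.execution_count)           AS executions,
                SUM(qs.total_worker_time)    / SUM(qs.execution_count) AS avg_cpu_microsec,
                SUM(qs.total_logical_reads)  / SUM(qs.execution_count) AS avg_logical_reads,
                SUM(qs.total_logical_writes) / SUM(qs.execution_count) AS avg_logical_writes
        FROM    sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        WHERE   st.objectid IS NOT NULL
          AND   DB_NAME(st.dbid) IS NOT NULL
        GROUP BY DB_NAME(st.dbid), OBJECT_NAME(st.objectid, st.dbid)
        ORDER BY avg_cpu_microsec DESC;

    As with the reports themselves, anything whose plan has aged out of the cache is invisible to this query.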

    Read the article

  • SQL Server devs–what source control system do you use, if any? (answer and maybe win free stuff)

    - by jamiet
    Recently I noticed a tweet from notable SQL Server author and community dude-at-large Steve Jones in which he asked how many SQL Server developers were putting their SQL Server source code (i.e. DDL) under source control (I'm paraphrasing because I can't remember the exact tweet and Twitter's search functionality is useless). The question surprised me slightly as I thought a more pertinent question would be "how many SQL Server developers are not using source control?" because I have been doing just that for many years now and I simply assumed that use of source control is a given in this day and age. Then I started thinking about it. "Perhaps I'm wrong," I pondered, "perhaps the SQL Server folks that do use source control in their day-to-day jobs are in the minority."

    So, dear reader, I'm interested to know a little bit more about your use of source control:

    - Are you putting your SQL Server code into a source control system?
    - If so, what source control server software (e.g. TFS, Git, SVN, Mercurial, SourceSafe, Perforce) are you using?
    - What source control client software are you using (e.g. TFS Team Explorer, Tortoise, Red Gate SQL Source Control, Red Gate SQL Connect, Git Bash, etc…)?
    - Why did you make those particular software choices?
    - Any interesting anecdotes to share in regard to your use of source control and SQL Server?

    To encourage you to contribute, I have five pairs of licenses for Red Gate SQL Source Control and Red Gate SQL Connect to give away to what I consider to be the five best replies ("best" is totally subjective of course, but this is my blog so my decision is final). If you want to be considered, don't forget to leave contact details; an email address, Twitter handle or similar will do.

    To start you off and to perhaps get the brain cells whirring, here are my answers to the questions above:

    Are you putting your SQL Server code into a source control system? As I think I've already said…yes. Always.

    If so, what source control server software are you using? I move around a lot between many clients so it changes on a fairly regular basis; my current client uses Team Foundation Server (aka TFS) and, as part of a separate project, is trialing the use of Team Foundation Service. I have used SVN extensively in the past, which I am a fan of (I generally prefer it to TFS), and am trying to get my head around Git by using it for ObjectStorageHelper.

    What source control client software are you using? On my current project, Team Explorer. In the past I have used Tortoise to connect to SVN.

    Why did you make those particular software choices? I generally use whatever the client uses, and given that I work with SQL Server I find that the majority of my clients use TFS, I guess simply because they are Microsoft development shops.

    Any interesting anecdotes to share in regard to your use of source control and SQL Server? Not an anecdote as such, but I am going to share some frustrations about TFS. In many ways TFS is a great product because it integrates many separate functions (source control, work item tracking, build agents) into one whole, and I'm firmly of the opinion that that is a good thing, if for no reason other than being able to associate your check-ins with a work item. However, like many people, there are aspects of TFS source control that annoy me day-in, day-out.

    Chief among them has to be the fact that it uses a file's read-only property to determine if a file should be checked out or not and, if it determines that it should, it will happily do that check-out on your behalf without you even asking it to. I didn't realise how ridiculous this was until I first used SVN about three years ago; with SVN you make any changes you wish and then use your source control client to determine which files have changed and thus should be checked in; the notion of "check-out" doesn't even exist. That sounds like a small thing, but you don't realise how liberating it is until you actually start working that way.

    Hoping to hear some more anecdotes and opinions in the comments. Remember….free software is up for grabs!

    @jamiet

    Read the article

  • T-SQL Tuesday #21 - Crap!

    - by Most Valuable Yak (Rob Volk)
    Adam Machanic's (blog | twitter) ever-popular T-SQL Tuesday series is being held on Wednesday this time, and the topic is… SHIT CRAP. No, not fecal material.  But crap code.  Crap SQL.  Crap ideas that you thought were good at the time, or were forced to do due (doo-doo?) to lack of time.

    The challenge for me is to look back on my SQL Server career and find something that WASN'T crap.  Well, there's a lot that wasn't, but for some reason I don't remember those that well.  So the additional challenge is to pick one particular turd that I really wish I hadn't squeezed out.  Let's see if this outline fits the bill:

    - An ETL process on text files;
    - That had to interface between SQL Server and an AS/400 system;
    - That didn't use SSIS (should have) or BizTalk (ummm, no), but command-line scripting, using Unix utilities(!) via xp_cmdshell;
    - That had to email reports and financial data, some of it sensitive.

    Yep, the stench smell is coming back to me now, as if it were yesterday…

    As to why SSIS and BizTalk were not options: basically I didn't know either of them well enough to get the job done (and I still don't).  I also had a strict deadline of 3 days, in addition to all the other responsibilities I had, so no time to learn them.  And seeing how screwed up the rest of the process was:

    - Payment files from multiple vendors in multiple formats;
    - Sent via FTP, PGP-encrypted email, or some other wizardry;
    - Manually opened/downloaded and saved to a particular set of folders (couldn't change this);
    - Once processed, had to be placed BACK in the same folders with the original archived;
    - x2 divisions that had to run separately;
    - Plus an additional vendor file in another format on a completely different schedule;
    - So that they could be MANUALLY uploaded into the AS/400 system (couldn't change this either, even if it was technically possible)

    …I didn't feel so bad about the solution I came up with, which was naturally:

    - Copy the payment files to the local SQL Server drives, using xp_cmdshell
    - Run batch files (via xp_cmdshell) to parse the different formats using sed, a Unix utility (this was before PowerShell)
    - Use other Unix utilities (join, split, grep, wc) to process parsed files and generate metadata (size, date, checksum, line count)
    - Run sqlcmd to execute a stored procedure that passed the parsed file names so it would bulk load the data to do a comparison
    - bcp the compared data out to ANOTHER text file so that I could grep that data out of the original file
    - Run another stored procedure to import the matched data into SQL Server so it could process the payments, including file metadata
    - Process payment batches and log which division and vendor they belong to
    - Email the payment details to the finance group (since it was too hard for them to run a web report with the same data…which they ran anyway to compare the emailed file against…which always matched, surprisingly)
    - Email another report showing unmatched payments so they could manually void them…about 3 months afterward
    - All in "Excel" format, using xp_sendmail (SQL 2000 system)
    - Copy the unmatched data back to the original folder locations, making sure to match the file format exactly (if you've ever worked with ACH files, you'll understand why this sucked)

    If you're one of the 10 people who have read my blog before, you know that I love the DOS "for" command.  Like passionately.  Like fairy-tale love.  So my batch files were riddled with for loops, nested within other for loops, that called other batch files containing for loops. I think there was one section that had 4 or 5 nested for commands.  It was wrong, disturbed, and completely un-maintainable by anyone, even myself.  Months, even a year, after I left the company I got calls from someone who had to make a minor change to it, and I had to talk them out of spraying the office with an AK-47 after looking at this code.  (for you Star Trek TOS fans)

    The funniest part of this, well, one of the funniest, is that I made the deadline…sort of, I was only a day late…and the DAMN THING WORKED practically unchanged for 3 years.  Most of the problems came from the manual parts of the overall process, like forgetting to decrypt the files, or missing/late files, or files saved to the wrong folders.  I'm definitely not trying to toot my own horn here, because this was truly one of the dumbest, crappiest solutions I ever came up with.  Fortunately, as far as I know, it's no longer in use and someone has written a proper replacement.  Today I would knuckle down and do it in SSIS or PowerShell, even if it took me weeks to get it right.

    The real lesson from this crap code is to make things MAINTAINABLE and UNDERSTANDABLE.  sed regular-expression scripting doesn't fit that criterion in any way.  If you ever find yourself under pressure to do something fast at all costs, DON'T DO IT.  Stop and consider long-term maintainability, not just for yourself but for others on your team.  If you can't explain the basic approach in under 5 minutes, it ultimately won't succeed.  And while you may love to leave all that crap behind, it may follow you anyway, and you'll step in it again.

    P.S. - if you're wondering about all the manual stuff that couldn't be changed, it was because the entire process had gone through Six Sigma and was deemed the best possible way.  Phew!  Talk about stink!
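    For readers who never lived through this era, here is a hedged sketch of the xp_cmdshell glue pattern described above. Every path, file name, and procedure name is invented for illustration; this is the shape of the approach, not the original code:

        -- Hypothetical reconstruction of the pattern (SQL 2000 era):
        -- 1. Pull the payment files onto the server's local drive
        EXEC master..xp_cmdshell 'copy \\fileserver\payments\*.txt D:\etl\incoming\';
        -- 2. Shell out to a batch file that runs sed/join/grep/wc to parse the formats
        EXEC master..xp_cmdshell 'D:\etl\parse_payments.bat D:\etl\incoming D:\etl\parsed';
        -- 3. Bulk load a parsed file and hand it to a comparison procedure
        BULK INSERT dbo.PaymentStaging FROM 'D:\etl\parsed\vendor1.txt';
        EXEC dbo.usp_ComparePayments @FileName = 'vendor1.txt';
        -- 4. bcp the matched rows back out so grep can pull them from the originals
        EXEC master..xp_cmdshell 'bcp "SELECT * FROM ETL.dbo.MatchedPayments" queryout D:\etl\matched.txt -c -T';

    If that reads as fragile to you, that is exactly the point of the story.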

    Read the article

  • Using SQL Source Control with Fortress or Vault - Part 1

    - by AjarnMark
    I am fanatical when it comes to managing the source code for my company.  Everything that we build (in source form) gets put into our source control management system.  And I'm not just talking about the UI and middle-tier code written in C# and ASP.NET, but also the back-end database stuff, which at times has been a pain.  We even script out our Scheduled Jobs and keep a copy of those under source control.

    The UI and middle-tier stuff has long been easy to manage, as we mostly use Visual Studio, which has integration with source control systems built in.  But the SQL code has been a little harder to deal with.  I have been doing this for many years, well before Microsoft came up with Data Dude, so I had already established a methodology that, while not as smooth as VS, nonetheless let me keep things well controlled and allowed me to do my database development in my tool of choice: Query Analyzer in days gone by, and now SQL Server Management Studio.  It just makes sense to me that if I'm going to do database development, let's use the database tool set.  (Although, I have to admit I was pretty impressed with the demo of Juneau that Don Box did at the PASS Summit this year.)  So as I was saying, I had developed a methodology that worked well for us (which I'll probably outline in a future post), but it could use some improvement.

    When Solutions and Projects were first introduced in SQL Management Studio, I thought we were finally going to get the same experience that we have in Visual Studio.  Well, let's say I was underwhelmed by Version 1 in SQL 2005, and apparently so were enough other people that by the time SQL 2008 came out, Microsoft decided that Solutions and Projects would be deprecated and completely removed from a future version.  So much for that idea.

    Then I came across SQL Source Control from Red Gate.  I have used several tools from Red Gate in the past, including my favorites SQL Compare, SQL Prompt, and SQL Refactor.  SQL Prompt is worth its weight in gold, and the others are great, too.  Earlier this year, we upgraded from our earlier product bundles to the new Developer Bundle, and in the process added SQL Source Control to our collection.  I thought this might really be the golden ticket I was looking for.  But my hopes were quickly dashed when I discovered that it only integrated with Microsoft Team Foundation Server and Subversion as the source code repositories.  We have been using SourceGear's Vault and Fortress products for years, and I wholeheartedly endorse them.  So I was out of luck for the time being, although there were a number of people voting for Vault/Fortress support on their feedback forum (as did I), so I had hope that maybe next year I could look at it again.

    But just a couple of weeks ago, I was pleasantly surprised to receive notice in my email that Red Gate had an Early Access version of SQL Source Control that worked with Vault and Fortress, so I quickly downloaded it and have been putting it through its paces.  So far, I really like what I see, and I have been quite impressed with Red Gate's responsiveness when I have contacted them with any issues or concerns.  I have had several communications with Gyorgy Pocsi at Red Gate and he has been immensely helpful and responsive.

    I must say that development with SQL Source Control is very different from what I have been used to.  This post is getting long enough, so I'll save some of the details for a separate write-up, but the short story is that in my regular mode, it's all about the script files.  Script files are King, and you dare not make a change to the database other than by way of a script file, or you are in deep trouble.  With SQL Source Control, you make your changes to your development database however you like.  I still prefer writing most of my changes in T-SQL, but you can also use any of the GUI functionality of SSMS to make your changes, and SQL Source Control "manages" the script for you.  Basically, when you first link your database to source control, the tool generates scripts for every primary object (tables and their indexes are together in one script, not broken out into separate scripts like DB Projects do) and those scripts are checked into your source control.  So, if you needed to, you could still do a GET from your source control repository and build the database from scratch.  But for the day-to-day work, SQL Source Control uses the same technique as SQL Compare to determine what changes have been made to your development database and how to represent those in your repository scripts.  I think that once I retrain myself to just work in the database and quit worrying about having to find and open the right script file, this will actually make us more efficient.

    And for deployment purposes, SQL Source Control integrates with the full SQL Compare utility to produce a synchronization script (or do a live sync).  This is similar in concept to Microsoft's DACPAC, if you're familiar with that.

    If you are not currently keeping your database development efforts under source control, definitely examine this tool.  If you already have a methodology that is working for you, then I still think this is worth a review and comparison to your current approach.  You may find it more efficient.  But remember that the version which integrates with Vault/Fortress is still in pre-release mode, so treat it with a little caution.  I have found it to be fairly stable, but there was one bug that I found which had inconvenient side-effects and could have really been frustrating if I had been running this on my normal active development machine.  However, I can verify that that bug has been fixed in a more recent build version (did I mention Red Gate's responsiveness?).

    Read the article

  • Keep it Professional - Multiple Environments

    - by AjarnMark
    I have certainly been reading blogs a whole lot more than writing them the last several weeks, and it's about time I got back to writing.  I have been collecting several topics and references for blog posts, some of which will probably just never get written as the timeliness of the topics fades over time.  Nonetheless, I'm back, and I think it is time to revive my Doing Business Right series, this time coming from the slant of managing a development team rather than the previous angle of being self-employed.  First up: separating Dev, Test, and Prod.

    A few months ago, Colin Stasiuk (@BenchmarkIT) wrote a great post about separating your Dev, Test/UAT, and Prod environments.  This post covers all the important points, such as removing Developer access from both PROD and UAT, and the importance of proper deployment (a.k.a. promotion) procedures.  I won't repeat it all here, go read the original!  But what I do want to address is what I believe to be the #1 excuse people use for not having separate environments: money.  I discussed this briefly in my comment on Colin's post at the time, but let me repeat it here and expand on it a bit.

    Don't let the size of your company or the size of its budget dictate whether you do things professionally or not.  I am convinced that most developers and development teams would agree that it is a best practice to have separate environments for development, testing, and production (a.k.a. Live).  So why don't they?  Because they think that it means separate servers, which means more money.  While having separate physical servers for the different environments would be ideal, it is not an absolute requirement in order to make this work.  Here are a few ideas:

    - Use multiple instances of SQL Server and multiple Web Sites with host headers or ports.  For no additional fees* you can install multiple instances of SQL Server on the same machine.  This gives you a nice separation, allowing you to even use the same database names as will appear in PROD, yet isolating the data and security access.  And in IIS, you can create multiple Web Sites on the same server just by using host headers or different port numbers to separate them.  This approach does still pose the risk of non-Prod environments impacting performance on Prod, but when your application is busy enough for that to be a concern, you can probably afford one of the other options.

    - Use desktop PCs instead of servers.  Instead of investing in full server-grade hardware, you can mimic the separate environments on old desktop PCs and at least get functional equivalency, if not performance matching.  The last I checked, Microsoft did not require separate licensing for SQL Server if that installation was used exclusively for dev or test purposes*.  There may be some version or performance differences between this approach and what you have in Prod, but you have isolated test from impacting Prod resources this way.

    - Virtualization.  This is of course one of the hot topics of the day, and I would be remiss if I did not suggest this.  It is quite easy these days to set up virtual machines so that, again, your environments are fairly isolated from one another, and you retain all the security and procedural benefits of having separate environments.

    So the point is, keep your high professional standards intact.  You don't need to compromise on proper procedure just because you work in a small company with a small budget.  Keep doing things the right way!

    By the way, where I work, our DEV environment is not on a server.  All development is done on the developer's individual workstation, where it can be isolated from other developers' work for the duration of writing the code, but also where the developers have to reconcile (merge) differences in code under concurrent development.  This usually means that each change is executed multiple times (once per developer to update their environment with the latest changes from others), giving us an extra, informal test deployment before even going to the Test/UAT server.  It also means that if the network goes down, the developers can continue to hum along because they are not dependent on networked resources.  In fact, they will likely be even more productive because they aren't being interrupted by email…but that's another post I need to write.

    * I am not a lawyer, nor a licensing specialist, but it appeared to be so the last time I checked.  When in doubt, consult an expert on the topic.
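    One small habit that makes the shared-instance approach safer: verify which instance and database a session is pointed at before running anything destructive.  A trivial T-SQL check (the DEV/TEST instance naming below is hypothetical; use whatever convention your shop adopts):

        -- Which instance and database is this session actually connected to?
        -- e.g. MYSERVER\DEV vs. MYSERVER\TEST vs. the default (Prod) instance.
        SELECT  @@SERVERNAME                   AS server_and_instance,
                SERVERPROPERTY('InstanceName') AS instance_name,  -- NULL on a default instance
                DB_NAME()                      AS current_database;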

    Read the article

  • A tale of two dev accounts

    - by TechTwaddle
    Note: I am currently in the process of relocating my blog from http://www.geekswithblogs.net/techtwaddle to my new address at http://www.techtwaddle.net. I suggest you point your feed readers to the new address as I slowly transition to my new shared-hosted, ad-free WordPress blog :)

    You probably remember my rant from a while back about my Windows Mobile developer account having problems with the new App Hub; well, there have been a few developments and I thought I should share them with you.

    First up, the issue isn't fixed yet. I still cannot log in to App Hub using my Windows Mobile 6.x developer account and can't view details of my Minesweeper app. Who knows how many copies it's sold. I had numerous exchanges with Microsoft's support team on the App Hub forums and via email as well (support ticket), but somehow we never managed to get to the root of it. In fact, the support team itself grew so tired of the problem that they suggested I create a new dev account. I grew impatient, and it was really frustrating to have an app ready for submission but not being able to do anything with it. Eventually, the frustration had to show somewhere, and it was on this forum thread:

    Prabhu Kumar, in reply to Nick:

    Nick, I feel for you and totally understand the frustration. Since day one I have been getting the Xbox profile linking error, "We encountered an issue connecting your App Hub account with your Xbox Live Profile. Please visit Xbox.com and update your contact information. After you have updated your contact information, please return to the App Hub (https://users.create.msdn.com/Register) to continue." I have an app published on the Windows Mobile 6.x marketplace since Aug; now I can't view the details of this app. I completed work on my WP7 application 1.5 months ago and the first version is ready for submission to marketplace, only if I can login. You can imagine how frustrating all this can be; the issue has taken far too long to be fixed, and this has drained all my motivation. I have exchanged numerous mails with the Microsoft support team on this issue, and from the looks of it they really are trying their best; unfortunately, their best is not good enough for some of us. During the first week of December I was told that there would be an update happening to App Hub around the middle of December. I was hoping that the issue would be fixed, but it wasn't. After the update, the only change I notice is that the xbox.com link on the error page now takes me to the correct link. Previously, this link used to take me to the 404 page you mentioned above. Out of desperation, I am now considering creating another developer account on App Hub with a new Live ID; even this I am not 100% sure will work. I asked the support team when the next update to App Hub was planned and got this reply: "We do not have a release date to announce for the next App Hub update at this time. In regards to the login issue you are experiencing, at this point the only solution would be to create a new account with a different Live ID, but make sure to go to xbox.com beforehand to get all the information in order on that side." I know it's an extra $99, and not that I can't afford it, but it doesn't feel right and I shouldn't have to be doing it in the first place. I have lost all hope of this issue being resolved.

    I went ahead and created a new dev account; the ID verification was in progress when Shaun Taulbee of Microsoft, who has been really helpful in the forums, replied, saying:

    "If you find it necessary to pay again to create a new account due to a Microsoft problem, send in a support request asking for a refund and we'll review it (and likely approve it given the circumstances)."

    The thought of a refund made me happy, but I had my doubts. So once my second account was verified by GeoTrust, I applied for a refund through the developer dashboard by creating a support ticket. A couple of days later I got an email from Microsoft saying that the refund had been approved! Yay! A few days later the refund showed up on my bill. Well, thank you Microsoft, it means a lot. I am glad it's over now. The new account works flawlessly. I would still like to get my first account working again and look at my app numbers for Win Mo 6.x, and probably transfer the credits to the new account somehow, but I'll save that for another day.

    If you've had similar problems with App Hub and had to create a new account to submit your app, I suggest you contact the support team and get your dollars refunded!

    Read the article

  • Death March

    - by Nick Harrison
    It is a horrible sight to watch a project fail. There are few things as bad. Watching a project fail, regardless of the reason, is almost like sitting in a room with a Dementor from Harry Potter. It will literally suck all of the life and joy out of the room. Nearly every project that I have seen fail has failed because of political or management challenges. Sometimes there are technical challenges that bring a project to its knees, but usually projects fail for less technical reasons. Here are a few observations about projects failing for political reasons.

    Both the client and the consultants have to be committed to seeing the project succeed. Put simply, you cannot solve a problem when the primary stakeholders do not truly want it solved. This could come from a consultant being more interested in extending the engagement. It could come from a client being afraid of what will happen to them once the problem is solved. It could come from disenfranchised stakeholders. Sometimes a project is beset on all sides. When you find yourself working on a project that has this kind of threat, do all that you can to constrain the disruptive influences of the bad apples. If their influence cannot be constrained, you truly have no choice but to move on to a new project.

    Tough choices have to be made to make a project successful. These choices will affect everyone involved in the project. They may involve users not getting a change request through that they want. Developers may not get to use the tools that they want. Everyone may have to put in more hours than they originally planned. Steps may be skipped. Compromises will be made, but if everyone stays committed to the end goal, you can still be successful. If individuals start feeling disgruntled or resentful of the compromises reached, the project can easily be derailed. When everyone is not working towards a common goal, it is like driving with one foot on the brake and one foot on the accelerator. Not only will you not get to where you are planning, you will also damage the car and possibly the passengers as well.

    It is important to always keep the end result in mind. Regardless of the development methodology being followed, the end goal is not comprehensive documentation. In all cases, it is working software. Comprehensive documentation is nice but useless if the software doesn't work. You can never get so distracted by the next goal that you fail to meet the current goal. Most projects are ultimately marathons. This means that the pace must be sustainable. Regardless of the temptations, you cannot burn the team alive. Processes will fail. Technology will get outdated. Requirements will change, but your people will adapt and learn and grow. If everyone on the team, from the most senior analyst to the most junior recruit, trusts and respects each other, there is no challenge that they cannot overcome. When everyone involved faces challenges with the attitude "This is my project and I will not let it fail" and "You are my teammate and I will not let you fail," you will in fact not fail. When you find a team that embraces this attitude, protect it at all costs.

    Edward Yourdon wrote a book called Death March. In it, he included a graph for categorizing Death March project types based on the happiness of the team and the chances of success. Chances are we have all worked on Death March projects. We will all most likely work on more Death March projects in the future. To a certain extent, they seem to be inevitable, but they should never be "suicide" or "ugly" projects. Ideally, they can all be "Mission Impossible" projects, where everyone works hard, has fun, and knows that there is a good chance that they will succeed. If you are ever lucky enough to work on such a project, you will know the sense of pride that comes from the eventual success. You will recognize a profound bond with the team that you worked with. Chances are it will change your life, or at least your outlook on life.

    If you have not already read this book, get a copy and study it closely. It will help you survive and make the most out of your next Death March project.

    Read the article

  • Agile Executives

    - by Robert May
    Over the years, I have experienced many different styles of software development. In the early days, most of the development was waterfall development. In the last few years, I've become an advocate of Scrum. As I talked about last month, many people have misconceptions about what Scrum really is. The reason why we do Scrum at Veracity is because of the difference it makes in the lives of the team doing Scrum. Software is for people, and happy, motivated people will build better software. However, not all executives understand Scrum and how to get the information from development teams that use Scrum. I think that these executives need a support system for managing Agile teams.

    Historical Software Management

    When Henry Ford pioneered the assembly line, I doubt he realized the impact he'd have on management through the ages. Historically, management was about managing the process of building things. The people were just cogs in that process. Like all cogs, they were replaceable. Unfortunately, most of the software industry followed this same style of management. Many of today's senior managers learned how to manage companies before software was a significant influence on how the company did business. Software development is a very creative process, but too many managers have treated it like an assembly line: ideas go in, working software comes out, and we just have to make sure the ideas going in are perfect, and then the software will be perfect.

    Lean Manufacturing

    In the manufacturing industry, Lean manufacturing has revolutionized Henry Ford's assembly line. Derived from the Toyota process, Lean places emphasis on always providing value for the customer. Anything the customer wouldn't be willing to pay for is wasteful. Agile is based on similar principles. We're building software for people, and anything that isn't useful to them doesn't add value. Waterfall development would have teams build reams and reams of documentation about how the software should work. Agile development dispenses with this work because excessive documentation doesn't add value. Instead, teams focus on building documentation only when it truly adds value to the customer. Many other Agile principles are similar.

    Playing Catch-up

    Just like in the manufacturing industry, many managers in the software industry have yet to understand the value of the principles of Lean and Agile. They think they can wrap the uncertainties of software development up in a nice little package and then just execute, usually followed by failure. They spend a great deal of time and money trying to exactly predict the future. That expenditure of time and money doesn't add value to the customer. Managers that understand Agile know that there is a better way. They will instead focus on the priorities of the near term in detail, and leave the future to take care of itself. They have very detailed two-week plans with less detailed quarterly plans. These plans are guided by a general corporate strategy that doesn't focus on the exact implementation details. These managers also think in smaller features rather than large functionality. This adds a great deal of value for customers, since the features that matter most are the ones that the team focuses on in the near term and is then able to deliver to the customers that are paying for them. Agile managers also realize that stale software is very costly. They know that keeping the technology in their software current is much less expensive and risky than large rewrites that occur infrequently, so they schedule time in each release for refactoring of the existing software.

    Agile Executives

    Even though Agile is a better way, I've still seen failures using the Agile process. While some of these failures can be attributed to the team, most of them are caused by managers, not the team. Managers fail to understand what Agile is, how it works, and how to get the information that they need to make good business decisions. I think this is a shame. I'm very pleased that Veracity understands this problem and is trying to do something about it. Veracity is a key sponsor of Agile Executives. In fact, Galen is this year's acting president of Agile Executives. The purpose of Agile Executives is to help managers better manage Agile teams and see better success. Agile Executives is trying to build a community of executives that range from managers interested in Agile to managers that have successfully adopted Agile. Together, these managers can form a community of support and ideas that will help make Agile teams more successful.

    Helping Your Team

    You can help too! Talk with your manager and get them involved in Agile Executives. Help Veracity build the community. If your manager understands Agile better, he'll understand how to help his teams, which will result in software that adds more value for customers. If you have any questions about how you can be involved, please let me know.

    Technorati Tags: Agile, Agile Executives

    Read the article

  • SQL Server Developer Tools - Codename Juneau vs. Red-Gate SQL Source Control

    - by Ajarn Mark Caldwell
    So how do the new SQL Server Developer Tools (previously code-named Juneau) stack up against SQL Source Control?  Read on to find out.

    At the PASS Community Summit a couple of weeks ago, it was announced that the previously code-named Juneau software would be released under the name of SQL Server Developer Tools with the release of SQL Server 2012.  This replacement for Database Projects in Visual Studio (also known in a former life as Data Dude) has some great new features.  I won't attempt to describe them all here, but I will applaud Microsoft for making major improvements.  One of my favorite changes is the way database elements are broken down.  Previously, every little thing was in its own file.  For example, indexes were each in their own file.  I always hated that.  Now, SSDT uses a pattern similar to Red Gate's and puts the indexes and keys into the same file as the overall table definition.

    Of course there are really cool features to keep your database model in sync with the actual source scripts, and the rename refactoring feature is now touted as being more than just a search and replace, but rather a "semantic-aware" search and replace.  Funny, it reminds me of SQL Prompt's Smart Rename feature.  But I'm not writing this just to criticize Microsoft and argue that they are late to the party with this feature set.  Instead, I do see it as a viable alternative for folks who want all of their source code to be version controlled, but there are a couple of key trade-offs that you need to know about when you choose which tool set to use.

    First, the basics

    Both tool sets integrate with a wide variety of source control systems, including the most popular: Subversion, Git, Vault, and Team Foundation Server.  Both tools have integrated functionality to produce objects to upgrade your target database when you are ready (DACPACs in SSDT, integration with SQL Compare for SQL Source Control).  If you regularly live in Visual Studio or the Business Intelligence Development Studio (BIDS), then SSDT will likely be comfortable for you.  Like BIDS, SSDT is a Visual Studio project type that comes with SQL Server, and if you don't already have Visual Studio installed, it will install the shell for you.  If you already have Visual Studio 2010 installed, then it will just add this as an available project type.  On the other hand, if you regularly live in SQL Server Management Studio (SSMS), then you will really enjoy the SQL Source Control integration from within SSMS.  Both tool sets store their database model in script files.  In SSDT, these are on your file system like other source files; in SQL Source Control, these are stored in the folder structure in your source control system, and you can always GET them to your file system if you want to browse them directly.

    For me, the key differentiating factors are 1) a single, unified check-in, and 2) migration scripts.  How you value those two features will likely make your decision for you.

    Unified Check-In

    If you do a continuous-integration (CI) style of development that triggers an automated build with unit testing on every check-in of source code, and you use Visual Studio for the rest of your development, then you will want to really consider SSDT.  Because it is just another project in Visual Studio, it can be added to your existing Solution, and you can then do a complete, or unified, single check-in of all changes, whether they are application or database changes.  This is simply not possible with SQL Source Control, because it is in a different development tool (SSMS instead of Visual Studio) and there is no way to do one unified check-in between the two.  You CAN do really fast back-to-back check-ins, but there is the possibility that the automated build triggered from the first check-in will cause your unit tests to fail and the CI tool to report that you broke the build.  Of course, the automated build that is triggered from the second check-in, which contains the "other half" of your changes, should pass, and so the amount of time that the build was broken may be very, very short; but if that is very, very important to you, then SQL Source Control just won't work; you'll have to use SSDT.

    Refactoring and Migrations

    If you work on a mature system, or on a not-so-mature but also not-so-well-designed system, where you want to refactor the database schema as you go along but you can't have data suddenly disappearing from your target system, then you'll probably want to go with SQL Source Control.  As I wrote previously, there are a number of changes which you can make to your database that the comparison tools (both from Microsoft and Red Gate) simply cannot handle without the possibility (or probability) of data loss.  Currently, SSDT only offers you the ability to inject PRE and POST custom deployment scripts.  There is no way to insert your own script in the middle to override the default behavior of the tool.  In version 3.0 of SQL Source Control (an Early Access version is now available) you have the ability to create your own custom migration script to take the place of the commands that the tool would have run, and ensure the preservation of your data.  Or, even if the default tool behavior would have worked, if you simply know a better way, then you can take control and do things your way instead of theirs.

    You Decide

    In the environment I work in, our automated builds are not triggered off of check-ins, but off of the clock (currently once per night), and so there is no point at which the automated build and unit tests will be triggered without having both sides of the development effort already checked in.  Therefore having a unified check-in, while handy, is not critical for us.  As for migration scripts, these are critically important to us.  We do a lot of new development on systems that have already been in production for years, and it is not uncommon for us to need to do a refactoring of the database.  Because of the maturity of the existing system, that often involves data migrations or other additional SQL tasks that the comparison tools just can't detect on their own.  Therefore, the ability to create a custom migration script to override the tool's default behavior is very important to us.  And so, you can see why we will continue to use Red Gate SQL Source Control for the foreseeable future.
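    To make the migrations point concrete, here is a sketch of the kind of hand-written step a comparison tool cannot infer on its own: renaming a column without losing its contents.  The table and column names are hypothetical, and this is only an illustration of the pattern, not output from either tool.

        -- A schema compare sees "FullName disappeared, DisplayName appeared" and
        -- would script a DROP COLUMN plus an ADD COLUMN, silently discarding data.
        -- A hand-written migration preserves it:
        ALTER TABLE dbo.Customer ADD DisplayName NVARCHAR(200) NULL;
        GO
        UPDATE dbo.Customer SET DisplayName = FullName;
        GO
        ALTER TABLE dbo.Customer DROP COLUMN FullName;
        GO

    For a pure rename, EXEC sp_rename 'dbo.Customer.FullName', 'DisplayName', 'COLUMN'; does the same job in one step; the three-step form earns its keep when the new column also changes type, splits data, or needs backfilling from elsewhere.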

    Read the article

  • LDAP Structure: dc=example,dc=com vs o=Example

    - by PAS
    I am relatively new to LDAP, and have seen two types of examples of how to set up your structure. One method is to have the base being dc=example,dc=com, while other examples have the base being o=Example. Continuing along, you can have a group looking like:

        dn: cn=team,ou=Group,dc=example,dc=com
        cn: team
        objectClass: posixGroup
        memberUid: user1
        memberUid: user2
        ...

    or, using the "o" style:

        dn: cn=team, o=Example
        objectClass: posixGroup
        memberUid: user1
        memberUid: user2

    My questions are:

    - Are there any best practices that dictate using one method over the other?
    - Is it just a matter of preference which style you use?
    - Are there any advantages to using one over the other?
    - Is one method the old style, and one the new-and-improved version?

    So far, I have gone with the dc=example,dc=com style. Any advice the community could give on the matter would be greatly appreciated.

    Read the article

  • Use a MOO/MUD/MUSH for project collaboration?

    - by Clinton Blackmore
    Jeff Atwood recently posted about working with a team of programmers remotely. He spoke of pros and cons and of communicating with the team. One of the comments to his article says: Jeff, have you ever considered running a MOO for this? you can have any features you want to add to a MOO- mailing lists, tasks, and so on. All it takes is a moo server and learning moocode. Leetdoodsnonexistentramblings.blogspot.com on May 9, 2010 2:52 PM It is not clear to me how to contact the commenter (short of signing up for a social networking service I've never heard of), so I thought I'd ask here -- does anyone know what useful things you could do with a MOO (or MUD or MUSH) to promote collaboration on a team?

    Read the article

  • How do projects manage chef cookbooks when multiple teams manage multiple sets of cookbooks?

    - by strife25
    I am wondering how projects that have multiple component teams manage their sets of cookbooks. We are trying to figure out how an ops team can provide a set of "common component" cookbooks that can be re-used by other teams that will also write their own cookbooks. For example, the ops team should own the Java cookbook, while a component team manages the cookbooks written for its own component or build engines. From my limited experience with chef server, this kind of workflow seems not to be well supported, since the server stores and manages all cookbooks, so there is the potential to overwrite a cookbook written by another team. How do other projects deal with this type of problem?
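    One common arrangement (a sketch, not an official chef feature; the cookbook names are hypothetical) is for each component cookbook to declare a version-pinned dependency on the ops-owned cookbook, with each team uploading only the cookbooks it owns:

        # build-engine/metadata.rb -- a component team's cookbook depending on
        # the ops-owned "java" cookbook; the pin guards against surprise changes
        name        'build-engine'
        maintainer  'Component Team'
        version     '1.2.0'
        depends     'java', '~> 1.0'

    Chef environments can then constrain cookbook versions per deployment stage, so an unexpected upload surfaces as a version bump rather than silently changing nodes.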

    Read the article

  • Macro to manage sport ranking and calendar?

    - by Ale
    I need to write a macro to manage the ranking and calendar for a curling tournament. The event will follow the Schenkel system: the first match is determined by a general draw; after every team has played one match, it is possible to determine the first ranking; the second match is then determined by the rule 1st vs. 2nd, 3rd vs. 4th, 5th vs. 6th, and so on; after every team has played two matches, it is possible to determine the second ranking; and so on until the end (3 to 5 matches, normally). Another rule is that from the second match onward it is not possible to play against a team I have played before! I was thinking of using MS Excel, but Calc (both LibreOffice/OpenOffice) would also be fine. Thanks in advance.
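    Before committing to spreadsheet code, it may help to pin down the pairing rule itself. A minimal sketch of one reading of that rule (my own, in Python for clarity; a greedy pass that a real Schenkel draw may need to extend with backtracking when a team has no legal opponent left):

        # Pair each still-unpaired team with the highest-ranked team it has
        # not met yet. ranking is a list of team names ordered 1st..last;
        # played is a set of frozensets, one per match already played.
        def next_round(ranking, played):
            pairs, used = [], set()
            for i, team in enumerate(ranking):
                if team in used:
                    continue
                for opp in ranking[i + 1:]:
                    if opp not in used and frozenset((team, opp)) not in played:
                        pairs.append((team, opp))
                        used.update((team, opp))
                        break
            return pairs

        # Example: with ranking A,B,C,D and A-C, B-D already played,
        # next_round returns [("A", "B"), ("C", "D")].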

    Read the article

  • null reference exception in the code

    - by LifeH2O
    I am getting a NullReferenceException error on the line _attr.Append(xmlNode.Attributes["name"]);.

        namespace SMAS
        {
            class Profiles
            {
                private XmlTextReader _profReader;
                private XmlDocument _profDoc;
                private const string Url = "http://localhost/teamprofiles.xml";
                private const string XPath = "/teams/team-profile";

                public XmlNodeList Teams { get; private set; }
                private XmlAttributeCollection _attr;
                public ArrayList Team { get; private set; }

                public void GetTeams()
                {
                    _profReader = new XmlTextReader(Url);
                    _profDoc = new XmlDocument();
                    _profDoc.Load(_profReader);
                    Teams = _profDoc.SelectNodes(XPath);
                    foreach (XmlNode xmlNode in Teams)
                    {
                        _attr.Append(xmlNode.Attributes["name"]);
                    }
                }
            }
        }

    The teamprofiles.xml file looks like:

        <teams>
          <team-profile name="Australia">
            <stats type="Test">
              <span>1877-2010</span>
              <matches>721</matches>
              <won>339</won>
              <lost>186</lost>
              <tied>2</tied>
              <draw>194</draw>
              <percentage>47.01</percentage>
            </stats>
            <stats type="Twenty20">
              <span>2005-2010</span>
              <matches>32</matches>
              <won>18</won>
              <lost>12</lost>
              <tied>1</tied>
              <draw>1</draw>
              <percentage>59.67</percentage>
            </stats>
          </team-profile>
          <team-profile name="Bangladesh">
            <stats type="Test">
              <span>2000-2010</span>
              <matches>66</matches>
              <won>3</won>
              <lost>57</lost>
              <tied>0</tied>
              <draw>6</draw>
              <percentage>4.54</percentage>
            </stats>
          </team-profile>
        </teams>

    I am trying to extract the names of all teams into an ArrayList. Then I'll extract all stats of all teams and display them in my application. Can you please help me with that null reference exception?
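    A possible fix (my sketch, not a posted answer): _attr is declared but never assigned, so the first Append call dereferences null; an XmlAttributeCollection also always belongs to a node and cannot be built up independently. Collecting the attribute values into the class's own list sidesteps both problems:

        // Sketch: gather the team names instead of appending to a null collection.
        public void GetTeams()
        {
            _profDoc = new XmlDocument();
            _profDoc.Load(Url);                // XmlDocument.Load accepts a URL string
            Teams = _profDoc.SelectNodes(XPath);
            Team = new ArrayList();            // reuse the existing ArrayList property
            foreach (XmlNode xmlNode in Teams)
            {
                XmlAttribute name = xmlNode.Attributes["name"];
                if (name != null)              // skip profiles without a name attribute
                    Team.Add(name.Value);
            }
        }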

    Read the article

  • Problem in arranging contents of Class in JAVA

    - by LuckySlevin
    Hi, I have some classes and I'm trying to fill objects of these classes. Here is what I've tried (the question is at the bottom):

        public class Team {
            private String clubName;
            private String preName;
            private ArrayList<String> branches;

            public Team(String clubName, String preName) {
                this.clubName = clubName;
                this.preName = preName;
                branches = new ArrayList<String>();
            }

            public Team() {
                // TODO Auto-generated constructor stub
            }

            public String getClubName() { return clubName; }
            public String getPreName() { return preName; }
            public ArrayList<String> getBranches() { return branches; }
            public void setClubName(String clubName) { this.clubName = clubName; }
            public void setPreName(String preName) { this.preName = preName; }
            public void setBranches(ArrayList<String> branches) { this.branches = branches; }
        }

        public class Branch {
            private ArrayList<Player> players = new ArrayList<Player>();
            String brName;

            public Branch() {}

            public void setBr(String brName) { this.brName = brName; }
            public String getBr() { return brName; }
            public ArrayList<Player> getPlayers() { return players; }
            public void setPlayers(ArrayList<Player> players) { this.players = players; }
        }

        // TEST CLASS
        public class test {
            public static void main(String[] args) throws IOException {
                String a, b, c;
                String q = "q";
                int brCount = 0, tCount = 0;
                BufferedReader input = new BufferedReader(new InputStreamReader(System.in));
                Team[] teams = new Team[30];
                Branch[] myBranch = new Branch[30];
                for (int z = 0; z < 30; z++) {
                    teams[z] = new Team();
                    myBranch[z] = new Branch();
                }
                ArrayList<String> tmp = new ArrayList<String>();
                int k = 0;
                int secim = Integer.parseInt(input.readLine());
                while (secim != 0) {
                    if (k != 0) secim = Integer.parseInt(input.readLine());
                    k++;
                    switch (secim) {
                        case 1:
                            brCount = 0;
                            a = input.readLine();
                            teams[tCount].setClubName(a);
                            b = input.readLine();
                            teams[tCount].setPreName(b);
                            c = input.readLine();
                            while (c.equals(q) == false) {
                                if (brCount != 0) { c = input.readLine(); }
                                if (c.equals(q) == false) {
                                    myBranch[brCount].brName = c;
                                    tmp.add(myBranch[brCount].brName);
                                    brCount++;
                                }
                                System.out.println(brCount);
                            }
                            teams[tCount].setBranches(tmp);
                            for (int i = 0; i <= tCount; i++) {
                                System.out.print("a :" + teams[i].getClubName() + " " + teams[i].getPreName() + " ");
                                System.out.println(teams[i].getBranches());
                            }
                            tCount++;
                            break;
                        case 2:
                            String src = input.readLine(); // LATERRRRRRRr
                    }
                }
            }
        }

    The problem is with one of my class elements: I have an ArrayList as an element of a class. When I enter AAA as preName, BBB as clubName, and c, d, e as branches, then as a second element www as preName, GGG as clubName, and a, b as branches, the result comes out like:

        AAA BBB c,d,e,a,b
        GGG www c,d,e,a,b

    which means the ArrayList part of the class keeps accumulating entries. I tried to use the clear() method but it caused problems. Any ideas?
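    The likely culprit (my reading, not a posted answer): every Team is handed the same tmp ArrayList, so all teams share one branch list, and each newly added branch appears in every previously stored team; calling clear() on tmp then also empties the stored teams' lists. Giving each team its own copy keeps them independent:

        // Inside case 1, replace teams[tCount].setBranches(tmp); with:
        teams[tCount].setBranches(new ArrayList<String>(tmp)); // store an independent copy
        tmp.clear(); // safe now: clearing the working list no longer touches stored teams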

    Read the article

  • Windows 8 / IIS 8 Concurrent Requests Limit

    - by OWScott
    IIS 8 on Windows Server 2012 doesn’t have any fixed concurrent request limit, apart from whatever limit is reached when resources are maxed out. However, the client version of IIS 8, which ships with Windows 8, does have a concurrent connection request limitation, to discourage high-traffic production use on a client edition of Windows. Starting with IIS 7 (Windows Vista), the behavior changed from previous versions.  In previous client versions of IIS, excess requests would throw a 403.9 error message (Access Forbidden: Too many users are connected.).  Instead, Windows Vista, 7 and 8 queue excess requests so that they are handled gracefully, although there is a maximum number of requests that will be processed simultaneously. Thomas Deml provided a concurrent request chart for Windows Vista many years ago, but I have been unable to find an equivalent chart for Windows 8, so I asked Wade Hilmo from the IIS team what the limits are.  Since this is controlled not by the IIS team itself but rather by the Windows licensing team, he asked around and found the authoritative answer, which I provide below.

    Windows 8 – IIS 8 Concurrent Requests Limit
    Windows 8: 3
    Windows 8 Professional: 10
    Windows RT: N/A, since IIS does not run on Windows RT

    Windows 7 – IIS 7.5 Concurrent Requests Limit
    Windows 7 Home Starter: 1
    Windows 7 Basic: 1
    Windows 7 Premium: 3
    Windows 7 Ultimate, Professional, Enterprise: 10

    Windows Vista – IIS 7 Concurrent Requests Limit
    Windows Vista Home Basic (IIS process activation and HTTP processing only): 3
    Windows Vista Home Premium: 3
    Windows Vista Ultimate, Professional: 10

    Windows Server 2003, Windows Server 2008, Windows Server 2008 R2 and Windows Server 2012 allow an unlimited number of simultaneous requests.

    Read the article

  • Microsoft .NET Web Programming: Web Sites versus Web Applications

    - by SAMIR BHOGAYTA
    In .NET 2.0, Microsoft introduced the Web Site. This was the default way to create a web Project in Visual Studio 2005. In Visual Studio 2008, the Web Application has been restored as the default web Project in Visual Studio/.NET 3.x.

    The Web Site is a file/folder based Project structure. It is designed such that pages are not compiled until they are requested ("on demand"). The advantages to the Web Site are:
    1) It is designed to accommodate non-.NET Applications
    2) Deployment is as simple as copying files to the target server
    3) Any portion of the Web Site can be updated without requiring recompilation of the entire Site.

    The Web Application is a .dll-based Project structure. ASP.NET pages and supporting files are compiled into assemblies that are then deployed to the target server. Advantages of the Web Application are:
    1) Precompiled files do not expose code to an attacker
    2) Precompiled files run faster because they are binary data (the Microsoft Intermediate Language, or MSIL) executed by the CLR (Common Language Runtime)
    3) References, assemblies, and other project dependencies are built in to the compiled site and automatically managed. They do not need to be manually deployed and/or registered in the Global Assembly Cache: deployment does this for you.

    If you are planning on using automated build and deployment, such as the Team Foundation Server Team Build engine, you will need to have your code in the form of a Web Application. If you have a Web Site, it will not properly compile as a Web Application would. However, all is not lost: it is possible to work around the issue by adding a Web Deployment Project to your Solution and then: a) configuring the Web Deployment Project to precompile your code; and b) configuring your Team Build definition to use the Web Deployment Project as its source for compilation.

    https://msevents.microsoft.com/cui/WebCastEventDetails.aspx?culture=en-US&EventID=1032380764&CountryCode=US
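    As an aside (the paths here are hypothetical), a file-based Web Site can also be precompiled directly with the aspnet_compiler tool that ships with the .NET Framework, which is essentially what a Web Deployment Project automates:

        rem Precompile the Web Site in C:\Sites\MySource into C:\Sites\MyCompiled
        aspnet_compiler -p "C:\Sites\MySource" -v / "C:\Sites\MyCompiled"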

    Read the article

  • MVP Summit 2011 summary and thoughts: The “I hope I don’t cross a line and lose my MVP status” post

    - by George Clingerman
    I've been wanting to write this post summarizing my thoughts about the MVP summit but have been dragging my feet since it's a very difficult one to write. However, seeing Andy (http://forums.create.msdn.com/forums/t/77625.aspx) and Catalin (http://www.catalinzima.com/2011/03/mvp-summit-2011/) and Chris (http://geekswithblogs.net/cwilliams/archive/2011/03/07/144229.aspx) post about it has encouraged me to finally take the plunge. I'm going to have to write carefully though, because I'm going to be dancing around a ton of NDA mine fields as well as having to walk the tight-rope of not sending the wrong message or having people read too much into what I'm saying. I want to note that most of what I'm about to say is just based on my observations; they're not thoughts that Microsoft has asked me to pass along and they're not things I heard Microsoft say. It's just me sharing what I think after going to the MVP summit.

    Let's start off with a short imaginary question and answer session.
    Has the App Hub forums and XBLIG management been rather poor by Microsoft? Yes.
    Do I think we're going to see changes to that overnight? No.
    Will it continue to look bad from the outside? Somewhat.

    Confusing, right? Well, that's kind of how things are right now. Lots of confusion. XNA is doing AWESOME. Like, really, really awesome. As a result of that awesomeness, XNA is on three major platforms: Xbox 360, WP7 and PC. This means that internally Microsoft is really excited and invested in the technology. That's fantastic for XNA and really should show you the future the framework has. It's here to stay.

    So why are Xbox LIVE Indie Game developers feeling so much pain? The ironic thing is that the pain is being caused by the success of XNA. When XNA was just a small thing, there was more freedom and more focus. It was just us and them. We were an only child. Now our family has grown and everyone has and wants some time with XNA. This gets XNA pulled in all directions, and as it moves onto new platforms it plays catch-up trying to get those platforms up to speed to where Xbox LIVE Indie Games has grown. Forums, documentation, educational content: they all need to be there, because Xbox LIVE Indie Games has all of that and more.

    Along with the catch-up in features/documentation/awesomeness, there's the catch-up that the people on the team have to play. New platforms and new areas of development mean new players, and those new guys don't have the history of being around from the beginning. This leads to a lack of understanding at times of just how important some things are, because they seem so small and insignificant (Rich Text defaulting for new forum profiles would be one thing that jumps to mind). If you're not aware that the forums have become more than just a basic Q&A, if you're not aware that they're a central hub to a very active community, then you don't understand why that small change should be prioritized over something else. New people have to get caught up and figure out how to make a framework and central forum site work for everyone it's now serving.

    So yeah, a lot of our pain this last year has been simply that XNA is doing well and XBLIG is doing well, so the focus was shifted to catch other things up. It hurts when a parent seems to not have any time for you and they're spending so much time with your new baby brother. Growing pains. All families, and in our case our product family, experience them to some degree.
    I think as WP7 matures we'll see the team figuring out how to give everyone the right amount of attention. While we're talking about some of our growing pains, it is also important to note (although not really an excuse) that the Xbox LIVE Arcade developers complain about many of the same things that we do. If you paid attention to talks and information coming out of GDC 2011, most of the XBLA guys were saying things that sounded eerily similar to what the XBLIG developers are saying (Scott Nichols from GayGamer.net noticed: http://twitter.com/#!/NaviFairyGG/status/43540379206811650). Does this mean we should just accept the status quo since we're being treated exactly the same? No way. However, it DOES show that the way we're being treated is no indication of the stability and future of the platform; it's just Microsoft dropping the communication ball on two playing fields. We're not alone and we're not even being treated worse. Not great, but also in a weird way a very good sign.

    Now on to a few tidbits I think I CAN share from the summit (I'm really crossing my fingers I'm not stepping over some NDA line I shouldn't be).

    First, I discovered that the XBLIG user base is bigger than I personally had originally estimated. I won't give the exact numbers (although we did beg Microsoft to release some of these numbers, so maybe someday?) but it was much larger than my original guesstimates and I was pleasantly surprised. Maybe some of you guys had the right number when you were guessing, but I know that mine was much too low. And even MORE importantly, the number of users/shoppers is growing at a steady pace as well. Our market is growing! That was fantastic news and really something that I had to share.

    On to the community manager discussion. It was mentioned. I was mentioned. I blushed. Nothing more to report there than the blush in my cheeks was a light crimson color. If I ever see a job description posted for that position I have a resume waiting in the wings. I can't deny that I think that would be my dream job...

    ...so after I finished blushing, the MVPs did make it very, very clear that the communication has to improve. Community manager or not, the single biggest pain point with the Xbox LIVE Indie Game community has been a lack of communication. I have seen dramatic improvement in the team responding to MVPs, and I'm even seeing more communication from them on the forums, so I'm hoping that's a long-term change. I really think they understood the issue; the problem remains how to open that communication channel in a way that is sustainable. I think they'll get it figured out, and hopefully that's sooner rather than later.

    During the summit, you may have seen me tweeting about how I was "that guy" (http://twitter.com/#!/clingermangw/status/42740432471470081). You also may have noticed that Andy and Catalin both mentioned me in their summit write-ups. I may have come on a bit strong while I was there... went a little out of character for myself. I've been agitated for a while with the way things have been, and I've been listening to you guys and hearing that you're agitated too. I'm also watching some really awesome indie game developers looking elsewhere and leaving the platform. Some of them we might not have been able to keep even with changes, but others are only leaving because of perceptions and a lack of communication from Microsoft. And that pisses me off. And I let Microsoft know that I was pissed off. You made your list and I took that list and verbalized it.
    I verbalized the hell out of it. [It was actually mentioned that I'm a lot nicer on the forums and in email than I am in person... I felt bad about that, but I couldn't stay silent.] Hopefully it did something, guys; I really did try hard to get the message across.

    Along with my agitation, I also brought some pride. I mentioned several things in person to the team that I was particularly proud of: people in the community who are doing an awesome job, the re-launch of XboxIndies that was going on that week, and even gamers like Steven Hurdle (http://writingsofmassdeduction.com/) who have purchased one XBLIG every day for over 100 days now. The community is freaking rocking it and I made sure to highlight that.

    So in conclusion, I'd just like to say hang in there (you know, like that picture of the cat). If you've been worried about investing in Xbox LIVE Indie Games because you think it's on shaky ground: it's not. Dream Build Play being about the Xbox 360 should have helped a little to point that out. The team is really scrambling around trying to figure things out and make improvements all around. There are quite a few new gals and guys, and it's going to take them time to catch up, and there are a lot of constantly shifting priorities. We all have one toy, one team, and we're fighting for time with it.

    It's also time for the community to continue spreading our wings and going out on our own more often. The Indie Game Winter Uprising was a fantastic example of that. We took things into our own hands and it got noticed, and Microsoft got behind it. They do every time we stand up and do something (look at how many Microsoft employees tweeted or wrote about the re-launch of XboxIndies.com, or at the support I've gotten from them for my weekly XNA Notes). XNA is here to stay; it's time for us to stop being scared of that and figure out how to make our own games the successes they should be.

    There's definitely a list of things that need to be fixed, things that should be improved, and I think we should definitely keep vocal about that with Microsoft. Keep it short, focused and prioritized. There's also a lot we can do ourselves while we're waiting on them to fix and change things. Lots of ways we can compensate for particular weaknesses in the channel. The kind of stuff we can step up and do ourselves. Do it on our own, you know, the way Indies always do. And I'm really looking forward to watching us do just that.

    Read the article

  • Should Development / Testing / QA / Staging environments be similar?

    - by Walter White
    Hi all, after much time and effort, we're finally using Maven to manage our application lifecycle for development. We still, unfortunately, use Ant to build an EAR before deploying to Test / QA / Staging. The trouble is that, while we made that leap forward, developers are still free to do as they please when testing their code. One issue we have is that half our team tests on Tomcat and the other half on Jetty. I prefer Jetty slightly over Tomcat, but regardless, we use WAS for all the other environments. My question is: should we develop on the same application server we're deploying to? We've had numerous bugs come up from these differences in environments; Tomcat, Jetty, and WAS are different under the hood. My opinion is that we should all develop on what we're deploying to production with, so we don't have the "well, it worked fine on my machine" problem. While I prefer Jetty, I would just as soon have us all work on the same environment, even if it means deploying to WAS, which is slow and cumbersome. What are your team dynamics like? Our lead developers stepped down from the team and development has been a free-for-all since then. Walter

    Read the article

  • Best of “The Moth” 2013

    - by Daniel Moth
    As previously (2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012), the time has come again to look back over the year's activities on this blog, and as predicted there were 3 themes.

    1. It has been just 15 months since I changed role from what at Microsoft we call an "Individual Contributor" (IC) to a managerial role where ICs report to me. Part of being a manager entails sharing career tips with your team, and some of those I have put up on my blog over the last year (and hope to continue next year): Effectiveness and Efficiency; Lead, Follow, or Get out of the way; and Perfect is the enemy of "Good Enough".

    2. It has also been 15 months since I joined the Visual Studio Diagnostics team, and we have shipped many capabilities in Visual Studio 2013. I helped the members of my team blog about every single one and create videos of many, and then I created a table of contents pointing to all of their blog posts, so if you are interested in what I have been working on over the last year please follow the links from the master blog post here: Visual Studio 2013 Diagnostics Investments. We are busy working on future Visual Studio releases/updates and I will link to those when we are ready…

    3. Finally, I used some of my free time (which is becoming ever so scarce) to do some device development, and as part of that I shared a few thoughts and code: Debug.Assert replacement for Phone and Store apps, asynchrony is viral, and MyMessageBox for Phone and Store apps.

    To see what 2014 will bring to this blog, please subscribe using the link on the left… Happy New Year! Comments about this post by Daniel Moth welcome at the original blog.

    Read the article

  • Kids don’t mark their own homework

    - by jamiet
    During a discussion at work today regarding some thorough acceptance testing of the system I currently work on, the topic of who should actually do the testing came up. I remarked that I didn't think that I, as the developer, should be doing acceptance testing, and a colleague, Russ Taylor, agreed with me and then came out with this little pearler:

    Kids don't mark their own homework

    Maybe it's a common turn of phrase, but I had never heard it before and, to me, it sums up very succinctly my feelings on the matter. I tweeted about it and it got a couple of retweets as well as a slightly different perspective from Bruce Durling, who said: "I'm of the opinion that testers should be in the dev team & the dev *team* should be responsible for quality." Bruce makes a good point that testers should be considered part of the dev team. I agree wholly with that and don't think that point of view necessarily conflicts with Russ's analogy. Yes, developers should absolutely be responsible for testing their own work; I also think that in the murky world of data integration there is often a need for a third party to validate that work. Improving testing mechanisms for data integration projects is something that is near and dear to my heart, so I would welcome any other thoughts around this. Let me know if you have any in the comments! @Jamiet

    Read the article

  • Resources for the “What’s New in VS 2013” Presentation

    - by John Alexander
    Originally posted on: http://geekswithblogs.net/jalexander/archive/2013/10/24/resources-for-the-ldquowhatrsquos-new-in-vs-2013rdquo-presentation.aspx

    Thanks for attending the "What's New in Visual Studio 2013 (and TFS too)" presentation. As promised, here are some links! Note: if you didn't attend, it's OK. This is for you, too.

    The bits themselves.
    This article introduces new and enhanced features in Visual Studio 2013.
    Visual Studio Virtual Launch – lots of videos here, and then on November 13th, live sessions and a Q&A session…
    What features map to what Visual Studio editions.
    Visual Studio 2013 New Editor Features.
    Visual Studio 2013 Application Lifecycle Management Virtual Machine and Hands-on-Labs / Demo Scripts from Brian Keller.
    More on CodeLens from Zain Naboulsi.
    What are Web Essentials? You can now download Web Essentials for Visual Studio 2013 RTM.
    A great overview of TFS 2013 from Brian Harry.
    The release archive lists updates made to Team Foundation Service along with which version of Team Foundation Server the updates are a part of.
    REST API for Team Rooms.
    "What's new in Visual Studio for Web Developers and Front End Devs" screencasts – quick, easy and painless, from the always awesome Scott Hanselman.
    Introducing ASP.NET Identity – a membership system for ASP.NET applications.
    Visual Studio 2013 Adds New Project Templates with Improvements and Social Accounts Authentication.

    Read the article
