Search Results

Search found 11268 results on 451 pages for 'shweta simply'.


  • What real life bad habits has programming given you?

    - by Jacob T. Nielsen
    Programming has given me a lot of bad habits, and it continues to give me more every day. But I have also picked up some bad habits from the mindset I have put myself in. Some things are simply deeply rooted in my nature now, though I wish I could get rid of a few of them. For example: looking for polymorphism, inheritance and patterns in all of God's creations; explaining the size of something in pixels and colors in hex code; using code-related abstract terms in everyday conversations. How have you been damaged?

    Read the article

  • Ubuntu 12.10 is reading mouse battery as a laptop battery

    - by Ross Fleming
    I have a Bluetooth mouse connected to my laptop. Much to my delight (I posted the suggestion!) the mouse's battery is now displayed alongside the laptop's battery. Unfortunately it is named simply "battery" in the drop-down and "Laptop battery" once the Power Statistics window is opened. The details of this battery correctly say that the model is a Lenovo Bluetooth Mouse, but it is still displayed as a "battery" and "laptop battery". Is there any way to change this? Edit: also, "Show time in menu bar" shows the mouse's remaining battery percentage rather than the laptop battery's remaining time.

    Read the article

  • SQL Server 2012 content on Channel 9

    - by jamiet
    A mountain of SQL Server 2012 video content featuring Greg Low, Jonathan Kehayias, Joe Sack and Roger Doherty has just been released on Channel 9. Channel 9 has great support for tags and RSS feeds, so if you want to download all of that content automatically, simply add the following RSS feed to your podcast reader of choice: http://channel9.msdn.com/Tags/sql+server+2012/RSS and have fun learning about all the new features in SQL Server 2012, such as: AlwaysOn, Power View, SSDT, SSRS Data Alerts, SSAS Tabular Modelling, DAX improvements, MDS improvements, SSIS improvements, DQS, StreamInsight improvements, Data-Tier Apps (DACs), LocalDB, FileTable, spatial improvements, T-SQL paging, Distributed Replay, XEvents improvements, ADO.NET Code First, T-SQL improvements, server roles, partitioning improvements, ColumnStore. Whew, quite a list! @jamiet

    Read the article

  • Script to setup Ubuntu as a wireless access point on a bridge mode

    - by nixnotwin
    I use the following script to make my netbook a full-fledged wireless access point. It creates a bridge with eth0 and wlan0 and starts hostapd.

    #!/bin/bash
    service network-manager stop
    ifconfig eth0 0.0.0.0   #remove IP from eth0
    ifconfig eth0 up        #ensure the interface is up
    ifconfig wlan0 0.0.0.0  #remove IP from wlan0
    ifconfig wlan0 up       #ensure the interface is up
    brctl addbr br0         #create br0 node
    hostapd -d /etc/hostapd/hostapd.conf > /var/log/hostapd.log &
    sleep 5
    brctl addif br0 eth0    #add eth0 to bridge br0
    brctl addif br0 wlan0   #add wlan0 to bridge br0
    ifconfig br0 192.168.1.15 netmask 255.255.255.0  #ip for bridge
    ifconfig br0 up         #bring up interface
    route add default gw 192.168.1.1  #gateway

    This script works well, but if I want to revert to using Network Manager, I cannot: the bridge simply cannot be deleted. How can I modify this script so that if I run bridge_script --stop, the bridge gets deleted, Network Manager starts, and the interfaces behave as if the machine had a fresh reboot?
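
    One possible teardown, sketched here under the assumption that br0 and hostapd were set up by the script above (untested; interface names and paths may differ on your machine):

    #!/bin/bash
    # bridge_script --stop: undo what the setup script did, then hand
    # the interfaces back to Network Manager.
    kill $(pidof hostapd)          # stop the access point daemon
    ifconfig br0 down              # the bridge must be down before deletion
    brctl delif br0 wlan0          # detach wlan0 from the bridge
    brctl delif br0 eth0           # detach eth0 from the bridge
    brctl delbr br0                # now the bridge can be deleted
    ifconfig eth0 0.0.0.0          # clear leftover addressing
    ifconfig wlan0 0.0.0.0
    service network-manager start  # let NM reconfigure the interfaces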

    Read the article

  • How to Automatically Backup Your Gmail Attachments With IFTTT

    - by Mark Wilson
    When it comes to getting things done quickly, automation is the name of the game. We’ve looked at IFTTT before, and a new batch of updates has introduced a number of options that can be used to automatically do things with files that are sent to your Gmail address. What could this be used for? Well, the most obvious starting point is to simply create a backup of any files that you receive via email. This is useful if you often reach the size limit for your inbox, as it enables you to delete emails without having to worry about losing the associated files. Start by paying a visit to the IFTTT website and then either sign in to an existing account or create a new one.

    Read the article

  • What do DBAs do?

    - by Jonathan Conway
    Yes, I know they administer databases. I asked this question because I'd like further insight into the kind of day-to-day duties a DBA might perform, and the real-world business problems they solve. For example: I optimized a 'products' query so that it ran 25% faster, which made the overall application faster. Is this a typical duty? Or is there more to being a DBA than simply making things faster? In what situations does DBA work involve planning and creativity?

    Read the article

  • How can architects work with self-organizing Scrum teams?

    - by Martin Wickman
    An organization with a number of agile Scrum teams also has a small group of people appointed as "enterprise architects". The EA group acts as control and gatekeeper for quality and adherence to decisions, which leads to overlaps between team decisions and EA decisions. For instance, the team might want to use library X, or to use REST instead of SOAP, but the EA does not approve. This can lead to frustration when team decisions are overruled. Taken far enough, it can potentially lead to a situation where the EA people "grab" all power and the team ends up feeling demotivated and not very agile at all. The Scrum Guide has this to say about it: "Self-organizing: No one (not even the Scrum Master) tells the Development Team how to turn Product Backlog into Increments of potentially releasable functionality." Is that reasonable? Should the EA team be disbanded? Should the teams refuse, or simply comply?

    Read the article

  • Domain forwarding with url substitution in the address bar

    - by Mario Duarte
    Hello, I have a blog served by a machine I have at home. Since the IP can change, I set up a dyndns domain to always point to that machine. However, I purchased a friendlier domain (at godaddy.com) and I would like to forward it to that blog. The problem is that if I simply forward it, users will see the dyndns domain in the address bar and could potentially bookmark those URLs, and that's a problem. I noticed that godaddy.com has domain masking, and although it does hide the dyndns domain in the address bar, it also keeps the same root address in the address bar even if I navigate to another page. I also have the feeling that search engines will not like this domain masking thing. Does anyone know how I can accomplish what I want?
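
    For what it's worth, the usual fix is DNS rather than forwarding: point the new domain at the dyndns host and let the web server answer for it directly, so the friendly name stays in the address bar. A minimal sketch, assuming Apache and illustrative names (not from the original question):

    # DNS at the registrar: www.myniceblog.com.  CNAME  myblog.dyndns.org.

    # Apache on the home machine: serve the friendly domain directly.
    <VirtualHost *:80>
        ServerName www.myniceblog.com
        ServerAlias myblog.dyndns.org
        DocumentRoot /var/www/blog
        # Optionally 301-redirect old dyndns bookmarks to the new name:
        # RewriteEngine On
        # RewriteCond %{HTTP_HOST} ^myblog\.dyndns\.org$ [NC]
        # RewriteRule ^(.*)$ http://www.myniceblog.com$1 [R=301,L]
    </VirtualHost>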

    Read the article

  • Farseer Physics Engine and the Ms-PL License

    - by Stephen Tierney
    Am I able to produce code for a game which uses the Farseer engine and release my code under an open source license other than the Ms-PL? My concern is with the following section from the license: "If you distribute any portion of the software in source code form, you may do so only under this license by including a complete copy of this license with your distribution. If you distribute any portion of the software in compiled or object code form, you may only do so under a license that complies with this license." If I do not include Farseer in my source code distribution, does this give me an exemption from this clause, since I am not distributing the software? My code merely uses its functions. Nowhere in the license does it force you to provide source code for derivative or linking works; it simply gives you the option of "if you distribute".

    Read the article

  • How to restore your production database without needing additional storage

    - by David Atkinson
    Production databases can get very large. This in itself is to be expected, but when a copy of the database is needed the database must be restored, requiring additional and costly storage. For example, if you want to give each developer a full copy of your production server, you'll need n times the storage cost for your n-developer team. The same is true for any test databases that are created during the course of your project lifecycle.

    If you've read my previous blog posts, you'll be aware that I've been focusing on the database continuous integration theme. In my CI setup I create a "production"-equivalent database directly from its source control representation, and use this to test my upgrade scripts. Despite this being a perfectly valid and practical thing to do as part of a CI setup, it's not the exact equivalent to running the upgrade script on a copy of the actual production database. So why shouldn't I instead simply restore the most recent production backup as part of my CI process? There are two reasons why this would be impractical.

    1. My CI environment isn't an exact copy of my production environment. Indeed, this would be the case in a perfect world, and it is strongly recommended as a good practice if you follow Jez Humble and David Farley's "Continuous Delivery" teachings, but in practical terms this might not always be possible, especially where storage is concerned. It may just not be possible to restore a huge production database on the environment you've been allotted.

    2. It's not just about the storage requirements, it's also the time it takes to do the restore. The whole point of continuous integration is that you are alerted as early as possible whether the build (yes, the database upgrade script counts!) is broken. If I have to run an hour-long restore each time I commit a change to source control, I'm just not going to get the feedback quickly enough to react.

    So what's the solution? Red Gate has a technology, SQL Virtual Restore, that is able to restore a database without using up additional storage. Although this sounds too good to be true, the explanation is quite simple (although I'm sure the technical implementation details under the hood are quite complex!). Instead of restoring the backup in the conventional sense, SQL Virtual Restore will effectively mount the backup using its HyperBac technology. It creates a data file and a log file, .vmdf and .vldf, that become the delta between the .bak file and the virtual database. This means that both read and write operations are permitted on a virtual database, as from SQL Server's point of view it is no different from a conventional database. Instead of doubling the storage requirements upon a restore, there are no 'duplicate' storage requirements, other than the trivially small virtual log and data files (see illustration below). The benefit is magnified the more databases you mount to the same backup file. This technique could be used to provide a large development team a full development instance of a large production database. It is also incredibly easy to set up. Once SQL Virtual Restore is installed, you simply run a conventional RESTORE command to create the virtual database. This is what I have running as part of a nightly "release test" process triggered by my CI tool.
    RESTORE DATABASE WidgetProduction_virtual
    FROM DISK=N'C:\WidgetWF\ProdBackup\WidgetProduction.bak'
    WITH MOVE N'WidgetProduction' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_WidgetProduction_Virtual.vmdf',
    MOVE N'WidgetProduction_log' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_log_WidgetProduction_Virtual.vldf',
    NORECOVERY, STATS=1, REPLACE
    GO
    RESTORE DATABASE mydatabase WITH RECOVERY
    GO

    Note the only change from what you would do normally is the naming of the .vmdf and .vldf files. SQL Virtual Restore intercepts this by monitoring the extension and applies its magic, ensuring the 'virtual' restore happens rather than the conventional storage-heavy restore. My automated release test then applies the upgrade scripts to the virtual production database and runs some validation tests, giving me confidence that were I to run this on production for real, all would go smoothly.

    [Screenshots in the original post: the 8GB production database, its corresponding backup file, and the small .vldf and .vmdf files, which represent the only additional storage used by the new database following the virtual restore.]

    The beauty of this product is its simplicity. Once it is installed, the interaction with the backup and virtual database is exactly the same as before, as the clever stuff is being done at a lower level. SQL Virtual Restore can be downloaded as a fully functional 14-day trial. Technorati Tags: SQL Server

    Read the article

  • Stay Tuned for Relaunch

    - by Shyam Bajaj
    In the coming days, the Oracle Health Sciences team will be relaunching Health Sciences Connect. Stay tuned! In the meantime, interact with us directly via:

    Twitter: Follow and converse with Oracle Health Sciences leaders; simply tweet to us by adding @OracleHealthSci before your question or comment.
    Facebook: Stay in the know with industry thought leadership pieces from Oracle Health Sciences.
    YouTube: Watch interviews with heads of Oracle Health Sciences and industry leaders.
    RSS Feed: Subscribe to us from your browser or RSS reader for industry and company updates.

    For updated Oracle Health Sciences product and organization information, please visit us at www.oracle.com/healthsciences.

    Read the article

  • Yet another blog about IValueConverter

    - by codingbloke
    After my previous blog on a Generic Boolean Value Converter, I thought I might as well blog up another IValueConverter implementation that I use. The Generic Boolean Value Converter effectively converts an input which only has two possible values to one of two corresponding objects. The next logical step would be to create a similar converter that can take an input which has multiple (but finite and discrete) values to one of multiple corresponding objects. To put it more simply, a Generic Enum Value Converter.

    Now we already have a tool that can help us in this area, the ResourceDictionary. A simple IValueConverter implementation around it would create a StringToObjectConverter like so:

    StringToObjectConverter

    using System;
    using System.Windows;
    using System.Windows.Data;
    using System.Linq;
    using System.Windows.Markup;

    namespace SilverlightApplication1
    {
        [ContentProperty("Items")]
        public class StringToObjectConverter : IValueConverter
        {
            public ResourceDictionary Items { get; set; }
            public string DefaultKey { get; set; }

            public StringToObjectConverter()
            {
                DefaultKey = "__default__";
            }

            public virtual object Convert(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
            {
                if (value != null && Items.Contains(value.ToString()))
                    return Items[value.ToString()];
                else
                    return Items[DefaultKey];
            }

            public virtual object ConvertBack(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
            {
                return Items.FirstOrDefault(kvp => value.Equals(kvp.Value)).Key;
            }
        }
    }

    There are some things to note here. The bulk of managing the relationship between an object instance and the related string key is handled by the Items property being a ResourceDictionary. Also, there is a catch-all "__default__" key value which allows only a subset of the possible input values to be mapped to objects, with the rest falling through to the default.

    We can then set one of these up in Xaml:

    <local:StringToObjectConverter x:Key="StatusToBrush">
        <ResourceDictionary>
            <SolidColorBrush Color="Red" x:Key="Overdue" />
            <SolidColorBrush Color="Orange" x:Key="Urgent" />
            <SolidColorBrush Color="Silver" x:Key="__default__" />
        </ResourceDictionary>
    </local:StringToObjectConverter>

    You could well imagine that in the model being bound these key names would actually be members of an enum. This still works due to the use of ToString in the Convert method. Hence the only requirement for the incoming object is that it has a ToString implementation which generates a sensible string instead of simply the type name.

    I can't imagine right now a scenario where this converter would be used in a TwoWay binding, but there is no reason why it can't be. I prefer to avoid leaving the ConvertBack throwing an exception if that can be avoided. Hence it just enumerates the KeyValuePair entries to find a value that matches and returns the key it is mapped to.

    Ah, but now my sense of balance is assaulted again. Whilst StringToObjectConverter is quite happy to accept an enum type via the Convert method, it returns a string from the ConvertBack method, not the original input enum type that arrived in the Convert. Now I could address this by complicating the ConvertBack method and examining the targetType parameter etc. However, I prefer a different approach: deriving a new EnumToObjectConverter class instead.

    EnumToObjectConverter

    using System;

    namespace SilverlightApplication1
    {
        public class EnumToObjectConverter : StringToObjectConverter
        {
            public override object Convert(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
            {
                string key = Enum.GetName(value.GetType(), value);
                return base.Convert(key, targetType, parameter, culture);
            }

            public override object ConvertBack(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
            {
                string key = (string)base.ConvertBack(value, typeof(String), parameter, culture);
                return Enum.Parse(targetType, key, false);
            }
        }
    }

    This is a more belts-and-braces solution with specific use of Enum.GetName and Enum.Parse. Whilst it is more explicit, in that a developer has to choose to use it, it is only really necessary when using TwoWay binding; in OneWay binding the base StringToObjectConverter would serve just as well. The observant might note that there is actually no "Generic" aspect to this solution in the end. The use of a ResourceDictionary eliminates the need for that.

    Read the article

  • How should I structure my urls for both SEO and localization?

    - by artlung
    When I set up a site in multiple languages, how should I set up my URLs for search engines and usability? Let's say my site is www.example.com, and I'm translating into French and Spanish. What is best for usability and SEO?

    Directory option:
    http://www.example.com/sample.html
    http://www.example.com/fr/sample.html
    http://www.example.com/es/sample.html

    Subdomain option:
    http://www.example.com/sample.html
    http://fr.example.com/sample.html
    http://es.example.com/sample.html

    Filename option:
    http://www.example.com/sample.html
    http://www.example.com/sample.fr.html
    http://www.example.com/sample.es.html

    Accept-Language header: Or should I simply parse the Accept-Language header and generate content server-side to suit it?

    Is there another way to do this? If the different language versions don't have different URLs, what do I do about the search engines?
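
    Whichever URL scheme is chosen, search engines can be told that the variants are translations of each other using rel="alternate" hreflang annotations. A sketch using the directory option (URLs are the illustrative ones from the question):

    <!-- in the <head> of each language variant of sample.html -->
    <link rel="alternate" hreflang="en" href="http://www.example.com/sample.html" />
    <link rel="alternate" hreflang="fr" href="http://www.example.com/fr/sample.html" />
    <link rel="alternate" hreflang="es" href="http://www.example.com/es/sample.html" />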

    Read the article

  • Formatting and installing Ubuntu over Windows 7 without admin password

    - by Emiliano Pasqualetti
    I borrowed a friend's workstation with Windows 7 and Ubuntu 10.10. I do not have the administrator password and I would like to format everything and install Ubuntu 12.04. I have put the ISO on a USB stick using PendriveLinux and restarted the machine. At boot I am presented with several options (the GRUB menu), but the Ubuntu ones simply load my friend's Ubuntu 10.10 OS (for which I have no password). Edit: I fixed this by pressing F12 at boot, before GRUB appeared. I selected USB and was finally able to install.

    Read the article

  • Error MSB4019: The imported project "C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk

    - by Tim Huffam
    This error occurred on our TFS2008 build server, which we had upgraded to cater for VS2010 projects (by installing VS2010 on the build server; see this article). Error MSB4019: The imported project "C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk. However, although we had installed VS2010 on the build server, we had not installed the web development components (Visual Web Developer); this is what caused the error. To fix it, simply add the web development components:

    1. Go into Control Panel > Add or Remove Programs.
    2. Select Microsoft Visual Studio 2010, and click on Change/Remove.
    3. In the VS Maintenance Mode screens, select Add or Remove Features.
    4. In the Setup - Options page, make sure 'Visual Web Developer' is checked.
    5. Click on Update.

    You shouldn't need to restart your build service. HTH Tim
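
    For reference, the failing line is the standard web-application import near the bottom of the project's .csproj file, something like this (the exact property used for the path can vary between projects):

    <Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" />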

    Read the article

  • How can I get my dynamic site search results content indexed by Google?

    - by Kris
    I have a site that is simply a search box for a cloud-hosted database of .tiff images, and all of my content can only be accessed by entering a search term. So for example, you're on the home page www.example.com, you type "search" into the box and hit submit. It takes you to www.example.com/?q=search, which is a page of all my .tiff images with "search" in the description. How can I get a page like www.example.com/?q=search indexed, WITHOUT making a humongous list of search terms that people might type in? I know about mod_rewrite, but it seems that for that you need to know ahead of time which URLs you'll need to convert, which I don't. All of these pages will be dynamically generated by users typing into the search field. Please help!
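
    One hedged suggestion: rather than guessing what people might type, generate a sitemap from data the site already has, such as the distinct keywords in the image descriptions, and submit it to Google. A sketch of the resulting sitemap.xml (illustrative URLs, following the sitemaps.org schema):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://www.example.com/?q=search</loc></url>
      <url><loc>http://www.example.com/?q=landscape</loc></url>
      <!-- one entry per distinct keyword pulled from the image descriptions -->
    </urlset>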

    Read the article

  • How to stop Thunar from being the default file browser

    - by Charles Kane
    On a relatively new 11.04 installation, Thunar has become the default file browser simply through my having used it! Whilst I can open Nautilus easily enough, I'd rather it remained the default, especially when I choose to view files in dual pane. The only action I can pinpoint that might have handed my files and folders over to Thunar is making Nautilus into Nautilus-Elementary. (Oh, and Unity carked it, so I reverted rather unwillingly to Classic. I'm glad I did: it is so much more stable, and this is my production machine; Unity acts as if it is an early alpha as far as I can tell!)

    Read the article

  • EU Digital Agenda scores 85/100

    - by trond-arne.undheim
    If the Digital Agenda were a bottle of wine and I were wine critic Robert Parker, I would say the Digital Agenda has "a great bouquet, many good elements, with astringent, dry and puckering mouth feel that will not please everyone, but still displaying some finesse. A somewhat controlled effort with no surprises and a few noticeable flaws in the delivery. Noticeably shorter aftertaste than advertised by the producers. Score: 85/100. Enjoy now".

    The EU Digital Agenda states that "standards are vital for interoperability" and has a whole chapter on interoperability and standards. With this strong emphasis, there is hope the EU's outdated standardization system is finally headed for reform. It has been 23 years since the legal framework of standardisation was completed by Council Decision 87/95/EEC in the Information and Communications Technology (ICT) sector.

    Standardization is market driven. For several decades the IT industry has been developing standards and specifications in global open standards development organisations (fora/consortia), many of which have transparency procedures and practices far superior to the European Standards Organizations. The Digital Agenda rightly states: "reflecting the rise and growing importance of ICT standards developed by certain global fora and consortia". Some fora/consortia, of course, are distorted, influenced by single vendors, or have a poor track record, and need constant vigilance, but they are the minority. Therefore, the recognition needs to be accompanied by eligibility criteria focused on openness.

    Will the EU reform its ICT standardization by the end of 2010? Possibly, and only if DG Enterprise takes on board that Information and Communications Technologies (ICTs) have driven half of the productivity growth in Europe over the past 15 years, a prominent fact in the EU's excellent Digital Competitiveness report 2010 published on Monday 17 May. It is OK to single out the ICT sector: it simply is the most important sector right now, as it fuels growth in all other sectors. Let's not wait for the entire standardization package, which may take another few years. Europe does not have time.

    The Digital Agenda is an umbrella strategy with deliveries from a host of actors across the Commission. For instance, the EU promises to issue "guidance on transparent ex-ante disclosure rules for essential intellectual property rights and licensing terms and conditions in the context of standard setting" by 2011, in the Horizontal Guidelines now out for public consultation by DG COMP, and to some extent in DG ENTR's standardization policy reform. This is important. The EU will issue procurement guidance as interoperability frameworks are put into practice. This is a joint responsibility of several DGs, and is likely to suffer coordination problems, controversy and delays. We have seen plenty of the latter already, and I have commented on the Commission's own interoperability elsewhere, with mixed luck. :(

    Yesterday, I watched the cartoonesque Korean western film The Good, the Bad and the Weird. In the movie (and I mean in the movie only), a bandit, a thief, and a bounty hunter, all excellent at whatever they do, fight for a treasure map. Whether that is a good analogy for the situation within the Commission, others are better judges of than I. However, as a movie fanatic, I still await the final shoot-out, and, as in the film, the only certainty is that "life is about chasing and being chased".
    The missed opportunity (in this case, not following up the push from Member States to better define open standards based interoperability) is a casualty of the chaos ensuing in the European Wild West (and I mean that in the most endearing sense, and my excuses beforehand to actors who possibly justifiably cannot bear being compared to fictional movie characters). Instead of exposing the ongoing fight, the EU opted for the legalistic use of the term "standards" throughout the document. This is a term that, to the EU, excludes most standards used by the IT industry worldwide. So, while it, for a moment, meant "weapon down", it will not lead to lasting peace.

    The Digital Agenda calls for the Member States to "Implement commitments on interoperability and standards in the Malmö and Granada Declarations by 2013". This is a far cry from the actual Ministerial Declarations, which called upon the Commission to help them with this implementation by recognizing and further defining open standards based interoperability. Unless there is more forthcoming from the Commission, the market's judgement will be: you simply fall short.

    Generally, I think the EU focus now should be "from policy to practice", and the Digital Agenda does indeed stop short of tackling some highly practical issues. There is need for progress beyond the Digital Agenda. Here are some suggestions that would help Europe re-take global leadership on openness, public sector reform, and economic growth:

    A strong European software strategy centred around open standards based interoperability by 2011.
    An ambitious new eCommission strategy for 2011-15 focused on migration to open standards by 2015.
    Aligning the IT portfolio across the Commission into one Digital Agenda DG by 2012.
    Focusing all best practice exchange in eGovernment on one social networking site, epractice.eu (full disclosure: I had a role in getting that site up and running).
    Prioritizing public sector needs in global standardization over European standardization by 2014.

    Read the article

  • Trace File Source Adapter

    The Trace File Source adapter is a useful addition to your SSIS toolbox. It allows you to read 2005 and 2008 Profiler traces stored as .trc files into the Data Flow. From there you can perform filtering and analysis using the power of SSIS. There is no need for a SQL Server connection; this just uses the trace file.

    Example Usages

    Cache warming for SQL Server Analysis Services
    Reading the flight recorder
    Finding the longest running queries on a server
    Analyzing statements for CPU or memory by user, or some other criteria you choose

    Properties

    The Trace File Source adapter has two properties, which combine to control the source trace file that is read at runtime. SQL Server 2005 and SQL Server 2008 trace files are supported for both the Database Engine (SQL Server) and Analysis Services. The properties are managed by the Editor form or can be set directly from the Properties Grid in Visual Studio.

    AccessMode (Enumeration): Determines how the Filename property is interpreted. The values available are DirectInput and Variable.
    Filename (String): Holds the path of the trace file to load (*.trc). The value is either a full path, or the name of a variable which contains the full path to the trace file, depending on the AccessMode property.

    Trace Column Definition

    Hopefully the majority of you can skip this section entirely, but if you encounter problems processing a trace file this may explain and help you fix them. The component is built upon the trace management API provided by Microsoft. Unfortunately the API methods that expose the schema of a trace file have known issues and are unreliable; put simply, the data often differs from what was specified. To overcome these limitations the component uses some simple XML files, which enable the trace column data types and sizing attributes to be overridden. For example, SQL Server Profiler or TMO generated structures define EventClass as an integer, but the real value is a string.

    TraceDataColumnsSQL.xml - SQL Server Database Engine Trace Columns
    TraceDataColumnsAS.xml - SQL Server Analysis Services Trace Columns

    The files can be found in the %ProgramFiles%\Microsoft SQL Server\100\DTS\PipelineComponents folder, e.g.

    "C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TraceDataColumnsSQL.xml"
    "C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TraceDataColumnsAS.xml"

    If at runtime the component encounters a type conversion or sizing error, it is most likely due to a discrepancy between the column definition as reported by the API and the actual value encountered. Whilst most common issues have already been fixed through these files, we have implemented specific exception traps to direct you to the files, enabling you to fix any further issues due to different usage or data scenarios that we have not tested. An example error that you can fix through these files is shown below.

    Buffer exception writing value to column 'Column Name'. The string value is 999 characters in length, the column is only 111. Columns can be overridden by the TraceDataColumns XML files in "C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TraceDataColumnsAS.xml".

    Installation

    The component is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache, as per Microsoft's recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages.

    Finally you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Trace File Source transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component?

    We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations. Please note that the Microsoft trace classes used in the component are not supported on 64-bit platforms. To use the Trace File Source on a 64-bit host you need to ensure you have the 32-bit (x86) tools available, and that the way you execute your package is set up to use them; please see the help topic 64-bit Considerations for Integration Services for more details.

    Downloads

    Trace Sources for SQL Server 2005
    Trace Sources for SQL Server 2008

    Version History

    SQL Server 2008: Version 2.0.0.382 - SQL Server 2008 public release. (9 Apr 2009)
    SQL Server 2005: Version 1.0.0.321 - SQL Server 2005 public release. (18 Nov 2008)

    Read the article

  • What is the greatest design flaw you have faced in any programming language?

    - by Anto
    All programming languages have design flaws, simply because no single language can be perfect, just as with most (all?) other things. That aside, which design flaw in a programming language has annoyed you the most in your history as a programmer? Note that a language being "bad" just because it isn't designed for a specific thing isn't a design flaw, but a feature of its design, so don't list such annoyances. If a language is ill-suited for what it is designed for, that is of course a flaw in the design. Implementation-specific things and under-the-hood things do not count either.

    Read the article

  • Eager Loading more than 1 table in LinqtoSql

    - by Michael Freidgeim
    When I tried in Linq2Sql to load a table with two child tables, I noticed that multiple SQLs are generated. I found that it is a known issue: if you try to specify more than one relationship to pre-load, it just picks which one to pre-load and which others to leave deferred (simply ignoring those LoadWith hints). There are more explanations in http://codebetter.com/blogs/david.hayden/archive/2007/08/06/linq-to-sql-query-tuning-appears-to-break-down-in-more-advanced-scenarios.aspx: "The reason the relationship in your blog post above is generating multiple queries is that you have two (1:n) relationships (Customers->Orders) and (Orders->OrderDetails). If you just had one (1:n) relationship (Customer->Orders) or (Orders->OrderDetails), LINQ to SQL would optimize and grab it in one query (using a JOIN)." The alternative, using SQL and POCO classes, is described in http://stackoverflow.com/questions/238504/linq-to-sql-loading-child-entities-without-using-dataloadoptions?rq=1. Fortunately the problem is not applicable to Entity Framework, which we want to use in future development instead of Linq2Sql:

    Product firstProduct = db.Product.Include("OrderDetail").Include("Supplier").First();
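
    For comparison, here is the Linq2Sql side of the issue as a minimal sketch, assuming a typical DataContext with Customer->Orders and Order->OrderDetails associations (the context and entity names are illustrative, not from the original post):

    using System;
    using System.Data.Linq;

    class Demo
    {
        static void Main()
        {
            using (var db = new ShopDataContext()) // hypothetical DataContext
            {
                var options = new DataLoadOptions();
                options.LoadWith<Customer>(c => c.Orders);      // first 1:n hint
                options.LoadWith<Order>(o => o.OrderDetails);   // second 1:n hint
                db.LoadOptions = options;

                // With two chained 1:n hints, Linq2Sql cannot satisfy both with
                // a single JOIN, so expect extra queries for the second level.
                foreach (var customer in db.Customers)
                    foreach (var order in customer.Orders)
                        Console.WriteLine(order.OrderDetails.Count);
            }
        }
    }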

    Read the article

  • Rails solution for mobile-specific content filter?

    - by Damien Roche
    To be clear, I'm not interested in simply 'hiding' content for mobile devices; I want to filter that content out completely. I'm also not trying to address the issue by building a mobile-specific interface (mob.example.com). There was another question regarding something similar: How do I prevent useless content load on the page in responsive design? The solution in that post was to set a session during the initial request, and then use the session to filter content on subsequent requests. I primarily develop in Rails, and I'm wondering if there are any gems or Ruby-specific solutions to this problem?
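
    Absent a dedicated gem, a hand-rolled version of the session approach from the linked question is small. A hedged Rails sketch (the regex and helper names are illustrative, not a recommendation):

    # app/controllers/application_controller.rb
    class ApplicationController < ActionController::Base
      before_filter :detect_mobile

      private

      # Classify the device once and cache the answer in the session.
      def detect_mobile
        session[:mobile] ||= request.user_agent.to_s =~ /Mobile|Android|iPhone/i ? "1" : "0"
      end

      # Controllers and views can branch on this to skip mobile-unfriendly content.
      def mobile_device?
        session[:mobile] == "1"
      end
      helper_method :mobile_device?
    end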

    Read the article

  • Bump version before kicking off new development or when tagging a release, which is better?

    - by linquize
    Some projects bump the version before kicking off new development, while others bump the version when tagging a release. Which approach is better? If the version number is not changed at the start of the new phase, the developers may forget to change it and simply release the program. If the version number is changed before tagging the release, then the two version numbers (the tag and Makefile/AssemblyInfo.cs) do not match. git describe may give you v1.2.3.4-15-g1234567 if the current revision is after v1.2.3.4, but you have already changed the files to say v1.2.3.5.
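
    One common compromise, sketched here as plain git commands (not from the original question): tag the release first, then bump immediately afterwards, so the tag and the files disagree for only a single commit.

    git commit -am "Release v1.2.3.4"
    git tag v1.2.3.4
    git commit -am "Bump version to v1.2.3.5"   # update Makefile/AssemblyInfo.cs in this commit
    git push origin master --tags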

    Read the article

  • How do you handle a developer who has taken an "early retirement"?

    - by Amir Rezaei
    I have worked on many projects and have noticed that some people simply refuse to learn new technology and have no interest in it. They look down on every simple tool and technology. It's hard to understand how they got here in the first place. I have understanding for time spent on family and social activities, but I don't understand the lack of any interest at all; it's like being in the wrong business. I have read this question and I think the problem is the people. How do you handle a developer who has taken "early retirement" (unwilling to learn)? How do you motivate them? What is the term for people who refuse to learn new technology?

    Read the article

  • Scaling and new coordinates based off screen resolution

    - by Atticus
    I'm trying to deal with different resolutions on Android devices. Say I have a sprite at X,Y with dimensions W,H. I understand how to properly scale the width and height depending on the screen size, but I am not sure how to properly reposition the sprite so that it sits in the same relative area. If I simply resize the width and height and keep X,Y at the same coordinates, things don't scale well. How do I properly reposition? Multiply the coordinates by the scale as well?
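
    Multiplying the coordinates by the same scale factors as the dimensions is indeed the common approach. A minimal Java sketch, assuming a fixed design resolution the sprites were authored against (class and constant names are illustrative):

    public final class SpriteScaler {
        // Design resolution the sprite coordinates were authored against.
        static final float BASE_WIDTH = 800f;
        static final float BASE_HEIGHT = 480f;

        /** Returns {x, y, w, h} scaled for the actual screen. */
        public static float[] scale(float x, float y, float w, float h,
                                    float screenWidth, float screenHeight) {
            float scaleX = screenWidth / BASE_WIDTH;
            float scaleY = screenHeight / BASE_HEIGHT;
            // Positions must scale exactly like dimensions; otherwise a sprite
            // keeps its authored pixel offset and drifts on other screens.
            return new float[] { x * scaleX, y * scaleY, w * scaleX, h * scaleY };
        }
    }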

    Read the article
