Search Results

Search found 17285 results on 692 pages for 'incremental build'.

Page 17 of 692

  • Build Your Own CE6 Kernel

    - by Kate Moss' Big Fan
    The Shared Source Program in Windows CE provides many modules in the %_WINCEROOT%\Private\ tree, and the kernel is one of them! Although it is not the full kernel source, it is good enough for tracing through the kernel, and even for tweaking it. Tracing the kernel to see how it works is a lot of fun, but it is even more fascinating to modify it and verify your change. So, first things first: where is the kernel source? It is in %_WINCEROOT%\private\winceos\COREOS\nk\. The next question will be "How do I build it?" Some of you may say just run "build -c" there and it should be good. If you own the kernel and have its full source, that is definitely the right answer, but neither applies to our case. So what should you do? Let's dig deeper into the coreos\nk folder: there are a couple of subfolders - CELOG, KDSTUB, KERNEL and so on. KERNEL\ is the main component of kernel.dll; in other words, most modifications to the kernel are going to happen here. And the good news is that you can "build -c" in %_WINCEROOT%\private\winceos\COREOS\nk\kernel\ with no errors at all. But before doing that, remember to back up everything you are going to modify, both sources and binaries; this code does not belong to you, and if you don't restore it later, it could end up confusing subsequent QFE updates! Here are the steps:
    1. Back up the source code; I suggest the whole %_WINCEROOT%\private\winceos\COREOS\nk\.
    2. Back up the binaries in common\oak\lib\; again, if you are not sure which files, backing up the whole %_WINCEROOT%\common\oak\lib\ is the safest way.
    3. Make whatever modifications you want in %_WINCEROOT%\private\winceos\COREOS\nk\kernel\.
    4. Run "build -c" in %_WINCEROOT%\private\winceos\COREOS\nk\kernel.
    If everything went well, you should get a new nkmain.lib, nkmain.pdb, nkprmain.lib and nkprmain.pdb in %_WINCEROOT%\public\common\oak\lib\%_TGTCPU%\%WINCEDEBUG%\. You have just rebuilt your kernel; the rest is to run "blddemo clean -q" to have your new kernel SYSGEN'd and included in your OS image. Or, if you can't wait the extra minutes for "blddemo clean -q", just "set WINCEREL=1", then "sysgen -p common nk nkprof" and "makeimg".
    That sounds good, but some of you may not like the idea of altering any code in the private folder, not to mention how annoying it is to back up and restore files every time. A better idea? Yes: Microsoft provides a tool, SYSGEN_CAPTURE (see http://msdn.microsoft.com/en-us/library/ee504678.aspx for details and usage), which creates SOURCES files for public drivers that you want to modify and build in your platform directory. In fact, not only public drivers: virtually anything in the %_WINCEROOT%\public\<project name>\cesysgen\makefile can be captured, including, of course, the kernel. So here is a second way to build your own kernel, using the SYSGEN_CAPTURE tool. Again, the steps:
    1. Create a folder in your BSP for building the kernel, say %_TARGETPLATROOT%\SRC\Kernel.
    2. Run "SYSGEN_CAPTURE -p common nk" and you will get a SOURCES.KERN; you can also run "SYSGEN_CAPTURE -p common nkprof" to generate a profiler-enabled kernel.
    3. Rename SOURCES.KERN to SOURCES and copy one of the sample makefiles into your kernel directory, for example the one in PRIVATE\WINCEOS\COREOS\NK\KERNEL\NKNORMAL.
    4. Copy the source files you want to modify from private\winceos\coreos\nk\kernel\ into your kernel directory.
    5. Modify the SOURCES= macro to list the source files you added in step 4. For example, if you copied vm.c, it becomes SOURCES=vm.c.
    6. Refer to private\winceos\COREOS\nk\kernel\sources.inc and add the macro definitions and proper include paths to your SOURCES file.
    7. "set WINCEREL=1", run "build -c" in your kernel directory, then "makeimg" - voila!
    Here are the macros you need to add for x86:
        CDEFINES=$(CDEFINES) -DIN_KERNEL -DWINCEMACRO -DKERN_CORE
        # Machine independent defines
        CDEFINES=$(CDEFINES) -DDBGSUPPORT
        _COREOSROOT=$(_WINCEROOT)\private\winceos\coreos
        INCLUDES=$(_COREOSROOT)\inc;$(_COREOSROOT)\nk\inc
        !IFDEF DP_SETTINGS
        CDEFINES=$(CDEFINES) -DDP_SETTINGS=$(DP_SETTINGS)
        !ENDIF
        ASM_SAFESEH=1
        CDEFINES=$(CDEFINES) -Gs100000 -DENCODE_GS_COOKIE
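    Putting steps 5 and 6 together, the finished SOURCES file might look roughly like the sketch below. This is a minimal illustration only: the TARGETNAME value is hypothetical, and the real content generated by SYSGEN_CAPTURE on your tree takes precedence over anything shown here.
        # SOURCES (sketch) - captured by SYSGEN_CAPTURE, then edited
        TARGETNAME=kern          # hypothetical; keep what SYSGEN_CAPTURE generated
        TARGETTYPE=LIBRARY
        # Only the files actually modified; everything else still comes
        # from the libraries referenced by the captured file.
        SOURCES=vm.c
        # Macros and include paths taken from sources.inc (x86 shown above)
        CDEFINES=$(CDEFINES) -DIN_KERNEL -DWINCEMACRO -DKERN_CORE -DDBGSUPPORT
        _COREOSROOT=$(_WINCEROOT)\private\winceos\coreos
        INCLUDES=$(_COREOSROOT)\inc;$(_COREOSROOT)\nk\inc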

    Read the article

  • Why does Maven have such a bad rep?

    - by Dan
    There is a lot of talk on the internet about how Maven is bad. I have been using some features of Maven for a few years now, and the most important benefit in my view is the dependency management. Maven's documentation is less than adequate, but generally when I need to accomplish something I figure it out once and then it works (for example, I remember when I implemented signing the jars). I don't think that Maven is great, but it does solve some problems that would be a genuine pain without it. So, why does Maven have such a bad rep, and what problems with Maven can I expect in the future? Maybe there are much better alternatives that I don't know about? (For example, I have never looked at Ivy in detail.) NOTE: This is not an attempt to start an argument. It is an attempt to clear the FUD.

    Read the article

  • In Netbeans, how do I avoid wsimport rebuilding web service clients every build?

    - by gustafc
    I'm on a project where we use NetBeans (6.8). We use several different web services, which we have added as web service references, and NetBeans auto-generates the Ant wsimport scripts for us. Very handy, with one drawback: the web service clients are recompiled every time Ant is invoked. This slows down the build process considerably and has caused the number of sword-related injuries, maimings and deaths to skyrocket. Normally, I'd fix this by changing the wsimport element from
        <wsimport sourcedestdir="${build.generated.dir}/jax-wsCache/PonyService"
                  destdir="${build.generated.dir}/jax-wsCache/PonyService"
                  wsdl="${wsdl-PonyService}"
                  catalog="catalog.xml"
                  verbose="true"/>
    to
        <wsimport sourcedestdir="${build.generated.dir}/jax-wsCache/PonyService"
                  destdir="${build.generated.dir}/jax-wsCache/PonyService"
                  wsdl="${wsdl-PonyService}"
                  catalog="catalog.xml"
                  verbose="true">
            <produces dir="${build.generated.dir}/jax-wsCache/PonyService"/>
        </wsimport>
    But I can't, because this part of the Ant script is auto-generated. If I right-click the PonyService web service reference and select Edit Web Service Attributes ⇒ wsimport options, I can add attributes to the wsimport element, but not child elements. So: how do I add the produces child element to wsimport other than by hacking the auto-generated Ant script? Or, more generally: how do I make the NetBeans-generated wsimport not recompile the web service clients every time I build?
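    For reference, the standard Ant idiom for skipping work like this is an <uptodate> check combined with an unless attribute. The sketch below shows only the idiom, not the NetBeans answer: the target and property names are hypothetical, and wiring it in would still require overriding the generated target from your own build.xml.
        <project name="skip-wsimport-sketch" default="generate-client" basedir=".">
            <!-- Sets ponyservice.uptodate when the stamp file is newer
                 than the WSDL the client was generated from. -->
            <target name="check-client">
                <uptodate property="ponyservice.uptodate"
                          srcfile="wsdl/PonyService.wsdl"
                          targetfile="build/generated/jax-wsCache/PonyService/PonyService.stamp"/>
            </target>
            <!-- Skipped entirely when the property was set above. -->
            <target name="generate-client" depends="check-client"
                    unless="ponyservice.uptodate">
                <echo message="Regenerating web service client..."/>
                <!-- the real wsimport call would go here -->
                <touch file="build/generated/jax-wsCache/PonyService/PonyService.stamp"
                       mkdirs="true"/>
            </target>
        </project>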

    Read the article

  • How to add system property equivalent to java -D in Ant

    - by Shervin
    Hi. I need to set java -Djava.library.path=/some/path, and I want to do it when I am running my Ant script that builds my jar. I think I have to use <sysproperty key="java.library.path" value="/some/path"/>, but it doesn't work; I cannot make the syntax work. The only thing I have Googled and found is sysproperty used in conjunction with <java classname=...>, but that doesn't make any sense to me. I am not sure if this is relevant, but I am using Ant to create an EAR and deploying this EAR in JBoss.
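    For context, <sysproperty> is only meaningful inside tasks that launch a JVM, such as <java> or <junit>; it has no effect on how a jar or EAR is packaged, which is why it does nothing in a plain build target. A minimal sketch of its normal use (the class name and paths are placeholders):
        <project name="sysprop-sketch" default="run" basedir=".">
            <target name="run">
                <!-- fork="true" starts a fresh JVM, so java.library.path is
                     set at JVM startup, which is the only time it matters. -->
                <java classname="com.example.Main" fork="true">
                    <sysproperty key="java.library.path" value="/some/path"/>
                    <classpath path="build/classes"/>
                </java>
            </target>
        </project>
    For code deployed into JBoss, the property would instead have to be set on the JBoss JVM itself (for example in its startup configuration), not in the Ant build that packages the EAR.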

    Read the article

  • Workaround: building FBX in XNA raises OutOfMemoryException

    - by Vitus
    If you try to add a large FBX 3D model to an XNA project and build it, you can get an OutOfMemoryException build error like the following:
        Error 1: Building content threw OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
            at System.Collections.Generic.List`1.set_Capacity(Int32 value)
            at System.Collections.Generic.List`1.EnsureCapacity(Int32 min)
            at System.Collections.Generic.List`1.InsertRange(Int32 index, IEnumerable`1 collection)
            at Microsoft.Xna.Framework.Content.Pipeline.Graphics.VertexChannel`1.InsertRange(Int32 index, Int32 count)
            at Microsoft.Xna.Framework.Content.Pipeline.Graphics.VertexContent.InsertRange(Int32 index, IEnumerable`1 positionIndexCollection)
            at Microsoft.Xna.Framework.Content.Pipeline.Graphics.MeshBuilder.AddTriangleVertex(Int32 indexIntoVertexCollection)
            at Microsoft.Xna.Framework.Content.Pipeline.MeshConverter.FillNodeWithInfoFromMesh(KFbxNode* fbxNode, String name, KFbxGeometryConverter* geometryConverter)
            at Microsoft.Xna.Framework.Content.Pipeline.FbxImporter.ProcessInformationInNode(KFbxNode* fbxNode, String name, Boolean* partOfMainSkeleton, Boolean* warnIfBoneButNotChild)
            at Microsoft.Xna.Framework.Content.Pipeline.FbxImporter.ProcessNode(ValueType parentAbsoluteTransform, NodeContent potentialParent, KFbxNode* fbxNode, Boolean partOfMainSkeleton, Boolean warnIfBoneButNotChild)
            at Microsoft.Xna.Framework.Content.Pipeline.FbxImporter.ProcessNode(ValueType parentAbsoluteTransform, NodeContent potentialParent, KFbxNode* fbxNode, Boolean partOfMainSkeleton, Boolean warnIfBoneButNotChild)
            at Microsoft.Xna.Framework.Content.Pipeline.FbxImporter.Import(String filename, ContentImporterContext context)
            at Microsoft.Xna.Framework.Content.Pipeline.ContentImporter`1.Microsoft.Xna.Framework.Content.Pipeline.IContentImporter.Import(String filename, ContentImporterContext context)
            // additional calls here …
    My desktop PC has 8 GB of RAM, and Visual Studio's process devenv.exe uses under 2 GB of it during the build (about 3.5-4 GB of RAM is always free). Evidently devenv.exe cannot address more than 2 GB of RAM, and when that limit is exceeded, the build fails. My OS is 64-bit Windows, so I "charge" devenv.exe using the editbin.exe utility; in the VS Command Prompt I run the following:
        editbin "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe" /LARGEADDRESSAWARE
    This command edits the executable image to indicate that the application can handle addresses larger than 2 gigabytes. After that, the FBX file builds successfully! Of course, you must use the proper path to devenv.exe for your installation. If you are on 32-bit Windows, you need an additional step; more info here.
    P.S.: although you can now build bigger files than before, keep in mind that XNA has some restrictions on vertex buffer size and the like, depending on your current XNA project profile (Reach or HiDef). If your model's vertex buffer is larger than 64 MB (with the Reach profile), the model cannot be built and raises an error.

    Read the article

  • Custom Team Build Template for Microsoft Dynamics NAV in TFS 2010

    - by ssmantha
    To cook this recipe you need the following ingredients:
    1) An installation of the TFS 2010 Team Build service on a server
    2) Visual Studio 2010 for cooking
    3) The following hints on the web:
       a) http://www.codeproject.com/KB/library/AutoupateNAV.aspx - use this wrapper to perform the basic tasks
       b) http://www.richard-banks.org/2010/11/how-to-build-linux-code-with-tfs-2010.html - for ideas on how to customize the build templates
    And finally, a lot of patience and luck; it took me about 120 failed builds to get the first one right!! Please feel free to ask questions, I would be happy to help!!

    Read the article

  • Install build-essential in Ubuntu 12.04

    - by Mukul Shukla
    After I install a fresh copy of Ubuntu and need to install build-essential, I have to type sudo apt-get update and sudo apt-get upgrade before installing it. These two commands take a LOT of time and install many things. Is there a way to install build-essential without running these two commands, or a way to stop them installing all the updates so that they take less time?
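    For what it's worth, apt-get update only refreshes the package index and installs nothing, and apt-get upgrade is not required before installing a package at all. So the minimal sequence is just:
        sudo apt-get update                     # refresh package lists only; installs nothing
        sudo apt-get install build-essential    # no 'upgrade' needed first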

    Read the article

  • run buildbot on Windows XP

    - by chrmue
    I recently stumbled over buildbot and wanted to give it a try. My problem is that I have to run it under Windows, because we don't use Linux on workstations or servers in my company. I've already tried different installations:
    - Python 2.6, Twisted-9.0.0-py2.6, buildbot 0.7.12
    - Python 2.6, pywin32-214-py2.6, Twisted-9.0.0-py2.6, buildbot 0.7.12
    - Python 2.4, pywin32-214-py2.4, Twisted-9.0.0-py2.4, buildbot 0.7.12
    and tried to run each in a Windows XP VM. In every installation I ran the buildbot test suite and got several errors, although the buildbot documentation says that no test should fail. Does anybody here have experience with buildbot under Windows? Is it worth the pain, or do I have to use Linux?

    Read the article

  • MSBuild file for deployment process

    - by Lee Englestone
    I could do with some pointers, code examples or references that may help me do the following in an MSBuild file, to help speed up the deployment process. This scenario involves getting a developer's 'local' version onto a 'development' server:
    1. Increment the developer's local web application assembly version number
    2. Publish the developer's local web application files somewhere
    3. .rar the published files or folder in the format v[IncrementedAssemblyNumber].rar
    4. Copy the .rar somewhere
    5. Back up (.rar) the existing live website folder (located elsewhere) in the format Pre_v[IncrementedAssemblyNumber].rar
    6. Move the backed-up .rar to a /Backup folder
    7. Overwrite the development web files with the published local web files
    Should be simple for all those MSBuild gurus out there. Like I said, answers or good, applicable links would be much appreciated. Also, I'm thinking of getting one of the MSBuild books; from what I can tell there are 2, possibly 3, contenders. I am not using TFS. Can anyone recommend a book for beginning MSBuild, ideally from people that have read more than one book on the subject? Cheers, -- Lee
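    As a rough starting point, a skeleton for steps 3-7 might look like the sketch below (all paths, the rar.exe invocation and the version property are placeholder assumptions; the versioning and publish steps are elided because they depend on the project type):
        <!-- deploy.proj (sketch) - run with: msbuild deploy.proj /t:Deploy -->
        <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
          <PropertyGroup>
            <NewVersion>1.0.0.42</NewVersion>                 <!-- placeholder: produced by the versioning step -->
            <PublishDir>C:\publish\MyWebApp</PublishDir>      <!-- placeholder -->
            <LiveDir>\\devserver\wwwroot\MyWebApp</LiveDir>   <!-- placeholder -->
            <BackupDir>\\devserver\Backup</BackupDir>         <!-- placeholder -->
          </PropertyGroup>
          <Target Name="Deploy">
            <!-- steps 5 and 6: back up the existing live site -->
            <Exec Command='rar.exe a "$(BackupDir)\Pre_v$(NewVersion).rar" "$(LiveDir)\*"'/>
            <!-- steps 3 and 4: archive the freshly published files -->
            <Exec Command='rar.exe a "$(BackupDir)\v$(NewVersion).rar" "$(PublishDir)\*"'/>
            <!-- step 7: push the published files over the dev site -->
            <ItemGroup>
              <PublishedFiles Include="$(PublishDir)\**\*.*"/>
            </ItemGroup>
            <Copy SourceFiles="@(PublishedFiles)"
                  DestinationFolder="$(LiveDir)\%(RecursiveDir)"/>
          </Target>
        </Project>
    Note that an ItemGroup inside a Target requires MSBuild 3.5 or later; on 2.0 the item group would have to be declared at project level instead.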

    Read the article

  • Build Your Own Adapter For Cheap Mains Power on Portable Devices

    - by Jason Fitzpatrick
    If you’re looking for a way to build a battery-to-wall-power adapter for one of your portable devices, this tutorial can serve as a template for your DIY adventures. Mike Worth wanted an outlet adapter for his Canon camera, but Canon wanted $75 for it. Not looking to spend that kind of cash on a very simple adapter, he set out to build his own. The build is quite simple, consisting of a transformer with the proper voltage and a set of dummy battery casings with thumb tacks and washers to serve as the negative and positive leads. Hit up the link below to see the full build. Making a Mains Adapter [via Hack A Day]

    Read the article

  • How to make distributed builds using XCode 3.2 on OS X 10.6

    - by Sorin Sbarnea
    After I upgraded, via a clean install, from OS X 10.5 to 10.6.2 and upgraded Xcode to 3.2.1, I was no longer able to use the distributed builds feature. There are several issues that I detected: In most cases Bonjour does not detect the other computers, even though they are on the same switch. I added a custom 'set' where I manually added the IP addresses of each computer. Even so, I still get the status "unreachable" on them. BTW, ping works without problems. Both the 'Share my computer for shared workgroup builds (distcc)' and 'Distribute builds via shared workgroup builds' options are checked.

    Read the article

  • Maven: add a dependency to a jar by relative path

    - by flybywire
    I have a proprietary jar that I want to add to my pom as a dependency, but I don't want to add it to a repository. The reason is that I want my usual Maven commands, such as mvn compile, to work out of the box (without demanding that the developers add it to some repository themselves). I want the jar to live in a 3rdparty lib directory under source control, and to link to it by relative path from the pom.xml file. Can this be done? How?
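    One common way to do this (with the caveat that Maven officially discourages system scope) is a system-scoped dependency with a relative systemPath; the coordinates below are placeholders:
        <!-- pom.xml fragment: the jar is checked in at <project root>/3rdparty/lib/ -->
        <dependency>
            <groupId>com.vendor</groupId>               <!-- placeholder -->
            <artifactId>proprietary-lib</artifactId>    <!-- placeholder -->
            <version>1.0</version>
            <scope>system</scope>
            <systemPath>${project.basedir}/3rdparty/lib/proprietary-lib-1.0.jar</systemPath>
        </dependency>
    An alternative that plays better with packaging and transitive resolution is to keep a small file-based repository inside the project (declared with a file:// URL in <repositories>) and install the jar into it once with mvn install:install-file.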

    Read the article

  • iOS build machine setup: problem with certificates

    - by cbrulak
    Some background:
    - we work with multiple team mates
    - each of us works on our own MBP
    - I'm setting up a build machine that we can git push to in order to generate a build (the aim is to allow anyone to push to the build machine, which then generates an archive, uploads it to TestFlight and sends it on its way)
    The problem: getting my Apple developer certificates onto the build machine. I installed Lion, Xcode, etc., and I signed into my developer account through the Xcode organizer; the provisioning profiles download, etc., but beside each one it says: valid signing identity not found. I also downloaded my certificate from the developer.apple.com page, imported it into Keychain, etc., but no luck. Anyone else have a similar issue? Or maybe hints for a fix? Thanks

    Read the article

  • Toolset agnostic build server and Silverlight projects

    - by Marko Apfel
    Problem
    Normally I try to keep my continuous integration as toolset-agnostic as possible, to ensure that no locally installed tools can have an impact on my build. My Silverlight app references a special compile target in a folder outside my developer tree:
        <Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" />
    So I copied the contents of this folder into a local one and changed the call to this target in my csproj:
        <Import Project="..\..\..\tools\WebApplications\Microsoft.WebApplication.targets" />
    And now the Visual Studio Conversion Wizard welcomes me every time.
    Solution
    No matter which form I write, the conversion wizard comes back again and again unless the line reads exactly
        <Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" />
    So it seems that there is no simple way to change this behaviour.
    Workaround
    I have to accept that this line must stay in the csproj and that, to run the build, the toolset must be copied to the build server at the corresponding location. So go to your development machine where Visual Studio is installed and copy the folder "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications" to your build server at the equivalent location.
    Xmas wishes to Microsoft: please provide technologies that let us developers bundle everything a project needs in one developer tree. It should be possible for one checkout to start us up! No additional installations, regardless of whether it is a development machine or a dedicated build or continuous integration server. Silverlight is only one example; code analysis configurations can be just as terrible, and much more …

    Read the article

  • Launchpad fails to build a package for my PPA

    - by AZorin
    I'm trying to build a package on Launchpad's Debian build system for PPAs, but I'm having some issues with a certain package. The package I'm trying to build (zorin-xwinwrap) contains a source C file which I'm trying to get to compile and build on Launchpad's servers, so that it installs and works on 32-bit (i386) and 64-bit (amd64) systems. Unfortunately I keep getting an Error code 2 from the debian/rules file, and I have no clue how to fix this issue. The following link is the source package of the software I'm trying to add to my PPA: http://goo.gl/GjZvd The following link is the build log for the failed package on Launchpad: http://goo.gl/6A2rQ I would greatly appreciate any suggestions. Thank you for your time.
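    As an aside, "Error 2" from make only means that a command it ran failed; the real cause is earlier in the build log. For comparison, a minimal modern debhelper-based debian/rules, which sidesteps most hand-written-rules mistakes, looks like this (requires debhelper 7 or later in Build-Depends, and the line under "%:" must be indented with a tab):
        #!/usr/bin/make -f
        # Delegate every target (build, binary, clean, ...) to debhelper.
        %:
        	dh $@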

    Read the article

  • Do you use Phing?

    - by Sam McAfee
    Does anyone use Phing to deploy PHP applications, and if so, how do you use it? We currently have a hand-written "setup" script that we run whenever we deploy a new instance of our project. We just check out from SVN and run it. It sets some basic configuration variables, installs or reloads the database, and generates a vhost for the site instance. I have often thought that maybe we should be using Phing. I haven't used Ant much, so I don't have a real sense of what Phing is supposed to do other than script the copying of files from one place to another, much as our setup script does. What are some more advanced uses that you can give examples of, to help me understand why we would or would not want to integrate Phing into our process?
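    For a sense of what a Phing buildfile looks like, it is structured like an Ant one: XML targets and tasks. A minimal deploy sketch, with all paths and property values as placeholders, might be:
        <?xml version="1.0" encoding="UTF-8"?>
        <!-- build.xml (sketch) - run with: phing deploy -->
        <project name="myapp" default="deploy">
            <property name="deploy.dir" value="/var/www/myapp"/>  <!-- placeholder -->
            <target name="deploy">
                <!-- copy the checked-out source into the vhost directory -->
                <copy todir="${deploy.dir}">
                    <fileset dir="./src">
                        <exclude name="**/*Test.php"/>
                    </fileset>
                </copy>
            </target>
        </project>
    Beyond file copying, the usual selling points are built-in tasks for things a setup script does by hand (templated config files, database loading, linting, running tests) and the fact that the buildfile is declarative and versioned with the code.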

    Read the article

  • How do I build mDNSResponder?

    - by Alex
    I have tried checking out the mDNSResponder source from Apple's SVN host, with the thought of compiling it and tweaking it. This failed miserably. Here are the last lines of the output of:
        cd trunk
        SRCROOT=. make
    I get the same error for several tags in the SVN tree, so I'm not sure if something is wrong on my end.
        The following build commands failed:
        mDNSResponder:
            CompileC mDNSResponder.build/mDNSResponder.build/Objects-normal/i386/mDNSMacOSX.o /Users/myname/Desktop/mDNSResponder/trunk/mDNSMacOSX/mDNSMacOSX.c normal i386 c com.apple.compilers.gcc.4_2
            PhaseScriptExecution "Run Script" /Users/myname/Desktop/mDNSResponder/trunk/mDNSMacOSX/mDNSResponder.build/mDNSResponder.build/Script-D284BE6C0ADD80740027CCDF.sh
        mDNSResponder debug:
            CompileC "mDNSResponder.build/mDNSResponder debug.build/Objects-normal/i386/mDNSMacOSX.o" /Users/myname/Desktop/mDNSResponder/trunk/mDNSMacOSX/mDNSMacOSX.c normal i386 c com.apple.compilers.gcc.4_2
        Build Some:
            PhaseScriptExecution "Run Script" "/Users/myname/Desktop/mDNSResponder/trunk/mDNSMacOSX/mDNSResponder.build/Development/Build Some.build/Script-FF045B6A0C7E4AA600448140.sh"
        (4 failures)

    Read the article

  • Duplicity Full Backup Lifetime and Efficiency

    - by Tim Lytle
    I'm trying to work up a backup strategy for some clients, and am leaning towards duplicity for remote backup (we already use rdiff-backup for internal/on-location backups). Is it reasonable to want a full backup every so often? Since duplicity increments forward, each incremental backup relies on the previous increment, and all rely heavily on the last full backup; should that become corrupt, bad things happen. A related question: does duplicity test the incremental backups for consistency? Assuming I do want a full backup every so often, how efficiently does duplicity create one? Can/does it check file signatures and copy unchanged data from previous full backups/increments, basically creating a new 'full' archive by transferring only new/changed data and merging in existing unchanged data? Right now my concern is that running a full backup is needed, but the consistently large bandwidth use of full backups will make this unreasonable for some clients.
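    For reference, full backups, increments and verification are all first-class duplicity actions; a typical weekly-full/daily-incremental rotation might look like this sketch (the source path and target URL are placeholders):
        # Weekly: force a fresh full backup rather than another increment
        duplicity full /home/data sftp://backup@host/backups/data
        # Daily: duplicity defaults to incremental once a full exists
        duplicity incremental /home/data sftp://backup@host/backups/data
        # Compare the latest backup against the live files
        duplicity verify sftp://backup@host/backups/data /home/data
        # Keep only the chains of the last two full backups
        duplicity remove-all-but-n-full 2 --force sftp://backup@host/backups/data
    Note that, as far as I know, the full action re-reads and re-uploads everything from the source; duplicity does not synthesize a new full archive out of previous increments, which is exactly the bandwidth concern raised above.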

    Read the article

  • Symantec NetBackup restore - Incremental backup

    - by w0051977
    We are using NetBackup as a corporate solution. Incremental backups are taken daily during the week, and then a weekly full backup is done at the weekend (Saturday). My colleague has restored a folder to how it stood at 14:00 on a Tuesday. The problem is that the restore is pulling in files from the weekend backup even if they did not exist at the point in time of the restore. For example, the folder we are restoring should look like this (this is how it looked on Tuesday at 14:00):
        Folder1
            Test.txt
            Test1.txt
            Test2.txt
    This is how the folder looked at the weekend, when the full backup ran (Test3.txt existed at that point):
        Folder1
            Test.txt
            Test1.txt
            Test2.txt
            Test3.txt
    The actual restored folder looks like this:
        Folder1
            Test.txt
            Test1.txt
            Test2.txt
            Test3.txt
    Test3.txt should not have been restored, because it did not exist at the point in time of the restore. Is there a setting somewhere that we are missing? The folder in question is 200 GB; the example above is a simplification. I realise this is a basic question.

    Read the article

  • SQL SERVER – What is Incremental Statistics? – Performance improvements in SQL Server 2014 – Part 1

    - by Pinal Dave
    This is the first part of the series on Incremental Statistics. Here is the index of the complete series:
    1. What is Incremental Statistics? – Performance improvements in SQL Server 2014 – Part 1
    2. Simple Example of Incremental Statistics – Performance improvements in SQL Server 2014 – Part 2
    3. DMV to Identify Incremental Statistics – Performance improvements in SQL Server 2014 – Part 3
    Statistics are considered one of the most important aspects of SQL Server performance tuning. You have probably often heard the phrase "Update statistics before you take any other steps to tune performance." Honestly, I have said that many, many times, and I have personally updated statistics before starting any performance tuning exercise. You may agree or disagree with the point, but there is no denying that statistics play an extremely vital role in performance tuning.
    SQL Server 2014 has a new feature called Incremental Statistics. I have been playing with this feature for quite a while and I find it very interesting. After spending some time with it, I decided to write about the subject here.
    New in SQL Server 2014 – Incremental Statistics
    It seems like lots of people want to start using SQL Server 2014's new Incremental Statistics feature. However, let us first understand what this feature actually does and how it can help. I will try to explain it simply before moving on to the demo code.
    Code for all versions of SQL Server
    Here is code which you can execute on all versions of SQL Server, and which will update the statistics of your table. The keywords to pay attention to are WITH FULLSCAN: this scans the entire table and builds brand new statistics that the query optimizer can use for better estimation of your execution plan.
        UPDATE STATISTICS TableName(StatisticsName) WITH FULLSCAN
    Who should learn about this? Why?
    If you are using partitions in your database, you should consider implementing this feature; otherwise, it does not really apply to you. If you are using a single partition and your table data is in a single place, you still have to update your statistics the same way you always have. If you are using multiple partitions, this may be a very useful feature. In most cases, users have multiple partitions because they have lots of data in their table; each partition holds its own data, and it is very common for each partition to be populated separately.
    Real World Example
    For example, if your table contains sales data, you will have plenty of entries in it. It can be a good idea to partition such a table, for example into 3 semesters, 4 quarters or even 12 months. Let us assume we have divided our table into 12 partitions: for the month of January the first partition is populated, and for the month of February the second. Now assume you have plenty of data in your first and second partitions, the month of March has just started, and your third partition has begun to populate. If you want to update your statistics at this point, what will you do?
    In SQL Server 2012 and earlier versions
    You would just use the WITH FULLSCAN code above and update statistics for the entire table. That means that even though only the third partition has new data, you still update statistics across the whole table. This is a VERY resource-intensive process, as you are updating the statistics of partitions 1 and 2 where the data has not changed at all.
    In SQL Server 2014
    You can update just partition 3. There is a special syntax that lets you specify which partition to update. The impact is that SQL Server smartly merges the new partition's data with the old statistics and updates the overall statistics without doing a FULLSCAN of the entire table. This has a huge impact on performance.
    Remember that the new feature in SQL Server 2014 does not change anything besides the ability to update a single partition. However, there is one additional attractive detail: previously, a statistics update was triggered when 20% of the table's data changed. Now the same threshold applies per partition, so if a partition sees a 20% data change, it triggers a partition-level statistics update which, when merged into your final statistics, gives you better performance.
    In summary
    If you are not using partitions, this feature does not apply to you. If you are using partitions, this feature can be very helpful.
    Tomorrow: we will see working code for SQL Server 2014 Incremental Statistics.
    Reference: Pinal Dave (http://blog.sqlauthority.com)
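    Anticipating tomorrow's demo, the SQL Server 2014 syntax looks like the following sketch (the table, column, statistics name and partition number are placeholders):
        -- Create statistics that are tracked per partition (SQL Server 2014+)
        CREATE STATISTICS StatSales
        ON Sales (SalesDate)
        WITH INCREMENTAL = ON;
        -- Refresh only partition 3 (e.g. March); the statistics of
        -- partitions 1 and 2 are merged back in without being rescanned.
        UPDATE STATISTICS Sales (StatSales)
        WITH RESAMPLE ON PARTITIONS (3);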

    Read the article

  • How do I install PyYAML into local install of Python?

    - by Daryl Spitzer
    I've installed Python 2.6.4 into (a subdirectory in) my home directory on a Linux machine that has Python 2.3.4 pre-installed, because I need to run some code that I've decided would require too much work to make run on 2.3.4. (I'm not on the sudoers list for that machine.) I was hoping I could run ~/Python-2.6.4/python setup.py install (from the PyYAML directory in my home directory, where I untarred the PyYAML sources) and it would be smart enough to install into my local Python 2.6.4 installation. But it's not. (See the P.S.) Is it possible to install PyYAML into my local Python install, so that "import yaml" works when I invoke that Python? If so, how do I do that?
    P.S. Here's the output when I ran ~/Python-2.6.4/python setup.py install:
        running install
        running build
        running build_py
        creating build/lib.linux-ppc64-2.6
        creating build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/composer.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/nodes.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/dumper.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/resolver.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/events.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/emitter.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/error.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/loader.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/cyaml.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/scanner.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/__init__.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/serializer.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/reader.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/representer.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/constructor.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/tokens.py -> build/lib.linux-ppc64-2.6/yaml
        copying lib/yaml/parser.py -> build/lib.linux-ppc64-2.6/yaml
        running build_ext
        creating build/temp.linux-ppc64-2.6
        checking if libyaml is compilable
        gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/home/dspitzer/Python-2.6.4/Include -I/home/dspitzer/Python-2.6.4 -c build/temp.linux-ppc64-2.6/check_libyaml.c -o build/temp.linux-ppc64-2.6/check_libyaml.o
        build/temp.linux-ppc64-2.6/check_libyaml.c:2:18: yaml.h: No such file or directory
        build/temp.linux-ppc64-2.6/check_libyaml.c: In function `main':
        build/temp.linux-ppc64-2.6/check_libyaml.c:5: error: `yaml_parser_t' undeclared (first use in this function)
        build/temp.linux-ppc64-2.6/check_libyaml.c:5: error: (Each undeclared identifier is reported only once
        build/temp.linux-ppc64-2.6/check_libyaml.c:5: error: for each function it appears in.)
        build/temp.linux-ppc64-2.6/check_libyaml.c:5: error: syntax error before "parser"
        build/temp.linux-ppc64-2.6/check_libyaml.c:6: error: `yaml_emitter_t' undeclared (first use in this function)
        build/temp.linux-ppc64-2.6/check_libyaml.c:8: warning: implicit declaration of function `yaml_parser_initialize'
        build/temp.linux-ppc64-2.6/check_libyaml.c:8: error: `parser' undeclared (first use in this function)
        build/temp.linux-ppc64-2.6/check_libyaml.c:9: warning: implicit declaration of function `yaml_parser_delete'
        build/temp.linux-ppc64-2.6/check_libyaml.c:11: warning: implicit declaration of function `yaml_emitter_initialize'
        build/temp.linux-ppc64-2.6/check_libyaml.c:11: error: `emitter' undeclared (first use in this function)
        build/temp.linux-ppc64-2.6/check_libyaml.c:12: warning: implicit declaration of function `yaml_emitter_delete'
        libyaml is not found or a compiler error: forcing --without-libyaml (if libyaml is installed correctly, you may need to specify the option --include-dirs or uncomment and modify the parameter include_dirs in setup.cfg)
        running install_lib
        creating /usr/local/lib/python2.6
        error: could not create '/usr/local/lib/python2.6': Permission denied
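    The final line is the giveaway: distutils is trying to install into the default prefix (/usr/local) rather than into the home-directory Python. Two sketches of the usual fixes (paths assume the layout described above):
        # Option 1: per-user site-packages (PEP 370, new in Python 2.6)
        ~/Python-2.6.4/python setup.py install --user
        # Option 2: install under an explicit prefix you own
        ~/Python-2.6.4/python setup.py install --prefix=$HOME/Python-2.6.4
    With option 2, make sure the resulting lib/python2.6/site-packages directory is actually on that interpreter's sys.path (set PYTHONPATH if it is not). The libyaml errors above are harmless here; PyYAML falls back to its pure-Python implementation.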

    Read the article

  • Visual Studio: How to override the default "Build Action" for certain extension types per project or solution?

    - by Lukasz Podolak
    I'm serving my ASP.NET MVC views from many assemblies and copying the views to the main application in a post-build event. This works; however, I realized that when I change something in a view and just hit F5, the changes are not included. To see the changes I have to save, build (explicitly clicking), and then hit F5, which is a pretty annoying workflow. I discovered that setting the Build Action of a view to "Embedded Resource" also solves the problem, but other devs may not remember that they have to do this after adding every view to the solution. Is there a way to override the default build action for certain file extensions, such as *.aspx and *.ascx, per project or (better) per solution? What I've found is the ability to add this setting globally, per machine, but I do not want to do that (link: http://blog.andreloker.de/post/2010/07/02/Visual-Studio-default-build-action-for-non-default-file-types.aspx). Any ideas?

    Read the article
