Search Results

Search found 21343 results on 854 pages for 'pass by reference'.


  • Complete your feedback to win a free registration to the PASS summit or an XBox

    - by simonsabin
    Don't forget that if you complete your session and conference feedback for SQLBits then you will be entered into a draw for an XBox Super Elite. Not only that, we also have a registration for the PASS Summit in November this year to give away. The survey is essential for us to make the SQLBits conference better. If you don't tell us what doesn't work, then we can't fix it. We listened this time and gave you better signage and more information in your agenda about sessions and the abstracts. So please...(read more)

    Read the article

  • Generic Handler vs Direct Reference

    - by JNF
    In a project where I'm working on the data access layer, I'm trying to decide how to hand data and objects to the next layer (and the next programmer). Is it better to tell him to reference my DLL, or should I build a generic handler and let him take the objects from there (i.e., in JSON format)? If I understand correctly, in case 2 he would have to handle the objects on his own, whereas in case 1 he would have the entities I've built. Note: it is very probable that other people will need to take the same data, though we're not up to that yet. Same question here - should I make it into a web service, or have them access the handler?

    Read the article

  • Source cross reference like LXR but written in PHP?

    - by Dan
    Hi, does someone know a good cross-reference engine like LXR, but written in PHP? My provider has PHP and MySQL as well as Postgres DBs, but I have access to neither Perl nor SSH. I'd like to put up an online cross reference for my software, which is written in C. Thanks for helping! Dan

    Read the article

  • C++0x Overload on reference, versus sole pass-by-value + std::move?

    - by dean
    It seems the main advice concerning C++0x's rvalues is to add move constructors and move operators to your classes, until compilers default-implement them. But waiting is a losing strategy if you use VC10, because automatic generation probably won't arrive until VC10 SP1 or, in the worst case, VC11. Likely, the wait will be measured in years. Here lies my problem. Writing all this duplicate code is not fun, and it's unpleasant to look at. But this is a burden well received for those classes deemed slow. Not so for the hundreds, if not thousands, of smaller classes. ::sighs:: C++0x was supposed to let me write less code, not more! And then I had a thought, shared by many, I would guess: why not just pass everything by value? Won't std::move + copy elision make this nearly optimal?
    Example 1 - Typical pre-0x constructor:
        OurClass::OurClass(const SomeClass& obj) : obj(obj) {}
        SomeClass o;
        OurClass(o);              // single copy
        OurClass(std::move(o));   // single copy
        OurClass(SomeClass());    // single copy
    Cons: a wasted copy for rvalues.
    Example 2 - Recommended C++0x?
        OurClass::OurClass(const SomeClass& obj) : obj(obj) {}
        OurClass::OurClass(SomeClass&& obj) : obj(std::move(obj)) {}
        SomeClass o;
        OurClass(o);              // single copy
        OurClass(std::move(o));   // zero copies, one move
        OurClass(SomeClass());    // zero copies, one move
    Pros: presumably the fastest. Cons: lots of code!
    Example 3 - Pass-by-value + std::move:
        OurClass::OurClass(SomeClass obj) : obj(std::move(obj)) {}
        SomeClass o;
        OurClass(o);              // single copy, one move
        OurClass(std::move(o));   // zero copies, two moves
        OurClass(SomeClass());    // zero copies, one move
    Pros: no additional code. Cons: a wasted move in cases 1 and 2; performance will suffer greatly if SomeClass has no move constructor.
    What do you think? Is this correct? Is the incurred move a generally acceptable loss when compared to the benefit of code reduction?
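    A self-contained sketch of Example 3 (an illustration only, assuming a compiler with complete C++11 move support rather than VC10's partial implementation; the class names mirror the question):
        #include <string>
        #include <utility>

        struct SomeClass {
            std::string data;                  // movable member, so SomeClass gets an implicit move constructor
        };

        class OurClass {
            SomeClass obj;
        public:
            explicit OurClass(SomeClass o) : obj(std::move(o)) {}   // pass by value, then move into the member
        };

        int main() {
            SomeClass o;
            OurClass a(o);                     // lvalue: one copy into the parameter, one move into the member
            OurClass b(std::move(o));          // rvalue: one move into the parameter, one move into the member
            OurClass c((SomeClass()));         // temporary: copy elided; extra parentheses avoid the most vexing parse
        }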

    Read the article

  • Are reference attributes destroyed when class is destroyed in C++?

    - by Genba
    Suppose I have a C++ class with an attribute that is a reference:
        class ClassB {
            ClassA &ref;
        public:
            ClassB(ClassA &_ref);
        };
    Of course, the constructor is defined this way:
        ClassB::ClassB(ClassA &_ref) : ref(_ref) { /* ... */ }
    My question is: when an instance of class 'ClassB' is destroyed, is the object referenced by 'ClassB::ref' also destroyed?
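    A small sketch of the behaviour being asked about (the class names mirror the question; this is an illustration, not part of it): destroying a ClassB does not destroy the ClassA it refers to. The reference member simply ceases to exist, while the referenced object's lifetime remains governed by its own scope or owner.
        #include <iostream>

        struct ClassA {
            ~ClassA() { std::cout << "ClassA destroyed\n"; }
        };

        class ClassB {
            ClassA &ref;
        public:
            explicit ClassB(ClassA &_ref) : ref(_ref) {}
        };

        int main() {
            ClassA a;
            {
                ClassB b(a);
            }                                           // b is destroyed here; only the reference goes away
            std::cout << "ClassB gone, ClassA still alive\n";
        }                                               // "ClassA destroyed" prints only now, when a leaves scope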

    Read the article

  • Does binding temporary to a reference require a copy constructor in C++?

    - by vitaut
    Consider the following code:
        class A {
            A(const A&);
        public:
            A() {}
        };

        int main() {
            const A &a = A();
        }
    This code compiles fine with GCC, but fails to compile with Visual C++ with the following error:
        test.cc(8) : error C2248: 'A::A' : cannot access private member declared in class 'A'
                test.cc(2) : see declaration of 'A::A'
                test.cc(1) : see declaration of 'A'
    So is it necessary to have a copy constructor accessible when binding a temporary to a reference?
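    For context (my reading of the rules, not part of the question): C++03 required an accessible copy constructor when initialising a const reference from a temporary, even though the copy is normally elided, which is the rule Visual C++ enforces here; C++11 removed that requirement. A minimal variant that both compilers should accept simply makes the copy constructor accessible:
        class A {
        public:
            A() {}
            A(const A&) {}      // accessible copy constructor satisfies the C++03 rule
        };

        int main() {
            const A &a = A();   // OK: the temporary's lifetime is extended to match 'a'
            (void)a;            // silence the unused-variable warning
        }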

    Read the article

  • Migration of VM from Hyper-V to Hyper-V R2 - Pass through disks

    - by Andrew Gillen
    I am trying to migrate a VM which is using two pass-through disks from a legacy Hyper-V cluster to a new R2 cluster. The migrated VM cannot use the pass-through disks, though. The guest OS (2008 R2) doesn't seem to like the disk and eventually tries to format it instead of mounting it. The migration process I have been using for all my VMs is to export the VM to a new LUN, then add that new LUN to the new cluster, import the VM off it in the Hyper-V console, then make it highly available. I assumed I could do the same thing and just add the two pass-through disks to the new cluster and then attach them inside Hyper-V. Is there a process I need to follow to migrate pass-through disks that does not involve setting up new LUNs and robocopying the data over?

    Read the article

  • Dynamically reference a Named Table Column via cell content in Excel

    - by rcphq
    How do I reference an Excel Table column dynamically in Excel 2007? That is, I want to reference a named column of a named table, and which column it is will vary with the value of a cell. I have a Table in Excel (let's call it Table1). I want to reference one of its columns (let's call it column1) dynamically from a value in another cell (A1) so that I can achieve the following result: when I change A1, the formula that counts Table1[DynamicallyReferencedColumnName] gets updated to the new reference. I tried using =Count(Table1[INDIRECT("$A$1")]) but Excel says the formula contains an error. Example: if A1 = names then the formula would equal Count(Table1[names]); if A1 = lastname then the formula would equal Count(Table1[lastname]).

    Read the article

  • ILMerge - Unresolved assembly reference not allowed: System.Core

    - by Steve Michelotti
    ILMerge is a utility which allows you to merge multiple .NET assemblies into a single binary assembly for more convenient distribution. Recently we ran into problems when attempting to use ILMerge on a .NET 4 project. We received the error message:
        An exception occurred during merging: Unresolved assembly reference not allowed: System.Core.
            at System.Compiler.Ir2md.GetAssemblyRefIndex(AssemblyNode assembly)
            at System.Compiler.Ir2md.GetTypeRefIndex(TypeNode type)
            at System.Compiler.Ir2md.VisitReferencedType(TypeNode type)
            at System.Compiler.Ir2md.GetMemberRefIndex(Member m)
            at System.Compiler.Ir2md.PopulateCustomAttributeTable()
            at System.Compiler.Ir2md.SetupMetadataWriter(String debugSymbolsLocation)
            at System.Compiler.Ir2md.WritePE(Module module, String debugSymbolsLocation, BinaryWriter writer)
            at System.Compiler.Writer.WritePE(String location, Boolean writeDebugSymbols, Module module, Boolean delaySign, String keyFileName, String keyName)
            at System.Compiler.Writer.WritePE(CompilerParameters compilerParameters, Module module)
            at ILMerging.ILMerge.Merge()
            at ILMerging.ILMerge.Main(String[] args)
    It turns out that this issue is caused by ILMerge.exe not being able to find the .NET 4 framework by default. The answer was ultimately found here. You either have to use the /lib option to point to your .NET 4 framework directory (e.g., "C:\Windows\Microsoft.NET\Framework\v4.0.30319" or "C:\Windows\Microsoft.NET\Framework64\v4.0.30319") or just use an ILMerge.exe.config file that looks like this:
        <configuration>
          <startup useLegacyV2RuntimeActivationPolicy="true">
            <requiredRuntime safemode="true" imageVersion="v4.0.30319" version="v4.0.30319"/>
          </startup>
        </configuration>
    This was able to successfully resolve my issue.
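    For anyone taking the /lib route instead, the invocation would look roughly like the following (the assembly names are placeholders, not taken from the original post):
        ILMerge.exe /lib:"C:\Windows\Microsoft.NET\Framework64\v4.0.30319" /out:Merged.dll Primary.dll Dependency.dll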

    Read the article

  • SQLAuthority News – 18 Seconds of Fame – My PASS Experience

    - by pinaldave
    Happy Holidays to All of YOU! Life is full of little and happy surprises; I think Christmas and Santa are based on that. I just received a very interesting email earlier today, and I had no idea about it. Earlier this year, I visited Seattle to attend SQLPASS - read the complete summary over here: SQLAuthority News – SQLPASS Nov 8-11, 2010-Seattle – An Alternative Look at Experience. While I was walking down, someone stopped me and asked if they could talk to me for 15 seconds; I said yes, and they shot a quick movie with a mobile. The conversation was very quick and I had forgotten about it. Today I received an email from one of the blog readers about it being on YouTube. Honestly, I did not know if this was ever going to be on YouTube. I am surprised and thrilled. Watch my 18 seconds of fame movie. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: About Me, Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, T SQL, Technology

    Read the article

  • Pass Extra Parameters to JavaScript Callback Function

    - by BRADINO
    Here is a simple example of a function that takes a callback function as a parameter.
        query.send(handleQueryResponse);

        function handleQueryResponse(response) {
            alert('Processing...');
        }
    If you want to pass extra variables to the callback function, you can do it like this.
        var param1 = 'something';
        var param2 = 'something else';

        query.send(function(response) {
            handleQueryResponse(response, param1, param2);
        });

        function handleQueryResponse(response, param1, param2) {
            alert('Processing...');
            alert(param1);
            alert(param2);
        }

    Read the article

  • PASS 13 Dispatches: Memory Optimized = On

    - by Tony Davis
    I'm at the PASS Summit in Charlotte for the Day 1 keynote by Quentin Clarke, Corporate VP of the data platform group at Microsoft. He's talking about how SQL Server 2014 is "pushing boundaries", and first up is SQL Server 2014's In-Memory OLTP technology (formerly codenamed "Hekaton"). It is a feature that provokes a lot of interest, and for good reason: without any need for application rewrites or hardware updates, it can enable an application to find in memory most or all of the data it needs, and can lead to huge improvements in processing times. A good recent article on Hekaton use cases talks about applications that need a "shock absorber" when spikes, or just a high rate of incoming workload (including data in ETL scenarios), become the primary bottleneck. To get a really deep look at this technology, I would check out David DeWitt's summit keynote tomorrow (it will be live streamed). Other than that, to get started I'd recommend Kalen Delaney's whitepaper. She offers a lot of insight into how it works and how to start to define memory-optimized tables and natively compiled stored procedures. These memory-optimized tables use completely optimistic multi-version concurrency control: no waiting on locks! After that, Tom LaRock has compiled a useful set of links to drill deeper, and includes one to Microsoft's AMR tool to help you gauge the tables that might benefit most. Tony.

    Read the article

  • BizTalk Pipeline Component Error: "Object reference not set to an instance of an object"

    - by Stuart Brierley
    Yesterday I posted about my BizTalk Archiving Pipeline Component, which can be found on Codeplex if anyone is interested in taking a look. During testing of this component I began to encounter an error whereby the component would throw an "Object reference not set to an instance of an object" error when processing as part of a custom pipeline. This was occurring when the component was reading a ReadOnlySeekableStream so that the data could be archived to file, but the actual code throwing the error was somewhere in the depths of the Microsoft.BizTalk.Streaming stack. It turns out that there is a known issue where this exception can be thrown because the garbage collector has disposed of the stream before execution of the custom pipeline has completed. To get around this you need to add the streams in your code to the pipeline context resource tracker. So a block of my code goes from:
        originalStrm = bodyPart.GetOriginalDataStream();
        if (!originalStrm.CanSeek)
        {
            ReadOnlySeekableStream seekableStream = new ReadOnlySeekableStream(originalStrm);
            inmsg.BodyPart.Data = seekableStream;
            originalStrm = inmsg.BodyPart.Data;
        }

        fileArchive = new FileStream(FullPath, FileMode.Create, FileAccess.Write);
        binWriter = new BinaryWriter(fileArchive);
        byte[] buffer = new byte[bufferSize];
        int sizeRead = 0;
        while ((sizeRead = originalStrm.Read(buffer, 0, bufferSize)) != 0)
        {
            binWriter.Write(buffer, 0, sizeRead);
        }
    to:
        originalStrm = bodyPart.GetOriginalDataStream();
        if (!originalStrm.CanSeek)
        {
            ReadOnlySeekableStream seekableStream = new ReadOnlySeekableStream(originalStrm);
            inmsg.BodyPart.Data = seekableStream;
            originalStrm = inmsg.BodyPart.Data;
        }

        pc.ResourceTracker.AddResource(originalStrm);

        fileArchive = new FileStream(FullPath, FileMode.Create, FileAccess.Write);
        binWriter = new BinaryWriter(fileArchive);
        byte[] buffer = new byte[bufferSize];
        int sizeRead = 0;
        while ((sizeRead = originalStrm.Read(buffer, 0, bufferSize)) != 0)
        {
            binWriter.Write(buffer, 0, sizeRead);
        }
    So far this seems to have solved the issue: the error is no more, and my archive component is continuing its way through testing.

    Read the article

  • 24 Hours of PASS scheduling

    - by Rob Farley
    I have a new appreciation for Tom LaRock (@sqlrockstar), who is doing a tremendous job leading the organising committee for the 24 Hours of PASS event (Twitter: #24hop). We've just been going through the list of speakers and their preferences for time slots, and hopefully we've kept everyone fairly happy. All the submitted sessions (59 of them) were put up for a vote, and over a thousand of you picked your favourites. The top 28 sessions as voted were all included (24 sessions plus 4 reserves), and duplicates (when a single presenter had two sessions in the top 28) were swapped out for others. For example, both sessions submitted by Cindy Gross were in the top 28. These swaps were chosen by the committee to get a good balance of topics. Amazingly, some big names missed out, and even the top ten included some surprises. T-SQL, Indexes and Reporting featured well in the top ten, and in the end the mix between BI, Dev and DBA ended up quite nicely too. The ten most voted-for sessions were (in order):
        1. Jennifer McCown - T-SQL Code Sins: The Worst Things We Do to Code and Why
        2. Michelle Ufford - Index Internals for Mere Mortals
        3. Audrey Hammonds - T-SQL Awesomeness: 3 Ways to Write Cool SQL
        4. Cindy Gross - SQL Server Performance Tools
        5. Jes Borland - Reporting Services 201: the Next Level
        6. Isabel de la Barra - SQL Server Performance
        7. Karen Lopez - Five Physical Database Design Blunders and How to Avoid Them
        8. Julie Smith - Cool Tricks to Pull From Your SSIS Hat
        9. Kim Tessereau - Indexes and Execution Plans
        10. Jen Stirrup - Dashboards Design and Practice using SSRS
    I think you'll all agree this is shaping up to be an excellent event.

    Read the article
