Search Results

Search found 13925 results on 557 pages for 'ms office 2007'.

  • Visual Studio 2008 Setup Project - Include .NET Framework 3.5

    - by rajadiga
    I built a Visual Studio 2008 setup project which depends on .NET 3.5. I added prerequisites such as .NET 3.5, Microsoft Office interoperability, VS Tools for Office System 3.0 Runtime, etc. After that I selected "Download prerequisites from the same location as my application" under "Specify the install location for prerequisites". I built the setup and found mysetup.msi in the Release directory. On a new machine I started a fresh installation of my application. A dialog appears saying "This setup requires .NET Framework 3.5. Please install the .NET Framework, then run this setup. The .NET Framework can be obtained from the web. Do you want to do that now?" with "Yes" and "No" options; if I press Yes it goes to the Microsoft website. How can I avoid that? I want the setup to install the .NET Framework from the same location where I put all the setup files, including mysetup.msi. In the case of a quiet installation,

        cmd /c "msiexec /package mysetup.msi /quiet /log install.log"

    the log only gets about halfway through the installation before this error:

        Property(S): HideFatalErrorForm = TRUE
        MSI (s) (D0:24) [00:07:08:015]: Product: my product -- Installation failed.
        === Logging stopped: 3/23/2010 0:07:08 ===

    How can I complete the installation without user intervention and without errors using a VS2008 setup project? Thanks in advance for any input.

    Read the article

  • Side by side madness - running binaries on different computer (with a twist)

    - by sbk
    Here's my configuration:

        Computer A - Windows 7, MS Visual Studio 2005 patched for Win7 compatibility (8.0.50727.867)
        Computer B - Windows XP SP2, MS Visual Studio 2005 installed (8.0.50727.42)

    My project has some external dependencies (prebuilt DLLs, either built on A or downloaded from the Internet), a couple of DLLs built from sources, and one executable. I mostly develop on A and all is fine there. At some point I tried to build my project on computer B, copying the prebuilt DLLs to the output folder. Everything builds fine, but trying to start my application I get "The application failed to initialize properly (0xc0150002)...". The event log contains two entries of "Dependent Assembly Microsoft.VC80.CRT could not be found" and "Last Error was The referenced assembly is not installed on your system", plus the slightly more amusing "Generate Activation Context failed for some.dll. Reference error message: The operation completed successfully." At this point I tried my Google-fu, but in vain: virtually all hits are about running binaries on machines without Visual Studio installed. In my case, however, the executables fail to run on the very computer they were built on. The next step was to try Dependency Walker, and it baffled me even more: my DLLs, built from sources on the same box, cannot find MSVCR80.DLL and MSVCP80.DLL, yet the executable seems to be all right with respect to those two DLLs, i.e. when I open the executable with Dependency Walker it shows that the MSVC?80.DLLs can be found, but when I open one of my DLLs it says they cannot. That's where I am completely out of ideas, so I'm asking you, dear stackoverflow :) I admit I'm a bit blurry on the whole side-by-side thing, so general reading on the topic will also be appreciated.
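
    For reference, this is roughly what the embedded manifest dependency on the VC80 CRT looks like; the version below is an assumption based on VS2005 SP1, so check the manifest actually embedded in your binaries (e.g. with mt.exe) against the assemblies present under C:\Windows\WinSxS:

        <dependency>
          <dependentAssembly>
            <!-- version is an assumption; compare with your binary's manifest -->
            <assemblyIdentity type="win32" name="Microsoft.VC80.CRT"
                              version="8.0.50727.762" processorArchitecture="x86"
                              publicKeyToken="1fc8b3b9a1e18e3b" />
          </dependentAssembly>
        </dependency>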

    Read the article

  • PDO Database Connections Problem

    - by Metropolis
    Hey Everyone, Over a year ago I created my own database classes which use PDO and handle all preparing, executing, and closing of connections. These classes have been working great up until now. There are two different database servers I am grabbing from: MySQL and MS SQL Express. I am retrieving an employee id from the MySQL server and using it to get that employee's information from the MS SQL server. There are about 11k records coming from the MySQL server, and my program is only making it through about 1200 before crashing with an error like the following:

        Connection failed (odbc:Driver=FreeTDS;Servername=MSSQLExpress;Database=SMDINC)
        Class (PDOException) SQLSTATE[08001] SQLDriverConnect: 0
        [unixODBC][FreeTDS][SQL Server]Unable to connect to data source

    It seems like the program is not able to connect to the data source, but it runs the exact same query about 30 times before this with no problem. Also, I have thoroughly checked all of the data coming into the query and it all looks fine. I believe the issue may be that too many connections are being created; I have tried to close all connections in many different places, but nothing seems to fix the problem. Any debugging help or suggestions would be appreciated! Craig Metrolis
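
    If the wrapper really does open a new PDO connection for every query, caching one handle per DSN would rule that out. A minimal sketch of the idea (the class, credentials, and query are illustrative, not taken from the code above):

        class Db {
            private static $connections = array();

            // Reuse one PDO handle per DSN instead of reconnecting per query
            public static function get($dsn, $user, $pass) {
                if (!isset(self::$connections[$dsn])) {
                    self::$connections[$dsn] = new PDO($dsn, $user, $pass);
                }
                return self::$connections[$dsn];
            }
        }

        $mssql = Db::get('odbc:Driver=FreeTDS;Servername=MSSQLExpress;Database=SMDINC', 'user', 'pass');
        $stmt  = $mssql->prepare('SELECT * FROM employees WHERE employee_id = ?');
        $stmt->execute(array($employeeId));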

    Read the article

  • Registry remotely hacked on Win 7 - need help tracking the perp

    - by user577229
    I was writing some .VBS code at the office that would allow certain file extensions to be downloaded without a warning dialog on a w7x32 system. The system I was writing this on is in a lab on a segmented subnet. All web access is via a proxy server. The only means of accessing my machine is via the internet or from within the lab's MSFT AD domain. While writing and testing my code I found a message of sorts. Upon refreshing the registry to verify my code had changed a dword, instead the message HELLO was written and visible in regedit where the dword value was called for. I took a screen shot and proceeded to edit my code. This same weird behavior occurred the last time I was writing registry code, except on another internal server. I understand that remote registry access exists for Windows systems. I will block this immediately once I return to the office. What I want to know is: can I trace who made this connection? How would I do this? I suspect the cause of this is the cause of other "odd" behaviors I'm experiencing at work, such as losing control of my Input Director master control for over an hour, and unchanged code that all of a sudden fails for no logical reason. These failures occur at funny times, whenever I'm about to give a demonstration of my test code. I know this sounds crazy; however, knowledge of the registry component makes it believable. Once the registry can be accessed, the entire system is compromised. Any help or sanity checking is appreciated.

    Read the article

  • Help with MySQL Query using CASE statement

    - by hairdresser-101
    I am trying to group a number of customers together based on their "Head Office" or "Parent" location. This works OK except for a flaw which I didn't foresee when I was developing my system... For customers that did not have a "Parent" (standalone businesses) I defaulted the parent_id to 0. Therefore, my data looks like this:

        id  parent_id  customer
        1   0          CustName#1
        2   4          CustName#2 - Melbourne
        3   4          CustName#2 - Sydney
        4   0          CustName#2 (Head Office)

    What I want to do is group my results together so that I have one row for CustName#1 and one row for CustName#2, BUT my problem is that there is no parent record for parent_id = 0, and these rows are being excluded when using an inner join. I've tried using a CASE statement, but that is not working either (parents are still being ignored). Any help would be greatly appreciated. Here is my query (my CASE is basically trying to get the business_name from the customer table based on the parent_id, EXCEPT when the parent_id = 0; THEN just use the customer_name that is listed in the job_summary table):

        SELECT js.month_of_year,
               (CASE js.parent_id WHEN 0 THEN js.customer_name ELSE c.business_name END) AS customer,
               SUM(js.jobs), SUM(js.total_cost), SUM(js.total_sell)
        FROM JOB_SUMMARY js
        INNER JOIN customer c ON js.parent_id = c.id
        GROUP BY js.month_of_year,
                 (CASE c.parent_id WHEN 0 THEN js.customer_name ELSE c.business_name END)
        ORDER BY `customer` ASC
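
    A LEFT JOIN keeps the parent_id = 0 rows that the INNER JOIN drops. A sketch of that variant (untested against this schema; it also assumes the CASE should test the same js.parent_id column in both places, which looks like a typo in the original):

        SELECT js.month_of_year,
               (CASE js.parent_id WHEN 0 THEN js.customer_name ELSE c.business_name END) AS customer,
               SUM(js.jobs), SUM(js.total_cost), SUM(js.total_sell)
        FROM JOB_SUMMARY js
        LEFT JOIN customer c ON js.parent_id = c.id
        GROUP BY js.month_of_year,
                 (CASE js.parent_id WHEN 0 THEN js.customer_name ELSE c.business_name END)
        ORDER BY customer ASC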

    Read the article

  • Need advice on comparing the performance of 2 equivalent LINQ to SQL queries

    - by uvita
    I am working on a tool to optimize LINQ to SQL queries. Basically it intercepts the LINQ execution pipeline and makes some optimizations, for example removing a redundant join from a query. Of course, there is an overhead in the execution time before the query gets executed in the DBMS, but the query should then be processed faster. I don't want to use a SQL profiler because I know that the generated query will perform better in the DBMS than the original one; I am looking for a correct way of measuring the overall time between the creation of the query in LINQ and the end of its execution. Currently, I am using the Stopwatch class and my code looks something like this:

        var sw = new Stopwatch();
        sw.Start();
        const int amount = 100;
        for (var i = 0; i < amount; i++)
        {
            ExecuteNonOptimizedQuery();
        }
        sw.Stop();
        Console.WriteLine("Executing the query {2} times took: {0}ms. On average, each query took: {1}ms",
            sw.ElapsedMilliseconds, sw.ElapsedMilliseconds / amount, amount);

    Basically the ExecuteNonOptimizedQuery() method creates a new DataContext, creates a query and then iterates over the results. I did this for both versions of the query, the normal one and the optimized one. I took the idea from this post from Frans Bouma. Is there any other approach, or considerations I should take? Thanks in advance!
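
    One consideration worth adding: exclude one-time costs (JIT compilation, ADO.NET connection-pool warm-up, plan caching) from the measurement by running an untimed warm-up pass first. A minimal sketch of that adjustment:

        // Warm up: JIT-compile the code path and fill the connection pool
        ExecuteNonOptimizedQuery();

        var sw = Stopwatch.StartNew();
        const int amount = 100;
        for (var i = 0; i < amount; i++)
        {
            ExecuteNonOptimizedQuery();
        }
        sw.Stop();
        Console.WriteLine("Average: {0}ms", (double)sw.ElapsedMilliseconds / amount);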

    Read the article

  • Excel Automation - creating a new worksheet

    - by Andrew Shepherd
    I am attempting what seems like a simple task: using C# to create a new Excel document containing new worksheets. For some reason, I am getting a strange COM error (0x800A03EC). Has anyone managed to get this to work? Does anyone have suggestions as to how to troubleshoot this? I've isolated it to the minimum amount of code:

        using Microsoft.Office.Interop.Excel;
        using System.Diagnostics;

        namespace ExcelAutomation
        {
            public static class ExcelTests
            {
                public static void CreateWorksheet()
                {
                    try
                    {
                        var app = new Microsoft.Office.Interop.Excel.Application();
                        app.Visible = true;
                        var workBooks = app.Workbooks;
                        var newWorkbook = app.Workbooks.Add(XlWBATemplate.xlWBATWorksheet);
                        Worksheet existingWorksheet = (Worksheet)newWorkbook.Sheets[1];
                        Worksheet workSheet = (Worksheet)newWorkbook.Sheets.Add
                            (
                            null, // before
                            existingWorksheet,
                            null, // 1,
                            null  //XlSheetType.xlWorksheet
                            );
                    }
                    catch (System.Runtime.InteropServices.COMException ex)
                    {
                        Trace.WriteLine(string.Format("Caught COMException. Message: \"{0}\"", ex.Message));
                    }
                }
            }
        }

    The output window now says:

        Caught COMException. Message: "Exception from HRESULT: 0x800A03EC"
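
    Excel's COM interface is often unhappy with a managed null for omitted optional parameters. A variant worth trying (a sketch, not verified against this exact setup) passes Type.Missing instead:

        // Type.Missing tells the COM marshaller "parameter omitted",
        // which Excel accepts; a raw null can surface as 0x800A03EC
        Worksheet workSheet = (Worksheet)newWorkbook.Sheets.Add(
            Type.Missing,        // before
            existingWorksheet,   // after
            Type.Missing,        // count
            Type.Missing);       // type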

    Read the article

  • Change the content type of a pop-up window

    - by Oscar Reyes
    This question brought up a new one: I have an HTML page and I need it to change the content type when the user presses the "save" button, so the browser prompts to save the file to disk. I've been doing this on the server side to offer "Excel" versions of the page (which is basically an HTML table):

        <c:if test="${page.asExcelAction}">
            <% response.setContentType("application/vnd.ms-excel"); %>

    What I'm trying to do now is the same thing, but on the client side with javascript, and I can't manage to do it. This is what I've got so far:

        <html>
        <head>
            <script>
                function saveAs(){
                    var sMarkup = document.getElementById('content').innerHTML;
                    //var oNewDoc = document.open('application/vnd.ms-excel');
                    var oNewDoc = document.open('text/html');
                    oNewDoc.write( sMarkup );
                    oNewDoc.close();
                }
            </script>
        </head>
        <body>
            <div id='content'>
                <table>
                    <tr>
                        <td>Stack</td>
                        <td>Overflow</td>
                    </tr>
                </table>
            </div>
            <input type="button" value="Save as" onClick="saveAs()"/>
        </body>
        </html>
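
    One client-side workaround is to navigate to a data: URI that carries the desired MIME type. Treat this as a sketch: support varies by browser, and Internet Explorer of that era does not allow data: URIs for navigation, so it only helps the non-IE audience:

        function saveAs() {
            var sMarkup = document.getElementById('content').innerHTML;
            // The MIME type in the data: URI is the hint that makes the
            // browser offer/download the content instead of rendering it
            window.location.href = 'data:application/vnd.ms-excel,' + encodeURIComponent(sMarkup);
        }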

    Read the article

  • MS SQL Server high resource waits and head blocker

    - by MartinHN
    Hi, I have an MS SQL Server 2008 Standard installation running a database for a webshop. The current size of the database is 2.5 GB, running on Windows 2008 Standard, dual Intel Xeon X5355 @ 2.00 GHz, 4 GB RAM. When I open the Activity Monitor, I see a Wait Time (ms/sec) of 5000 in the "Other" category. In the Processes list, for all connections from the webshop, the Head Blocker value is 1. I see every day that when I try to access the website, it can take 20-30 seconds before it even starts to "work". I know that it is not network latency (I have a 301 redirect from the same server that is executed instantly). When the first request has been served, it seems as if it's no longer asleep, and every subsequent request is served instantly, with the speed of light. The problem was worse two weeks ago, until I changed every query to include WITH (NOLOCK), but I still experience the problem, and the wait times in the Activity Monitor are about the same. The largest table (Images) has 32764 rows (448576 KB). Some tables exceed 300000 rows, though they're much smaller in size than the Images table. The only indexes I have are the default clustered indexes on the primary key columns. Any ideas?
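
    To see what those waits actually are, rather than Activity Monitor's "Other" bucket, the wait-stats DMV is a reasonable first step (a generic SQL Server 2008 diagnostic, not specific to this setup):

        -- Top wait types accumulated since the last service restart
        SELECT TOP 10 wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
        FROM sys.dm_os_wait_stats
        ORDER BY wait_time_ms DESC;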

    Read the article

  • IMAP protocol support in different email servers

    - by raticulin
    Having to interact with several different email servers via IMAP (using JavaMail), I have found that there is a very different level of support for IMAP features among them. The lack of support for some features has resulted in more development time, more complicated code to deal with the differing support, worse performance due to not being able to SEARCH, etc. So I would like to get some info on other servers and what level of support they provide. So far I have dealt with Lotus Domino and Novell GroupWise (and to a lesser extent Exchange 2003 and 2007). I am particularly interested in the most used ones on unix/linux (Courier, Cyrus, Dovecot, UW IMAP) and also Zimbra, but feel free to add any you know. Info about online services like Gmail is also welcome. Features that I consider (comment if you are interested in others and I'll add them):

        Custom flags
        Searching custom flags
        Searching arbitrary headers
        Partial fetching
        Proxy authentication

    And what I have found so far (correct me if I am wrong anywhere):

        Lotus Domino
            Custom flags: yes
            Searching custom flags: yes
            Searching arbitrary headers: yes
            Partial fetching: ?
            Proxy authentication: sort of - you can give a user permission to access other users' mailboxes, and he will see them under his '\Other Users' folder

        Novell GroupWise
            Custom flags: no
            Searching custom flags: no
            Searching arbitrary headers: no
            Partial fetching: ?
            Proxy authentication: yes, via what is called a Trusted Application
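
    For context, "searching arbitrary headers" here means a server-side search like the following JavaMail call, which only works when the server implements the corresponding SEARCH keys (this is a fragment: it assumes an already-connected Store in scope, and the header value is made up):

        import javax.mail.Folder;
        import javax.mail.Message;
        import javax.mail.search.HeaderTerm;

        // Ask the server, not the client, to match on an arbitrary header
        Folder inbox = store.getFolder("INBOX");
        inbox.open(Folder.READ_ONLY);
        Message[] hits = inbox.search(new HeaderTerm("X-Mailer", "Outlook"));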

    Read the article

  • Sending an email with browser capabilities and screen size etc.

    - by talkingnews
    A lot of my visitors are blind (it being a site for the blind), and often when trying to diagnose problems, I'd like to know what version of browser they're using, whether Flash is installed, and so on, because more often than not someone will swear they are using X when in fact Y is installed. Currently, I'm using http://jsbrwsniff.sourceforge.net/usage.html piped into an email, but I've got two problems here. First of all, jsbrwsniff is quite "heavy" and hasn't been updated since early 2007, so there are a lot of -1s in the result. Secondly, if I call it as follows, the page reloads:

        <a href="#" onclick="sendEmail()">Email feedback</a>

    And if I call it like this, the page goes blank and looks like it's trying to infinitely load a blank page:

        <a href="javascript:sendEmail()">Email feedback</a>

    See the nightmare for yourself here: http://kingston.talking-newspapers.co.uk/ Now, I know there are 1001 articles and comments here and elsewhere saying "don't use browser sniffers, they can be spoofed (etc)", but honestly, you'll have to trust me that this is a significantly useful tool when you're talking someone in their more "senior years" through "Help, About" with a screen reader, when they've clicked the wrong window to start with! I'm using jQuery anyway on the site, and I'm aware of jQuery.browser and jQuery.support, but these don't tell me the elements I need (like whether Flash is installed, and what version). I've looked everywhere for a jQuery plugin for my needs, with no luck. Finally, if I have to stick to the current method of jsbrwsniff then it's not the end of the world, but if anyone knows a way of launching the user's email client populated with the information I need, WITHOUT refreshing or blanking the page, I'd love to know. BTW, there's a good reason for not using a web form: it's simply easier for the screen-reader user to use an email client they are used to. Thanks!
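
    On the reload problem specifically: returning false from an inline onclick stops the browser from following the href, and a mailto: navigation opens the mail client without unloading the page. A minimal sketch (the address and the sniffed detail in the body are placeholders):

        <a href="#" onclick="sendEmail(); return false;">Email feedback</a>

        <script>
        function sendEmail() {
            // mailto: launches the mail client; the page itself stays put
            window.location.href = 'mailto:feedback@example.org?subject=Feedback'
                + '&body=' + encodeURIComponent(navigator.userAgent);
        }
        </script>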

    Read the article

  • Convert ASP.NET WebForms logic to ASP.NET MVC

    - by gmcalab
    I had this code in an old ASP.NET WebForms app to take a MemoryStream and write it out as the response, showing a PDF. I am now working with an ASP.NET MVC application and looking to do the same thing, but how should I go about returning the MemoryStream as a PDF using MVC? Here's my ASP.NET WebForms code:

        private void ShowPDF(MemoryStream ms)
        {
            try
            {
                //get byte array of pdf in memory
                byte[] fileArray = ms.ToArray();

                //send file to the user
                Page.Response.Cache.SetCacheability(HttpCacheability.NoCache);
                Page.Response.Buffer = true;
                Response.Clear();
                Response.ClearContent();
                Response.ClearHeaders();
                Response.Charset = string.Empty;
                Response.ContentType = "application/pdf";
                Response.AddHeader("content-length", fileArray.Length.ToString());
                Response.AddHeader("Content-Disposition", "attachment;filename=TID.pdf;");
                Response.BinaryWrite(fileArray);
                Response.Flush();
                Response.Close();
            }
            catch
            {
                // and boom goes the dynamite...
            }
        }
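
    In MVC the idiomatic equivalent is to return a FileResult from a controller action and let the framework write the headers. A sketch (the action name and the BuildPdf helper are illustrative, not part of the original code):

        public ActionResult ShowPdf()
        {
            MemoryStream ms = BuildPdf(); // however the PDF gets produced

            // File() sets Content-Type and Content-Disposition for us;
            // passing a file name turns it into an attachment download
            return File(ms.ToArray(), "application/pdf", "TID.pdf");
        }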

    Read the article

  • Which version of Grady Booch's OOA/D book should I buy?

    - by jackj
    Grady Booch's "Object-Oriented Analysis and Design with Applications" is available brand new in both the 2nd edition (1993) and the 3rd edition (2007), and many used copies of both editions are around. Here are my concerns: 1) The 2nd edition uses C++: given that I just finished reading my first two C++ books (Accelerated C++ and C++ Primer), I guess practical tips can only help, so the 2nd edition is probably best (I think the 3rd edition has absolutely no code). On the other hand, the C++ books I read insist on the importance of using standard C++, whereas Booch's 2nd edition was published before the 1998 standard. 2) The 2nd edition is shorter (608 pages vs. 720), so I guess it will be slightly easier to get through. 3) The 3rd edition uses UML 2.0, whereas the 2nd edition is pre-UML. Some reviews say that the notation in the 2nd edition is close enough to UML that it doesn't matter, but I don't know whether I should be worrying about this or not. 4) The 2nd edition is available in good-shape used copies for considerably less than what the 3rd one goes for. Given all the above factors, do you think I should buy the 2nd or the 3rd edition? Recommendations on other books are also welcome, but I would prefer it if whoever answers has read at least one of the versions of Booch's book (preferably both!). I have already bought, but not yet read, the GoF and Riel books. I also know that I should practice a lot with real-life code. Thanks.

    Read the article

  • Ant ftp task "Could not date test remote file"

    - by avok00
    Hi guys! I am using the Ant ftp task to deploy my project files to a remote app server. Ant is not able to detect the date of the remote file and it re-uploads all files every time. When I start Ant in debug mode it says:

        [ftp] checking date for mailer.war
        [ftp] Could not date test remote file: mailer.war assuming out of date.

    The remote server is MS FTP (Windows Vista version). The Ant version is 1.8.2; I use commons-net-2.2 and jakarta-oro-2.0.8 (could not find a newer version). My ant task looks like this:

        <!-- Deploy new and changed files -->
        <target name="deploy" depends="package" description="Deploy new and changed files">
            <ftp server="localhost"
                 userid=""
                 password=""
                 action="send"
                 depends="yes"
                 passive="true"
                 systemTypeKey="WINDOWS"
                 serverTimeZoneConfig="Europe/Sofia"
                 defaultDateFormatConfig="MMM dd yyyy"
                 recentDateFormatConfig="MMM dd HH:mm"
                 binary="true"
                 retriesAllowed="3"
                 verbose="true">
                <fileset dir="${webapp.artefacts.path}"/>
            </ftp>
        </target>

    I read an article here, Ant: The Definitive Guide, that says I need a version of jakarta-oro after 2.0.8 to talk to MS FTP servers, but I was not able to find such a version. The jakarta-oro site (http://jakarta.apache.org/oro/) says the project is retired as of 2010, but their latest distribution is from 2003! Please, can anyone help me with this? Any solution, or any alternatives to the Ant ftp task? Thanks!
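
    One guess worth testing: IIS-style listings are usually DOS-format lines like "04-27-10  12:09PM ...", so the unix-style date format configs above may be what breaks the date test. A variant to try (the format string is an assumption about your server's LIST output; check it with a raw FTP LIST first):

        <ftp server="localhost" userid="" password="" action="send"
             depends="yes" passive="true"
             systemTypeKey="WINDOWS"
             serverTimeZoneConfig="Europe/Sofia"
             defaultDateFormatConfig="MM-dd-yy hh:mma"
             binary="true" retriesAllowed="3" verbose="true">
            <fileset dir="${webapp.artefacts.path}"/>
        </ftp>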

    Read the article

  • Why is qry.Post executed in asynchronous mode?

    - by Ryan
    Recently I met a strange problem; see the code snippet below:

        var
          sqlCommand: string;
          connection: TADOConnection;
          qry: TADOQuery;
        begin
          connection := TADOConnection.Create(nil);
          try
            connection.ConnectionString := 'Provider=Microsoft.Jet.OLEDB.4.0;Data Source=Test.MDB;Persist Security Info=False';
            connection.Open();
            qry := TADOQuery.Create(nil);
            try
              qry.Connection := connection;
              qry.SQL.Text := 'Select * from aaa';
              qry.Open;
              qry.Append;
              qry.FieldByName('TestField1').AsString := 'test';
              qry.Post;
              beep;
            finally
              qry.Free;
            end;
          finally
            connection.Free;
          end;
        end;

    First, create a new Access database named test.mdb under the directory of this test project, and in it create a new table named aaa which has only one text-type field, named TestField1. Set a breakpoint at the "beep" line, then launch the test application under IDE debug mode. When the IDE stops at the breakpoint line (qry.Post has been executed), if at this point we use Microsoft Access to open test.mdb and open table aaa, you will find there are no changes in table aaa. If you press F9 and let the IDE continue running, you will find a new record has been inserted into table aaa; but if you press Ctrl+F2 to terminate the application at the breakpoint, table aaa has no record inserted. In normal circumstances, though, a new record should be inserted into table aaa right after qry.Post executes. Who can explain this problem? It has troubled me for a long time. Thanks!!! BTW, the IDE is Delphi 2010, and the Access mdb file was created by Microsoft Access 2007 under Windows 7.

    Read the article

  • Ultra-Portable Laptop or Tablet PC for Development and Sketching

    - by Nelson LaQuet
    I am a software developer who primarily writes in PHP, [X]HTML, CSS, Javascript, C# and C++. I use Eclipse for web development, Visual Studio 2008 for C++ and C# work, TortoiseSVN, a Subversion server for local repositories, SQL Server Express, Apache and MySQL. I also use Office 2007 for word processing and spreadsheets, and use Vista Ultimate 64 as my primary operating system. The only other things I do on my laptop are watch movies, surf the internet and listen to music. I currently have an Acer Aspire 5100 (1.4 GHz AMD Turion X2, 2 GB of RAM and a 15.4" screen). This thing does not cut it in performance or portability, and in addition, my DVD drive has failed. And before anybody posts about Vista: I had XP Professional 32 on it for the last two years and recently upgraded to Vista 64. It is actually faster (with Aero disabled) than XP, so it is not the OS that is making the laptop slow. I sketch a lot, for explaining things, developing user interfaces and software architecture. Because of my requirements, I was thinking about a Lenovo X61 Tablet PC. It outperforms my current laptop, is significantly more portable, and... is a tablet. My question is: do any other software developers use this (or other tablets) for programming? Does it help to be able to sketch on the computer itself? And is it capable of being a good development machine? Will it handle the software listed above? If not, what is the best ultra-portable laptop that is good for programming? Or are ultra-portable laptops even good for programming? I could manage with my 15.4" screen, but I am spoiled by the two 19" displays on my home desktop and my job's workstation.

    Read the article

  • Ant fails without message at javac

    - by digitala
    I've written an Ant build.xml file which obtains a number of source files via WSDL and compiles them. This worked on an old, now destroyed (and therefore unavailable for comparison) system, but the build process isn't completing on this newer, faster system. The relevant section of the build file looks like this:

        <target name="compile" depends="init">
            <java classname="org.apache.axis.wsdl.WSDL2Java">
                <arg line="--all --server-side --skeletonDeploy --factory --wrapArrays --output src ${srcurl}" />
            </java>
            <javac srcdir="${src}" destdir="${build}" verbose="yes" />
        </target>

    The files are downloaded via the WSDL service successfully; however, after that point Ant simply stops and returns to the command line. Versions of the relevant apps:

        # java -version
        java version "1.6.0_14"
        Java(TM) SE Runtime Environment (build 1.6.0_14-b08)
        Java HotSpot(TM) 64-Bit Server VM (build 14.0-b16, mixed mode)
        # javac -version
        javac 1.6.0_14
        # ant -version
        Apache Ant version 1.6.5 compiled on January 6 2007

    I'm assuming that there's a problem with javac that Ant isn't passing back. Is there any way I can get some debugging information from javac? I've tried adding a <record /> tag to the target, but that doesn't give any more information than running ant -v does. Any other suggestions would be great, also!
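
    One low-effort experiment is to run the compiler in its own process with explicit limits; fork, memoryMaximumSize and failonerror are standard <javac> attributes, though whether they surface anything here is a guess:

        <!-- fork="yes" runs javac in a separate JVM, so a crash or
             OutOfMemoryError in the compiler can't silently take Ant
             down with it; failonerror makes any failure loud -->
        <javac srcdir="${src}" destdir="${build}" verbose="yes"
               fork="yes" memoryMaximumSize="512m" failonerror="true" />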

    Read the article

  • High runtime for Dictionary.Add for a large number of items

    - by aaginor
    Hi folks, I have a C# application that stores data from a text file in a Dictionary object. The amount of data to be stored can be rather large, so it takes a lot of time to insert the entries. With many items in the Dictionary it gets even worse, because of the resizing of the internal array that stores the data for the Dictionary. So I initialized the Dictionary with the number of items that will be added, but this had no impact on speed. Here is my function:

        private Dictionary<IdPair, Edge> AddEdgesToExistingNodes(HashSet<NodeConnection> connections)
        {
            Dictionary<IdPair, Edge> resultSet = new Dictionary<IdPair, Edge>(connections.Count);
            foreach (NodeConnection con in connections)
            {
                ...
                resultSet.Add(nodeIdPair, newEdge);
            }
            return resultSet;
        }

    In my tests, I insert ~300k items. I checked the running time with ANTS Performance Profiler and found that the average time for resultSet.Add(...) doesn't change when I initialize the Dictionary with the needed size; it is the same as when I initialize the Dictionary with new Dictionary() (about 0.256 ms on average for each Add). This is definitely caused by the amount of data in the Dictionary (ALTHOUGH I initialized it with the desired size). For the first 20k items, the average time for Add is 0.03 ms per item. Any idea how to make the Add operation faster? Thanks in advance, Frank
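
    Add on a pre-sized Dictionary is normally O(1) regardless of count, so degradation like this usually points at the key type: if IdPair doesn't override GetHashCode/Equals (or hashes poorly), every Add walks an ever-longer collision chain. A sketch of an evenly-distributing key (the field names are guesses):

        public struct IdPair : IEquatable<IdPair>
        {
            public readonly int FromId;
            public readonly int ToId;

            public IdPair(int fromId, int toId)
            {
                FromId = fromId;
                ToId = toId;
            }

            public bool Equals(IdPair other)
            {
                return FromId == other.FromId && ToId == other.ToId;
            }

            public override bool Equals(object obj)
            {
                return obj is IdPair && Equals((IdPair)obj);
            }

            // Mix both fields; a hash that collides heavily (in the worst
            // case a constant) turns each Add into a linear bucket scan
            public override int GetHashCode()
            {
                return (FromId * 397) ^ ToId;
            }
        }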

    Read the article

  • Microsoft Training in .NET

    - by JD
    Hey guys, A quick question about training. My company is offering to put me through a 5-day course in .NET programming with C#, on the proviso that I work for them for a minimum 12-month period immediately after the training has been completed. If I don't work for them for 12 months, I will be expected to pay back the cost of the training, minus an amount for the months I have stayed at the company (e.g. stay 6 months and only pay back half). The course will be $3,300 AUD and carries no qualifications. I have been programming in C# for about 6 months, and in PHP for about 10 years, so I am getting the hang of .NET slowly but surely. Is it worth doing this? Or would I be best advised to learn from reading the MS books from Amazon for $40 and then sitting the 2 exams to get MS certified off my own back? The company is not willing to pay for my certification as well, as 'they are not training me to get certified' and hence making me more employable... Thoughts? How do other companies deal with training? And how do they stop people from taking off after becoming certified?

    Read the article

  • MySQL: return the value as 0 instead of NULL in the fetch result

    - by Karthik
    I have these two tables:

        --
        -- Table structure for table `t1`
        --
        CREATE TABLE `t1` (
          `pid` varchar(20) collate latin1_general_ci NOT NULL,
          `pname` varchar(20) collate latin1_general_ci NOT NULL
        ) ENGINE=MyISAM DEFAULT CHARSET=latin1 COLLATE=latin1_general_ci;

        --
        -- Dumping data for table `t1`
        --
        INSERT INTO `t1` VALUES ('p1', 'pro1');
        INSERT INTO `t1` VALUES ('p2', 'pro2');

        -- --------------------------------------------------------

        --
        -- Table structure for table `t2`
        --
        CREATE TABLE `t2` (
          `pid` varchar(20) collate latin1_general_ci NOT NULL,
          `year` int(6) NOT NULL,
          `price` int(3) NOT NULL
        ) ENGINE=MyISAM DEFAULT CHARSET=latin1 COLLATE=latin1_general_ci;

        --
        -- Dumping data for table `t2`
        --
        INSERT INTO `t2` VALUES ('p1', 2009, 50);
        INSERT INTO `t2` VALUES ('p1', 2010, 60);
        INSERT INTO `t2` VALUES ('p3', 2007, 200);
        INSERT INTO `t2` VALUES ('p4', 2008, 501);

    My query is:

        SELECT * FROM `t1` LEFT JOIN `t2` ON t1.pid = t2.pid

    and I get this result:

        pid  pname  pid   year  price
        p1   pro1   p1    2009  50
        p1   pro1   p1    2010  60
        p2   pro2   NULL  NULL  NULL

    My question is: I want to get the price value as 0 instead of NULL. How can I write the query so that the price value is 0? Thanks in advance for the help.
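
    For reference, MySQL's IFNULL (or the standard COALESCE) substitutes a default when the outer join produces NULL. A sketch against the tables above:

        SELECT t1.pid, t1.pname, t2.year,
               IFNULL(t2.price, 0) AS price  -- COALESCE(t2.price, 0) also works
        FROM `t1`
        LEFT JOIN `t2` ON t1.pid = t2.pid;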

    Read the article

  • Excel 2010 64-bit can't create .NET object

    - by aboes81
    I have a simple class library that I use in Excel. Here is a simplification of my class:

        using System;
        using System.Runtime.InteropServices;

        namespace SimpleLibrary
        {
            [ComVisible(true)]
            public interface ISixGenerator
            {
                int Six();
            }

            public class SixGenerator : ISixGenerator
            {
                public int Six()
                {
                    return 6;
                }
            }
        }

    In Excel 2007 I would create a macro-enabled workbook and add a module with the following code:

        Public Function GetSix()
            Dim lib As SimpleLibrary.SixGenerator
            Set lib = New SimpleLibrary.SixGenerator
            GetSix = lib.Six
        End Function

    Then in Excel I could call the function GetSix() and it would return six. This no longer works in Excel 2010 64-bit; I get "Run-time error '429': ActiveX component can't create object". I tried changing the platform target to x64 instead of Any CPU, but then my code wouldn't compile unless I unchecked the "Register for COM interop" option, and doing so means my macro-enabled workbook can no longer see SimpleLibrary.dll, as it is no longer registered. Any ideas how I can use my library with Excel 2010 64-bit?
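
    One thing worth checking: Visual Studio's "Register for COM interop" option uses the 32-bit regasm, and 64-bit Excel only sees 64-bit registrations. Registering the assembly manually with the 64-bit regasm is worth a try; a sketch (the framework folder depends on your target .NET version, and /codebase assumes the dll is not in the GAC):

        C:\Windows\Microsoft.NET\Framework64\v2.0.50727\regasm.exe SimpleLibrary.dll /codebase /tlb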

    Read the article

  • Bespoke development, or leverage SharePoint with Web Parts etc.?

    - by Asim
    Hi all, We are currently in the process of drawing up a solution for an existing client, creating a number of eServices. The client currently has MOSS 2007. The proposed solution is to use MOSS as the launching pad for the eServices. The requirement involves drawing up several online forms which provide registration facilities as well as facilitating a workflow of some sort. I have been told that the proposed solution requires complex web forms; most are complex forms with parent-child details that have multiple windows. The proposal is to do some bespoke development, building ASP.NET forms. These forms would be deployed under the _layouts folder of the current MOSS portal, inheriting the master page design of the current site. I have been told that this approach makes development and deployment simpler, as well as having 'complete integration' with MOSS. My questions are: Is this the best way to leverage SharePoint? It seems like the proposed solution is not leveraging MOSS at all..! I thought perhaps utilizing Web Parts would be better, but I have been told that this is more complex and that developing smarter, more intuitive UIs is more difficult. Is this really the case? If not, what should be the recommended approach? We will be utilizing Ultimus as the workflow engine, though I have been recommended K2 workflows. Anyone used both, or have any opinions on either? Many thanks in advance! Kind Regards,
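
    For scale, a minimal web part is just a server-control subclass; MOSS 2007 can host plain ASP.NET 2.0 web parts like the sketch below (the names are illustrative, and a real registration form would obviously carry far more logic):

        using System.Web.UI.WebControls;
        using System.Web.UI.WebControls.WebParts;

        // Deployed to the farm, this can be dropped onto any web part page
        public class RegistrationFormPart : WebPart
        {
            protected override void CreateChildControls()
            {
                TextBox name = new TextBox();
                Button submit = new Button { Text = "Register" };
                Controls.Add(name);
                Controls.Add(submit);
            }
        }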

    Read the article

  • Internet Explorer 8 opens files in the browser instead of the client

    - by Rogier
    Our company works with a great Business Intelligence tool, CorVu 4.2, to analyse operational and strategic data. For several years we have been successfully working with SharePoint 2007 to collaborate and share information with colleagues. Most of my colleagues are working with Internet Explorer 7, but step by step Internet Explorer 8 is being rolled out in the company. We share a lot of CorVu files through SharePoint, but since we started using Internet Explorer 8 we have a problem that is new to us. If we click on a CorVu file in Internet Explorer 8 (not necessarily in SharePoint), a pop-up asks how to open the file. If we save the file, there is no problem; but if we open the file, it is shown in the browser and not in the CorVu client! See the screenshot below: link (I removed some unnecessary information) So far my colleagues accept this 'feature' in Internet Explorer 8. But as we open and close more CorVu files, multiple errors (more than 10) show up, starting with: (unable to place more hyperlinks) Pressing Enter makes the errors disappear, but it's not professional! I contacted the creators of CorVu, but they don't have a solution for it in their client. Is there perhaps a solution in Internet Explorer 8? The extensions of CorVu files can be .sqy, .tab or .qrp. Is it possible to force these files to open in the standard client instead of the browser?

    Read the article

  • C# Attribute XmlIgnore and XamlWriter class - XmlIgnore not working

    - by Horst Walter
    I have a class containing a property Brush MyBrush marked as [XmlIgnore]. Nevertheless, it is serialized into the stream, causing trouble when trying to read it back via XamlReader. I did some tests; for example, when changing the visibility of the property to internal, it is gone from the stream. Unfortunately I cannot do this in my particular scenario. Did anybody have the same issue? Do you see any way to work around this? Remark: C# 4.0, as far as I can tell. This is a method from my unit test where I test the XAML serialization:

        // buffer to a StringBuilder
        StringBuilder sb = new StringBuilder();
        XmlWriter writer = XmlWriter.Create(sb, settings);
        XamlDesignerSerializationManager manager =
            new XamlDesignerSerializationManager(writer) { XamlWriterMode = XamlWriterMode.Expression };
        XamlWriter.Save(testObject, manager);
        xml = sb.ToString();
        Assert.IsTrue(!String.IsNullOrEmpty(xml), "Xaml Serialization failed for " + testObject.GetType() + " no xml string available");
        MemoryStream ms = xml.StringToStream();
        object root = XamlReader.Load(ms);
        Assert.IsTrue(root != null, "After reading from MemoryStream no result for Xaml Serialization");

    In one of my classes I use the property Brush. In the above code, the unit test fails because the value is a Brush object that is not serializable. When I remove the setter (as below), the unit test passes. Using the XmlWriter (basically the same test as above) it works. In the StringBuilder sb I can see that the property Brush is serialized when the setter is there, and not when it is removed (most likely another check ignores the property because it has no setter). Other properties with [XmlIgnore] are ignored as intended.

        [XmlIgnore]
        public Brush MyBrush
        {
            get { ..... }
            // removed because of problem with Serialization
            // set { ... }
        }
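
    Note that [XmlIgnore] is only honored by XmlSerializer; XamlWriter does not look at it. The attribute XamlWriter does respect (worth verifying in this scenario) is DesignerSerializationVisibility, so a property can carry both:

        using System.ComponentModel;
        using System.Windows.Media;
        using System.Xml.Serialization;

        public class MyElement // illustrative class name
        {
            [XmlIgnore] // respected by XmlSerializer only
            [DesignerSerializationVisibility(DesignerSerializationVisibility.Hidden)] // respected by XamlWriter
            public Brush MyBrush { get; set; }
        }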

    Read the article

  • Import-PSSession is not importing cmdlets when used in a custom module

    - by Douglas Plumley
    I have a PowerShell script/function that works great when I use it in my PowerShell profile or manually copy/paste the function into the PowerShell window. I'm trying to make the function accessible to other members of my team as a module. I want to have the module stored in a central place so we can all add it to our PSModulePath. Here is a copy of the basic function:

        Function Connect-O365{
            $o365cred = Get-Credential [email protected]
            $session365 = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell/ -Credential $o365cred -Authentication Basic -AllowRedirection
            Import-PSSession $session365 -AllowClobber
        }

    If I save this function in my PowerShell profile it works fine. I can dot-source a *.ps1 script with this function in it and it works as well. The issue is when I save the function as a *.psm1 PowerShell script module: the function runs fine, but none of the commands exported by Import-PSSession are available. I think this may have something to do with the module scope. I'm looking for suggestions on how to get around this.

    EDIT: When I create the following module and run Connect-O365, the imported cmdlets will not be available:

        $scriptblock = {
            Function Connect-O365{
                $o365cred = Get-Credential [email protected]
                $session365 = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell/" -Credential $o365cred -Authentication Basic -AllowRedirection
                Import-PSSession $session365 -AllowClobber
            }
        }
        New-Module -Name "Office 365" -ScriptBlock $scriptblock

    When I import the next module, without the Connect-O365 function wrapper, the imported cmdlets are available:

        $scriptblock = {
            $o365cred = Get-Credential [email protected]
            $session365 = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell/" -Credential $o365cred -Authentication Basic -AllowRedirection
            Import-PSSession $session365 -AllowClobber
        }
        New-Module -Name "Office 365" -ScriptBlock $scriptblock

    This appears to be a scope issue of some sort; I'm just not sure how to get around it.
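
    A workaround that usually gets around the module-scope problem is to re-import the temporary module returned by Import-PSSession into the global scope. A sketch of the changed function (behavior can vary between PowerShell versions, so treat it as a starting point):

        Function Connect-O365{
            $o365cred = Get-Credential [email protected]
            $session365 = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell/" -Credential $o365cred -Authentication Basic -AllowRedirection
            # Import-PSSession returns a module object; Import-Module -Global
            # promotes its exported commands out of this module's scope
            Import-Module (Import-PSSession $session365 -AllowClobber) -Global
        }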

    Read the article
