Search Results

Search found 402 results on 17 pages for 'junk'.

Page 10 of 17

  • Troubleshooting VC++ DLL in VB.Net

    - by Jolyon
    I'm trying to make a solution in Visual Studio that consists of a VC++ DLL and a VB.Net application. To figure this out, I created a VC++ Class Library project, with the following code (I removed all the junk the wizard creates): mathfuncs.cpp: #include "MathFuncs.h" namespace MathFuncs { double MyMathFuncs::Add(double a, double b) { return a + b; } } mathfuncs.h: using namespace System; namespace MathFuncs { public ref class MyMathFuncs { public: static double Add(double a, double b); }; } This compiles quite happily. I can then add a VC++ console project to the solution, add a reference to the original project for this new project, and call it as follows: test.cpp: using namespace System; int main(array<System::String ^> ^args) { double a = 7.4; int b = 99; Console::WriteLine("a + b = {0}", MathFuncs::MyMathFuncs::Add(a, b)); return 0; } This works just fine, and will build to test.exe and mathsfuncs.dll. However, I want to use a VB.Net project to call the DLL. To do this, I add a VB.Net project to the solution, make it the startup project, and add a reference to the original project. Then, I attempt to use it as follows: MsgBox(MathFuncs.MyMathFuncs.Add(1, 2)) However, when I run this code, it gives me an error: "Could not load file or assembly 'MathFuncsAssembly, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. An attempt was made to load a program with an incorrect format." Do I need to expose the method somehow? I'm using Visual Studio 2008 Professional.

    Read the article

  • Stencil mask with AlphaTestEffect

    - by Brendan Wanlass
    I am trying to pull off the following effect in XNA 4.0: http://eng.utah.edu/~brendanw/question.jpg The purple area has 50% opacity. I have gotten pretty close with the following code: public static DepthStencilState AlwaysStencilState = new DepthStencilState() { StencilEnable = true, StencilFunction = CompareFunction.Always, StencilPass = StencilOperation.Replace, ReferenceStencil = 1, DepthBufferEnable = false, }; public static DepthStencilState EqualStencilState = new DepthStencilState() { StencilEnable = true, StencilFunction = CompareFunction.Equal, StencilPass = StencilOperation.Keep, ReferenceStencil = 1, DepthBufferEnable = false, }; ... if (_alphaEffect == null) { _alphaEffect = new AlphaTestEffect(_spriteBatch.GraphicsDevice); _alphaEffect.AlphaFunction = CompareFunction.LessEqual; _alphaEffect.ReferenceAlpha = 129; Matrix projection = Matrix.CreateOrthographicOffCenter(0, _spriteBatch.GraphicsDevice.PresentationParameters.BackBufferWidth, _spriteBatch.GraphicsDevice.PresentationParameters.BackBufferHeight, 0, 0, 1); _alphaEffect.Projection = world.SystemManager.GetSystem<RenderSystem>().Camera.View * projection; } _mask = new RenderTarget2D(_spriteBatch.GraphicsDevice, _spriteBatch.GraphicsDevice.PresentationParameters.BackBufferWidth, _spriteBatch.GraphicsDevice.PresentationParameters.BackBufferHeight, false, SurfaceFormat.Color, DepthFormat.Depth24Stencil8); _spriteBatch.GraphicsDevice.SetRenderTarget(_mask); _spriteBatch.GraphicsDevice.Clear(ClearOptions.Target | ClearOptions.Stencil, Color.Transparent, 0, 0); _spriteBatch.Begin(SpriteSortMode.Immediate, null, null, AlwaysStencilState, null, _alphaEffect); _spriteBatch.Draw(sprite.Texture, position, sprite.SourceRectangle, Color.White, 0f, sprite.Origin, 1f, SpriteEffects.None, 0); _spriteBatch.End(); _spriteBatch.Begin(SpriteSortMode.Immediate, null, null, EqualStencilState, null, null); _spriteBatch.Draw(_maskTex, new Vector2(x * _maskTex.Width, y * _maskTex.Height), null, Color.White, 0f, Vector2.Zero, 1f, SpriteEffects.None, 0); _spriteBatch.End(); _spriteBatch.GraphicsDevice.SetRenderTarget(null); _spriteBatch.GraphicsDevice.Clear(Color.Black); _spriteBatch.Begin(); _spriteBatch.Draw((Texture2D)_mask, Vector2.Zero, null, Color.White, 0f, Vector2.Zero, 1f, SpriteEffects.None, layer/max_layer); _spriteBatch.End(); My problem is that I can't get the AlphaTestEffect to behave. I can either mask over the semi-transparent purple junk and fill it in with the green design, or I can draw over the completely opaque grassy texture. How can I specify the exact opacity that needs to be replaced with the green design?

    Read the article

  • How can I email a vCard to users who are unable to download it?

    - by Zachary Lewis
    I have created vCards for the people in my business, and they are great for users on standard browsers; however, users browsing on a mobile device (notably iPhone) are unable to download and view my vCard. Is there a service that I can direct them to that will allow them to receive an email containing my vCard, or is there a simple way I can set this up myself? I am running my site on WordPress, and initial attempts have failed spectacularly. I'd like for them to be given the option to perform either action, but have the predominant action more prominently visible (probably via user agent detection). Something along the lines of: It looks like you're on an iPhone! It's a bummer they can't download vCards, but if you enter your email address, we'll wrap one up and send it your way! Don't worry, we won't send you junk email. Heck, we don't even save your email address! [email protected] Think you've got it all figured out? Fine, download the vCard instead! If you know of a service or simple-to-implement PHP library (or WordPress plug-in), please let me know! If not, let me know what the best solution to this problem is!

    Read the article

  • Need to sanity-check my .htaccess, especially Limit GET POST line for Google repellent

    - by jose
    I need a sanity check on this .htaccess (from a WordPress site) I inherited from a 5 month+ old site. What's the symptom? Google + Bing crawl, but don't index any of the pages. Let me be clear: I'm not mad about "not ranking high." I think something is (accidentally) rejecting search engine indexing. I am not an expert on .htaccess, but one part especially looked funny, the Limit GET POST line. Is it not weird to have both Allow and Deny all, with no parameters? Also, I've ruled out robots.txt, but if I were you I'd want to see it, so here it is: User-agent: * Crawl-delay: 30 And here's the more suspect .htaccess: # temp redirect wordpress content feeds to feedburner <IfModule mod_rewrite.c> RewriteEngine on RewriteCond %{HTTP_USER_AGENT} !FeedBurner [NC] RewriteCond %{HTTP_USER_AGENT} !FeedValidator [NC] RewriteRule ^feed/?([_0-9a-z-]+)?/?$ http://feeds.feedburner.com/anonymousblog [R=302,NC,L] </IfModule> # temp redirect wordpress comment feeds to feedburner <IfModule mod_rewrite.c> RewriteEngine on RewriteCond %{HTTP_USER_AGENT} !FeedBurner [NC] RewriteCond %{HTTP_USER_AGENT} !FeedValidator [NC] RewriteRule ^comments/feed/?([_0-9a-z-]+)?/?$ http://feeds.feedburner.com/anonymous_comments [R=302,NC,L] </IfModule> <IfModule mod_rewrite.c> RewriteEngine On RewriteBase / RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [L] </IfModule> IndexIgnore .htaccess */.??* *~ *# */HEADER* */README* */_vti* <Limit GET POST> order deny,allow deny from all allow from all </Limit> <Limit PUT DELETE> order deny,allow deny from all </Limit> php_value memory_limit 32M Adding header by request: <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> <meta name="robots" content="noindex,nofollow" /> <meta name="description" content="buncha junk i've deleted." /> <meta name="keywords" content="keywords i've deleted" /> <meta name="viewport" content="width=device-width" />

    Read the article

  • New Visual Studio 2010 Extension - Collapse Solution

    - by MikeParks
    If your team has recently upgraded to Visual Studio 2010, take a second to check out the new Extension Manager. You can use it to browse through or install tons of tools, controls, or templates from the Visual Studio Gallery. My friend, Cory Cissell, and I recently teamed up and created an extension of our own called "Collapse Solution". It adds an option called Collapse Solution to the context menu of the solution node in the solution explorer. It also adds an option called Collapse Project to the context menu of each project node in the solution explorer. When that option is clicked, it will walk through the solution explorer tree and collapse any expanded child nodes in that section (projects, folders, code-behind files, designer files, etc.). I used to have an add-in that did this in Visual Studio 2008, but it wasn't compatible when we upgraded to 2010, so we decided to write our own. The old tool was also packaged with a bunch of other junk that we didn't need, so we figured it would be a much cleaner tool if it was broken off into its own extension. There's no need to install extra tools if you don't really need them. So if you have upgraded to Visual Studio 2010, please feel free to try out our Collapse Solution extension and leave us a rating/review in the Visual Studio Gallery. Thanks! Here's the link: http://visualstudiogallery.msdn.microsoft.com/en-us/2d81fec6-71f3-4fa5-87b4-c2aa18e42f92

    Read the article

  • SQL Rally Pre-Con: Data Warehouse Modeling – Making the Right Choices

    - by Davide Mauri
    As you may have already learned from my old post or Adam's or Kalen's posts, there will be two SQL Rally events in Northern Europe. At the Stockholm SQL Rally, with my friend Thomas Kejser, I'll be delivering a pre-con on Data Warehouse Modeling: Data warehouses play a central role in any BI solution. It's the back end upon which everything in years to come will be created. For this reason, it must be rock solid and yet flexible at the same time. To develop such a data warehouse, you must have a clear idea of its architecture, a thorough understanding of the concepts of Measures and Dimensions, and a proven engineered way to build it so that quality and stability can go hand-in-hand with cost reduction and scalability. In this workshop, Thomas Kejser and Davide Mauri will share all the information they have learned since they started working with data warehouses, giving you the guidance and tips you need to start your BI project in the best way possible: avoiding errors, making implementation effective and efficient, paving the way for a winning Agile approach, and helping you define how your team should work so that your BI solution will stand the test of time. You'll learn:
      - Data warehouse architecture and justification
      - Agile methodology
      - Dimensional modeling, including Kimball vs. Inmon, SCD1/SCD2/SCD3, Junk and Degenerate Dimensions, and Huge Dimensions
      - Best practices, naming conventions, and lessons learned
      - Loading the data warehouse, including loading Dimensions and loading Facts (Full Load, Incremental Load, Partitioned Load)
      - Data warehouses and Big Data (Hadoop)
      - Unit testing
      - Tracking historical changes and managing large sizes
    With all the self-service BI hype, the data warehouse is becoming more and more central every day: if everyone is going to analyze data using self-service tools, it's better that they can rely on correct, uniform, and coherent data. Fifty people have already registered for the workshop and seats are limited, so don't miss this unique opportunity to attend a workshop that is really a unique combination of years and years of experience! http://www.sqlpass.org/sqlrally/2013/nordic/Agenda/PreconferenceSeminars.aspx See you there!

    Read the article

  • Using OData to get Mix10 files

    - by Jon Dalberg
    There has been a lot of talk around OData lately (go to odata.org for more information) and I wanted to get all the videos from Mix ‘10: two great tastes that taste great together. Luckily, Mix has exposed the ‘10 sessions via OData at http://api.visitmix.com/OData.svc, now all I have to do is slap together a bit of code to fetch the videos. Step 1 (cut a hole in the box): Create a new console application and add a new service reference. Step 2 (put your junk in the box): Write a smidgen of code:

      static void Main(string[] args)
      {
          var mix = new Mix.EventEntities(new Uri("http://api.visitmix.com/OData.svc"));

          var files = from f in mix.Files
                      where f.TypeName == "WMV"
                      select f;

          var web = new WebClient();

          var myVideos = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "Mix10");

          Directory.CreateDirectory(myVideos);

          files.ToList().ForEach(f => {
              var fileName = new Uri(f.Url).Segments.Last();
              Console.WriteLine(f.Url);
              web.DownloadFile(f.Url, Path.Combine(myVideos, fileName));
          });
      }

    Step 3 (have her open the box): Compile and run. As you can see, the client reference created for the OData service handles almost everything for me. Yeah, I know there is some batch file to download the files, but it relies on cUrl being on the machine – and I wanted an excuse to work with an OData service. Enjoy!

    Read the article

  • Strange robots.txt - how and why did it get there?

    - by Mick
    I recently created a very simple, pure HTML website which I have hosted with "hostmonster". Hostmonster had very good reviews on some comparison website and in general so far they appear to be perfectly good in every way... At least I thought so until just now... I have been making lots of edits to my site on an almost daily basis. My site now appears on the first page (7th on the list) for my most important keyphrase when doing a Google search. But I did notice some problem with the snippet chosen by Google. I asked a question on this site about snippets and got some great answers. I then made some modifications to my meta data and within 48hrs the Google snippet for my search was perfect. The odd thing though was that, looking at the "cached" version Google had, it appeared that the cache was still very old, like three weeks previous. This seemed very odd: how could it be that the Google robots had read my new metadata without updating the cache? This puzzled me greatly. Just now it occurred to me that maybe I had some goofy setting in my robots.txt file. I didn't actually remember even making one, but I thought I'd have a look just in case. Much to my horror, I saw that there was a robots.txt and it contained the disturbing text below: sitemap: http://cdn.attracta.com/sitemap/728687.xml.gz Intuitively this looks like some kind of junk, spam trick, and I had indeed been getting some spam from "attracta". So my questions are: 1. Should I simply delete this robots.txt? 2. Was the file there all along, placed there because of some commercial tie-in between Attracta and Hostmonster? 3. Does the Attracta robots file explain the lack of re-caching?

    Read the article

  • Why is apt-get --auto-remove not removing all dependencies?

    - by Mike
    I just installed a package (dansguardian in this case) and apt told me that I had unmet dependencies. # sudo apt-get install dansguardian Reading package lists... Done Building dependency tree Reading state information... Done The following extra packages will be installed: clamav clamav-base clamav-freshclam libclamav6 libtommath0 Suggested packages: clamav-docs squid libclamunrar6 The following NEW packages will be installed: clamav clamav-base clamav-freshclam dansguardian libclamav6 libtommath0 0 upgraded, 6 newly installed, 0 to remove and 0 not upgraded. Need to get 0 B/4,956 kB of archives. After this operation, 14.4 MB of additional disk space will be used. Do you want to continue [Y/n]? So I installed it and the dependencies. So far so good. Later on, I decide that this package just isn't the package for me, so I want to remove it and all of the other junk it installed with it since I'm not going to be needing any of it: # sudo apt-get remove --auto-remove --purge dansguardian Reading package lists... Done Building dependency tree Reading state information... Done The following packages will be REMOVED: dansguardian 0 upgraded, 0 newly installed, 1 to remove and 0 not upgraded. After this operation, 1,816 kB disk space will be freed. Do you want to continue [Y/n]? However it is only removing that one specific package. What about clamav clamav-base clamav-freshclam libclamav6 libtommath0? Not only did it not remove them, but clamav was actually running a daemon that loads every time the computer boots. I thought that --auto-remove would remove not only the packages, but also the dependencies that were installed with it. So basically, without going through the apt history log file (if I even remember to do so, or if I even remember that a specific package I installed 3 months ago had dependencies along with it), is there a way to remove a package and all of the other dependencies that were installed like in this case?

    Read the article

  • How do you support your code post employment end?

    - by James
    What is the process for leaving a company (or even a group/division) in terms of code support? Is it best to handle all questions? Do you give the remaining developers access to yourself as a future resource? If so, is there a way to not give full access? I've experienced first-hand that answers about the general software architecture from the initial developer would be invaluable. I understand that if serious assistance is needed, then it becomes a typical case of employment negotiation as a support contract. However, should serious assistance be required, what steps can you take to ease the process of contacting you? I was thinking of doing something like making a (YOUR_NAME)_codesupport @ (YOUR_FAVORITE_EMAIL_CLIENT).com address. My Situation Specifics: I'm a co-op student, and as such bounce around companies on 4-month stints. This means introducing myself to a lot of new code bases, as well as leaving a fair share of orphaned code behind when I leave a company. I feel bad if I leave junk code around.

    Read the article

  • How can I instruct nautilus to pre-generate PDF thumbnails?

    - by Glutanimate
    I have a large library of PDF documents (papers, lectures, handouts) that I want to be able to quickly navigate through. For that I need thumbnails. At the same time, however, I see that the ~/.thumbnails folder is piling up with thumbs I don't really need. Deleting thumbnail junk without removing the important thumbs is impossible. If I were to delete them, I'd have to go to each and every folder with important PDF documents and let the thumbnail cache regenerate. I would love to be able to automate this process. Is there any way I can tell nautilus to pre-cache the thumbs for a set of given directories? Note: I did find a set of bash scripts that appear to do this for pictures and videos, but not for any other documents. Maybe someone more experienced with scripting might be able to adjust these for PDF documents, or at least point me in the right direction on what I'd have to modify for this to work with PDF documents as well. Edit: Unfortunately neither of the answers provided below works. See my comments below for more information. Is there anyone who can solve this?

    Read the article

  • Bad value for type timestamp on production server

    - by Juan Javaloyes
    I'm working with: Seam 2.2.2 + Hibernate + RichFaces + JBoss 5.1 + Postgres. I have a module which needs to load some data from the database. Easy. The problem is that on development it works fine, 100%, but when I deploy to my production server and try to get the data, an error is raised: could not read column value from result set: fechahor9_504_; Bad value for type timestamp : [C@122e5cf SQL Error: 0, SQLState: 22007 Bad value for type timestamp : [C@122e5cf javax.persistence.PersistenceException: org.hibernate.exception.DataException: could not execute query [more errors] Caused by: org.postgresql.util.PSQLException: Bad value for type timestamp : [C@122e5cf at org.postgresql.jdbc2.TimestampUtils.loadCalendar(TimestampUtils.java:232) [more errors] Caused by: java.lang.NumberFormatException: Trailing junk on timestamp: '' at org.postgresql.jdbc2.TimestampUtils.loadCalendar(TimestampUtils.java:226) I can't understand why it works on my machine (development) and not on production. Any clues? Has anyone gone through the same problem? It's exactly the same compilation.

    Read the article

  • Merge Cells Vertically in RTF

    - by Jimmy Johnson
    I need to programmatically generate an RTF document with a table that has a column vertically merged, e.g.:

       ______________________________
      | merged  | foo    | hello    |
      | cell    |        |          |
      | right   |--------|----------|
      | here    | bar    | world    |
      |_________|________|__________|

    I looked online and found that the codes are \clvmgf and \clvmrg, but I can't find a decent example. I made a test RTF using MS Word, but there are too many junk RTF codes in it for me to figure out where to put the \clvmgf and \clvmrg to get this to work. Could someone give me an RTF for the above example table with no extraneous RTF codes, so I can figure out how \clvmgf and \clvmrg work? Any additional explanation would also be greatly appreciated. Thanks!
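
    For illustration only (this sketch is not from the original post): below is a minimal C# snippet showing how such a table might be emitted, based on my reading of the RTF table model. \clvmgf marks the first cell of a vertically merged range and \clvmrg marks the cells below it that continue the merge; both appear in the row definition before the matching \cellx, and the merged-away cells are left empty. The cell widths, font, and file name are arbitrary assumptions, and the result is best checked in Word, since simpler viewers may ignore vertical merges.

      using System.IO;

      class RtfMergeSketch
      {
          static void Main()
          {
              // Row 1: first cell opens the vertical merge (\clvmgf).
              // Row 2: first cell continues the merge (\clvmrg) and carries no content of its own.
              string rtf =
                  @"{\rtf1\ansi\deff0{\fonttbl{\f0 Calibri;}}" + "\r\n" +
                  @"\trowd\clvmgf\cellx3000\cellx6000\cellx9000" + "\r\n" +
                  @"\pard\intbl merged cell right here\cell foo\cell hello\cell\row" + "\r\n" +
                  @"\trowd\clvmrg\cellx3000\cellx6000\cellx9000" + "\r\n" +
                  @"\pard\intbl \cell bar\cell world\cell\row" + "\r\n" +
                  @"\pard}";

              File.WriteAllText("merged-table.rtf", rtf);
          }
      }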

    Read the article

  • Having the output of a Console App in Visual Studio instead of the Console

    - by devoured elysium
    When doing "Console" apps in Java with Eclipse, I see the output being put in a text box on the IDE itself, instead of having a Console popping up like in Visual Studio. This comes in handy, as even after the program has exited, I can still make good use of the text that was written in it, as it doesn't get erased until I run it again. Is it possible to achieve anything like that with Visual Studio? I know that instead of doing System.Console.WriteLine(str); I can do System.Diagnostics.Debug.WriteLine(str); but it is not quite the same thing, as you get a lot of "junk" in the Output window, as all the loaded symbols and such. Even better, is it possible to have everything done in the IDE itself, when you run your app, instead of having the Console running? Thanks

    Read the article

  • Where is the WPF Numeric UpDown control?

    - by AngryHacker
    Getting into the first serious WPF project. It seems like there are a lot of basic controls flat out missing. Specifically, I am looking for the Numeric UpDown control. Was there an out of band release that I missed? Really don't feel like writing my own control. I do not want to use the WindowsFormHost and plop a WinForm ctl on it. I want it to be fully WPF without any legacy junk. Thanks

    Read the article

  • Ruby on Rails simple_navigation Gem

    - by Paul
    I'm using the simple_navigation gem with RoR 2.3.5 It all seems to work correctly, I followed the info in the RDoc (seen here http://rdoc.info/projects/mexpolk/simple_navigation) However, when I actually render out the simple_navigation menu on my main application.html.erb file it escapes all of the html in it (multiple escapes actually). I end up with junk like this which in a browser ends up with all kinds of disjointed text and ["\ things everywhere. <ul class="simple_navigation" depth="0" id="simple_navigation_default"> ["<li class=\"menu\" drop_down=\"true\" id=\"simple_navigation_default_menus_home\"><a href=\"/home\">Wellcome</a><ul depth=\"1\" id=\"simple_navigation_default_menus_home_menus\"> [\"<li class=\\\"menu\\\" drop_down=\\\"false\\\" id=\\\"simple_navigation_default_menus_home_menus_settings\\\"><a href=\\\"/home/settings\\\">Appliction Settings</a></li>\"] </ul> </li>"] What am I doing wrong? Is there a way to tell Ruby on rails to NOT escape html?

    Read the article

  • Photoshop JSX script- CLOSE PHOTOSHOP!

    - by Geekay2
    How do I close Photoshop using its JavaScript scripting language? I am automatically scripting a great deal of things, and I notice that for one reason or another, some of the RAM is not released with each new task. My hope is that after X amount of operations, I can fully close Photoshop to free up the RAM, which it is eating up all 8 gigs of; after that, Photoshop Help opens and causes a huge failure (actually, to be honest, it fills up my hard drive with junk till I get a "hard drive is full" message; I think it is dumping the RAM into virtual memory on my hard drive). What a mess. THANKS!!

    Read the article

  • How do I iterate over the properties of an anonymous object in C#?

    - by Tomas Lycken
    I want to take an anonymous object as an argument to a method, and then iterate over its properties to add each property/value to a dynamic ExpandoObject. So what I need is to go from new { Prop1 = "first value", Prop2 = SomeObjectInstance, Prop3 = 1234 } to knowing the names and values of each property, and being able to add them to the ExpandoObject. How do I accomplish this? Side note: This will be done in many of my unit tests (I'm using it to refactor away a lot of junk in the setup), so performance is to some extent relevant. I don't know enough about reflection to say for sure, but from what I've understood it's pretty performance-heavy, so if it's possible I'd rather avoid it... Follow-up question: As I said, I'm taking this anonymous object as an argument to a method. What datatype should I use in the method's signature? Will all properties be available if I use object?
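
    To illustrate what the question is after (this sketch is mine, not the poster's, and it does use reflection, which the poster would prefer to avoid): an anonymous object can be passed as a plain object parameter, and its public properties copied into the dictionary view of an ExpandoObject.

      using System;
      using System.Collections.Generic;
      using System.Dynamic;

      static class ExpandoHelper
      {
          // Copies every public property of any object (anonymous or not) into an ExpandoObject.
          public static ExpandoObject ToExpando(object values)
          {
              var expando = new ExpandoObject();
              var dictionary = (IDictionary<string, object>)expando;  // ExpandoObject exposes its members this way
              foreach (var property in values.GetType().GetProperties())
              {
                  dictionary[property.Name] = property.GetValue(values, null);
              }
              return expando;
          }
      }

    Usage would look like dynamic d = ExpandoHelper.ToExpando(new { Prop1 = "first value", Prop3 = 1234 });, after which d.Prop1 is available. Taking the parameter as object also speaks to the follow-up question: all the anonymous type's public properties remain reachable through reflection regardless of the declared parameter type.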

    Read the article

  • Extracting pure content / text from HTML Pages by excluding navigation and chrome content

    - by Ankur Gupta
    Hi, I am crawling news websites and want to extract the news title, news abstract (first paragraph), etc. I plugged into the WebKit parser code to easily navigate a web page as a tree. To eliminate navigation and other non-news content, I take the text version of the article (minus the HTML tags; WebKit provides an API for this). Then I run a diff algorithm comparing the text of various articles from the same website; this eliminates the similar text. This gives me the content minus the common navigation content, etc. Despite the above approach I am still getting quite some junk in my final text, which results in an incorrect news abstract being extracted. The error rate is 5 in 10 articles, i.e. 50%. Can you suggest an alternative strategy for extracting pure content? Would learning Natural Language Processing help in extracting the correct abstract from these articles? How would you approach the above problem? Are there any research papers on the same? Regards, Ankur Gupta

    Read the article

  • Chinese encoding issue while listing files

    - by Null Pointer
    I am running a Java application on Solaris 10 with Chinese. Now there are some files in a directory with Chinese filenames. When I do files = new File(dir).list(), where "dir" is the parent directory containing that Chinese file, I get the result filename files[0] as ????? (some junk characters). Now the deal is that my program's file.encoding property is already set to GBK, and I also do Charset.isSupported("GBK") and it returns true too. So where could the problem be? I am running out of ideas. NOTE: I am not trying to print the filename anywhere or copy the file or something. I am simply opening a stream to it, something like below: files = new File(dir).list(); new FileInputStream(files[0]); Now this gives me a FileNotFoundException, so I debug just to find that the value inside files[0] is "??????".

    Read the article

  • What is Perl's equivalent to awk's /text/,/END/ ?

    - by kSiR
    I am looking to replace a nasty shell script that uses awk to trim down some HTML. The problem is I cannot find anything in Perl that does the aforementioned function awk '/<TABLE\ WIDTH\=\"100\%\" BORDER\=1\ CELLSPACING\=0><TR\ class\=\"tabhead\"><TH>State<\/TH>/,/END/' How can I do this in Perl? the expected output would be <TABLE WIDTH="100%" BORDER=1 CELLSPACING=0><TR class="tabhead"><TH>State</TH> The Perl flipflop operator gives me WAY more. (Everything between the asterisks is junk) *<h2>Browse Monitors (1 out of 497)</h2><br><font size="-1" style="font-weight:normal"> Use the <A HREF=/SiteScope/cgi/go.exe/SiteScope?page=monitorSummary&account=login15 >Monitor Description Report</a> to view current monitor configuration settings.</font>*<TABLE WIDTH="100%" BORDER=1 CELLSPACING=0><TR class="tabhead"><TH>State</TH>

    Read the article

  • jQuery ajax() function is ignoring dataType parameter in Firefox

    - by ccleve
    I'm trying to use jQuery.ajax() to fetch some html, but Firefox is giving me a "junk after document element" error message. As explained here and here the problem seems to be that Firefox is expecting XML from the server, and when it doesn't parse correctly it throws the error. Here's my ajax code: jQuery.ajax({ url: name, dataType: "html", success: function(result) { console.log(result); }, error: function (jqXHR, textStatus, errorThrown) { console.log(errorThrown); } }); The server returns the html with these response headers: Accept-Ranges bytes Content-Length 2957 Last-Modified Tue, 02 Jul 2013 16:16:59 GMT Note that there's no content-type header. I'm sure that adding one would solve the problem, but that's not an option. The real problem is that Firefox appears to be ignoring the dataType: parameter in the ajax call. I've also tried adding contentType: and accepts: parameters, but it doesn't help. What am I missing here? How do I force Firefox to process the response as plain text?

    Read the article

  • problem in opening a link to .rar file

    - by Kenneth
    Hi all, in my JSP I have a hyperlink linking to a .rar file in the system. Somehow, when I click on the link, it does not ask users to 'save file' or 'open file', nor does it use 7-Zip to open the .rar file. Instead, it just displays some junk characters on the web page. I thought it might be a problem with the mime-mapping, so I put a mime-mapping in web.xml with mime-type=application/x-rar-compressed, but it still doesn't work. Do you have any idea what the problem is and how to solve it? Thanks a lot in advance. Other file types have no problem. Kenneth

    Read the article

  • Global Ignores for SVN?

    - by Michael Stum
    Is there a way to set up a global list of ignores for an SVN repository, or for the SVN client on the PC? The only reason I'm using tools like Tortoise/Ankh/VisualSVN is because I want to check in only the files I need, without all the bin/obj/Resharper stuff. I'm spoiled by .gitignore and .hgignore, which I just copy to a repository and then use "git commit -a" without having to care about checking in junk. I know I can set it manually, but that's tedious to do, and I think it has to be applied to every new folder that gets created as well. I'm using SVN under Windows, if that matters.

    Read the article

  • Need advice on organizing two WPF applications within one Visual Studio solution

    - by Tim
    I have a WPF application (KaleidoscopeApplication) organized as follows:

      Solution (6 projects)
        Cryptography (DLL)
        Rfid (DLL)
        KaleidoscopeApplication (buildable "startup project")

    Basically, KaleidoscopeApplication contains a bunch of resources (sounds, images, etc) and your standard WPF junk (App.xaml, App.xaml.cs, other xaml and code). I need to create a new application that is very similar to Kaleidoscope, but I'm not sure of the best way to organize. This new app will need access to much of the same code and resources as Kaleidoscope. Preferably, I would like to create a new project in the solution, then simply use the "set as startup project" to pick which app I want to build. However, will I be able to access (share) the Resources folder of Kaleidoscope? I know I will be able to access much of the code if I simply add a reference to the project and include a "using Kaleidoscope". But the resources I'm not so sure about. Is this the right way to organize or am I asking for trouble in the future? Thanks in advance!

    Read the article
