Search Results

Search found 13807 results on 553 pages for 'undefined reference'.


  • A file is being posted via HTTP POST; how can I reference the parameter by index?

    - by Blankman
    An XML file is being posted to a URL that my Spring MVC app responds to. In .NET, I could do this: request.Form[0], request.Form["abc"], or request.QueryString[0], request.QueryString["some_key"]. With Spring/servlets it seems I can only do request.getParameter("some_key"), or get all the names or values. When someone posts a file to a URL using HTTP POST, won't that be just a single request parameter? Can I get the parameter by index with servlets?
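
    A minimal servlet-side sketch (assuming the standard javax.servlet API with Servlet 3.0 signatures): there is no positional indexer like request.Form[0], but the parameter map can be walked by name, and a raw XML POST body (anything not form-encoded) is not a parameter at all and has to be read from the request body instead.

      // Walk every parameter, whatever its name (Servlet 3.0+ generic signatures).
      import java.io.BufferedReader;
      import java.io.IOException;
      import java.util.Map;
      import javax.servlet.http.HttpServletRequest;

      public class RequestDump {

          static void dumpParameters(HttpServletRequest request) {
              for (Map.Entry<String, String[]> entry : request.getParameterMap().entrySet()) {
                  for (String value : entry.getValue()) {
                      System.out.println(entry.getKey() + " = " + value);
                  }
              }
          }

          // If the XML arrives as the raw request body rather than a form field,
          // read it from the body; it never shows up in the parameter map.
          static String readBody(HttpServletRequest request) throws IOException {
              StringBuilder body = new StringBuilder();
              BufferedReader reader = request.getReader();
              String line;
              while ((line = reader.readLine()) != null) {
                  body.append(line).append('\n');
              }
              return body.toString();
          }
      }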

    Read the article

  • Is there a way to get a reference to all the paragraphs or headings in a web page in JavaScript?

    - by mattbd
    I'm writing a simple Greasemonkey script to strip out all the images, headings and paragraphs from a web page (someone wants to use images of several popular websites, with the images and text removed, in a presentation about website branding). I've figured out how to do so with images by using a for loop and the document.images array like this: var i = 0; var imglngth = document.images.length; for(i=0;i<imglngth;i++) { document.images[i].style.display="none"; } However, I don't believe there's an object representing paragraphs or headings. Any suggestions as to how I could implement this?
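
    A short sketch of the missing piece: the DOM has no document.paragraphs collection, but getElementsByTagName covers paragraphs the same way document.images covers images, and querySelectorAll (available in Firefox 3.5+, where Greasemonkey runs) can grab paragraphs and every heading level in one pass.

      // Hide all paragraphs, mirroring the document.images loop above.
      var paragraphs = document.getElementsByTagName('p');
      for (var p = 0; p < paragraphs.length; p++) {
          paragraphs[p].style.display = 'none';
      }

      // Or hide paragraphs and headings together with the Selectors API.
      var blocks = document.querySelectorAll('p, h1, h2, h3, h4, h5, h6');
      for (var i = 0; i < blocks.length; i++) {
          blocks[i].style.display = 'none';
      }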

    Read the article

  • How to do a back-reference on Google App Engine?

    - by jCuga
    I'm trying to access an object that is linked to by a db.ReferenceProperty in Google App Engine. Here's the model's code: class InquiryQuestion(db.Model): inquiry_ref = db.ReferenceProperty(reference_class=GiftInquiry, required=True, collection_name="inquiry_ref") And I am trying to access it in the following way: linkedObject = question.inquiry_ref and then linkedKey = linkedObject.key but it's not working. Can anyone please help?
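
    A hedged sketch against the classic google.appengine.ext.db API the question uses (collection_name is renamed to "questions" here so the back-reference does not share a name with the forward property): accessing the ReferenceProperty dereferences it and returns the GiftInquiry entity, whose key is the key() method rather than an attribute; get_value_for_datastore() returns the key without a fetch; and the back-reference itself lives on GiftInquiry under collection_name.

      # Sketch only; model shapes follow the question, names may differ in the real code.
      from google.appengine.ext import db

      class GiftInquiry(db.Model):
          pass

      class InquiryQuestion(db.Model):
          inquiry_ref = db.ReferenceProperty(reference_class=GiftInquiry,
                                             required=True,
                                             collection_name="questions")

      question = InquiryQuestion.all().get()

      linked_object = question.inquiry_ref   # dereferences: a GiftInquiry entity
      linked_key = linked_object.key()       # key() is a method, not an attribute

      # Key only, without fetching the referenced entity:
      key_only = InquiryQuestion.inquiry_ref.get_value_for_datastore(question)

      # The back-reference: all questions pointing at this inquiry.
      for q in linked_object.questions:
          print(q.key())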

    Read the article

  • Reference properties declared in a protocol and implemented in the anonymous category?

    - by Heath Borders
    I have the following protocol: @protocol MyProtocol @property (nonatomic, retain) NSObject *myProtocolProperty; -(void) myProtocolMethod; @end and I have the following class: @interface MyClass : NSObject { } @end I have a class extension declared; I have to redeclare my protocol properties here or else I can't implement them with the rest of my class. @interface MyClass () <MyProtocol> @property (nonatomic, retain) NSObject *myExtensionProperty; /* * This redeclaration is required or my @synthesize myProtocolProperty fails */ @property (nonatomic, retain) NSObject *myProtocolProperty; - (void) myExtensionMethod; @end @implementation MyClass @synthesize myProtocolProperty = _myProtocolProperty; @synthesize myExtensionProperty = _myExtensionProperty; - (void) myProtocolMethod { } - (void) myExtensionMethod { } @end In a consumer method, I can call my protocol methods and properties just fine. Calling my extension methods and properties produces a warning and an error respectively. - (void) consumeMyClassWithMyProtocol: (MyClass<MyProtocol> *) myClassWithMyProtocol { myClassWithMyProtocol.myProtocolProperty; // works, yay! [myClassWithMyProtocol myProtocolMethod]; // works, yay! myClassWithMyProtocol.myExtensionProperty; // compiler error, yay! [myClassWithMyProtocol myExtensionMethod]; // compiler warning, yay! } Is there any way I can avoid redeclaring the properties in MyProtocol within my class extension in order to implement MyProtocol privately?

    Read the article

  • MySQL datetime fields and daylight savings time -- how do I reference the "extra" hour?

    - by Aaron
    I'm using the America/New York timezone. In the Fall we "fall back" an hour -- effectively "gaining" one hour at 2am. At the transition point the following happens: it's 01:59:00 -04:00 then 1 minute later it becomes: 01:00:00 -05:00 So if you simply say "1:30am" it's ambiguous as to whether or not you're referring to the first time 1:30 rolls around or the second. I'm trying to save scheduling data to a MySQL database and can't determine how to save the times properly. Here's the problem: "2009-11-01 00:30:00" is stored internally as 2009-11-01 00:30:00 -04:00 "2009-11-01 01:30:00" is stored internally as 2009-11-01 01:30:00 -05:00 This is fine and fairly expected. But how do I save anything to 01:30:00 -04:00? The documentation does not show any support for specifying the offset and, accordingly, when I've tried specifying the offset it's been duly ignored. The only solutions I've thought of involve setting the server to a timezone that doesn't use daylight savings time and doing the necessary transformations in my scripts (I'm using PHP for this). But that doesn't seem like it should be necessary. Many thanks for any suggestions.
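
    Since a MySQL DATETIME carries no offset, one common workaround (a sketch in the asker's PHP, not the only option) is to disambiguate the repeated hour with an explicit offset, normalize to UTC before writing, and convert back to America/New_York only for display.

      <?php
      $utc = new DateTimeZone('UTC');

      $first  = new DateTime('2009-11-01 01:30:00 -04:00'); // first 1:30am (EDT)
      $second = new DateTime('2009-11-01 01:30:00 -05:00'); // second 1:30am (EST)

      $first->setTimezone($utc);
      $second->setTimezone($utc);
      echo $first->format('Y-m-d H:i:s'), "\n";  // 2009-11-01 05:30:00 -- store this
      echo $second->format('Y-m-d H:i:s'), "\n"; // 2009-11-01 06:30:00 -- and this

      // Reading back: treat the stored value as UTC and convert for display.
      $stored = new DateTime('2009-11-01 05:30:00', $utc);
      $stored->setTimezone(new DateTimeZone('America/New_York'));
      echo $stored->format('Y-m-d H:i:s P'), "\n"; // 2009-11-01 01:30:00 -04:00
      ?>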

    Read the article

  • Is this a safe way to reference objects in JavaScript?

    - by John
    If I were to define two objects myDataStore and myDrawer something like this: var myDataStore = function(myObjectRef) { this.myInternalObject = myObjectRef; }; var myDrawer = function(myObjectRef) { this.myInternalObject = myObjectRef; }; And if I were to create an object like so: (function(){ var myObject = window.myObject = function(){ this.dataStore = new myDataStore(this); this.drawer = new myDrawer(this); } })(); Then myObject.dataStore.myInternalObject and myObject.drawer.myInternalObject would simply be pointers back to the original 'myObject', not taking up any additional memory in the browser. Yes? I am interested in implementing techniques like this, as it makes it easy for objects to communicate with each other.
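
    A quick sketch confirming the intuition: object-valued properties hold references, not copies, so both internal objects compare identical to the instance they point back to. Note that dataStore and drawer exist on instances created with new, not on the constructor function itself.

      var MyDataStore = function (ownerRef) { this.myInternalObject = ownerRef; };
      var MyDrawer    = function (ownerRef) { this.myInternalObject = ownerRef; };

      var MyObject = function () {
          this.dataStore = new MyDataStore(this);
          this.drawer    = new MyDrawer(this);
      };

      var instance = new MyObject();
      console.log(instance.dataStore.myInternalObject === instance); // true
      console.log(instance.drawer.myInternalObject === instance);    // true
      // Only the reference is stored in each property; the object is never duplicated.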

    Read the article

  • Java: If I override the .equals method, can I still test for reference equality with ==?

    - by shots fired
    I have the following situation: I need to sort trees by height, so I made the Tree class comparable using the height attribute. However, I was also told to override the equals and hashCode methods to avoid unpredictable behaviour. Still, sometimes I may want to compare the references of the roots or something along those lines using ==. Is that still possible or does the == comparison call the equals method?
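
    A small sketch of the distinction: == always compares references and never calls equals(), so overriding equals()/hashCode() changes value equality (and behaviour in hash-based collections) but leaves reference comparison untouched.

      public class Tree implements Comparable<Tree> {
          private final int height;

          public Tree(int height) { this.height = height; }

          @Override public int compareTo(Tree other) { return Integer.compare(height, other.height); }
          @Override public boolean equals(Object o)  { return o instanceof Tree && ((Tree) o).height == height; }
          @Override public int hashCode()            { return height; }

          public static void main(String[] args) {
              Tree a = new Tree(5);
              Tree b = new Tree(5);
              System.out.println(a.equals(b)); // true  -- overridden value equality
              System.out.println(a == b);      // false -- plain reference comparison
              System.out.println(a == a);      // true
          }
      }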

    Read the article

  • How do I reference a control in a different thread?

    - by Testifier
    //button is clicked //worker starts private void processWorker_DoWork(object sender, DoWorkEventArgs e) { int code = DoLongWorkAndReturnCode(); if (code != 0) { MessageBox.Show("Error!"); EnableAllButtons(); // this is defined in the other thread and it's where I run into the error. } else { code = DoAnotherLongProcessAndReturnCode(); if (code != 0) { MessageBox.Show("Error 2!"); EnableAllButtons(); // again, this is defined in the other thread } } } I'm running into a cross-threading error because "EnableAllButtons()" is defined in a different thread. How do I go about enabling all buttons in one thread, from a different thread?
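
    A hedged WinForms sketch of the two usual fixes (EnableAllButtons() and the worker names come from the question; the form is assumed to own the buttons): either marshal the UI work back with Control.Invoke, or keep DoWork free of UI calls and do the control updates in RunWorkerCompleted, which the BackgroundWorker raises on the UI thread.

      // Option 1: marshal the call onto the thread that owns the controls.
      private void processWorker_DoWork(object sender, DoWorkEventArgs e)
      {
          int code = DoLongWorkAndReturnCode();
          if (code != 0)
          {
              this.Invoke((MethodInvoker)delegate
              {
                  MessageBox.Show("Error!");
                  EnableAllButtons();
              });
              return;
          }
          // ... second long-running call, same pattern ...
      }

      // Option 2 (often cleaner): no UI work in DoWork at all; report the outcome
      // through e.Result and touch the controls only in RunWorkerCompleted.
      private void processWorker_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
      {
          EnableAllButtons();
      }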

    Read the article

  • Trouble linking libboost libraries to compile sslsniff on RHEL

    - by rwong48
    Trying to build sslsniff on a RHEL 5.2 system here. When compiling sslsniff on RHEL I hit the same errors when using libboost packages (from repositories like rpmforge) and compiling libboost from source (which appeared to be successful.) I tried this on a fresh system as well (no previous/failed/garbage installs of libboost etc.) # make g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT SSLConnectionManager.o -MD -MP -MF .deps/SSLConnectionManager.Tpo -c -o SSLConnectionManager.o SSLConnectionManager.cpp mv -f .deps/SSLConnectionManager.Tpo .deps/SSLConnectionManager.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT FirefoxUpdater.o -MD -MP -MF .deps/FirefoxUpdater.Tpo -c -o FirefoxUpdater.o FirefoxUpdater.cpp mv -f .deps/FirefoxUpdater.Tpo .deps/FirefoxUpdater.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT Logger.o -MD -MP -MF .deps/Logger.Tpo -c -o Logger.o Logger.cpp mv -f .deps/Logger.Tpo .deps/Logger.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT SessionCache.o -MD -MP -MF .deps/SessionCache.Tpo -c -o SessionCache.o SessionCache.cpp mv -f .deps/SessionCache.Tpo .deps/SessionCache.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT SSLBridge.o -MD -MP -MF .deps/SSLBridge.Tpo -c -o SSLBridge.o SSLBridge.cpp mv -f .deps/SSLBridge.Tpo .deps/SSLBridge.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. 
-ggdb -g -O2 -MT HTTPSBridge.o -MD -MP -MF .deps/HTTPSBridge.Tpo -c -o HTTPSBridge.o HTTPSBridge.cpp mv -f .deps/HTTPSBridge.Tpo .deps/HTTPSBridge.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT sslsniff.o -MD -MP -MF .deps/sslsniff.Tpo -c -o sslsniff.o sslsniff.cpp mv -f .deps/sslsniff.Tpo .deps/sslsniff.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT FingerprintManager.o -MD -MP -MF .deps/FingerprintManager.Tpo -c -o FingerprintManager.o FingerprintManager.cpp mv -f .deps/FingerprintManager.Tpo .deps/FingerprintManager.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT AuthorityCertificateManager.o -MD -MP -MF .deps/AuthorityCertificateManager.Tpo -c -o AuthorityCertificateManager.o `test -f 'certificate/AuthorityCertificateManager.cpp' || echo './'`certificate/AuthorityCertificateManager.cpp mv -f .deps/AuthorityCertificateManager.Tpo .deps/AuthorityCertificateManager.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT TargetedCertificateManager.o -MD -MP -MF .deps/TargetedCertificateManager.Tpo -c -o TargetedCertificateManager.o `test -f 'certificate/TargetedCertificateManager.cpp' || echo './'`certificate/TargetedCertificateManager.cpp mv -f .deps/TargetedCertificateManager.Tpo .deps/TargetedCertificateManager.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT CertificateManager.o -MD -MP -MF .deps/CertificateManager.Tpo -c -o CertificateManager.o `test -f 'certificate/CertificateManager.cpp' || echo './'`certificate/CertificateManager.cpp mv -f .deps/CertificateManager.Tpo .deps/CertificateManager.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. 
-ggdb -g -O2 -MT HttpBridge.o -MD -MP -MF .deps/HttpBridge.Tpo -c -o HttpBridge.o `test -f 'http/HttpBridge.cpp' || echo './'`http/HttpBridge.cpp mv -f .deps/HttpBridge.Tpo .deps/HttpBridge.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT HttpConnectionManager.o -MD -MP -MF .deps/HttpConnectionManager.Tpo -c -o HttpConnectionManager.o `test -f 'http/HttpConnectionManager.cpp' || echo './'`http/HttpConnectionManager.cpp mv -f .deps/HttpConnectionManager.Tpo .deps/HttpConnectionManager.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT HttpHeaders.o -MD -MP -MF .deps/HttpHeaders.Tpo -c -o HttpHeaders.o `test -f 'http/HttpHeaders.cpp' || echo './'`http/HttpHeaders.cpp mv -f .deps/HttpHeaders.Tpo .deps/HttpHeaders.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT UpdateManager.o -MD -MP -MF .deps/UpdateManager.Tpo -c -o UpdateManager.o UpdateManager.cpp mv -f .deps/UpdateManager.Tpo .deps/UpdateManager.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. -ggdb -g -O2 -MT OCSPDenier.o -MD -MP -MF .deps/OCSPDenier.Tpo -c -o OCSPDenier.o `test -f 'http/OCSPDenier.cpp' || echo './'`http/OCSPDenier.cpp mv -f .deps/OCSPDenier.Tpo .deps/OCSPDenier.Po g++ -DPACKAGE_NAME=\"\" -DPACKAGE_TARNAME=\"\" -DPACKAGE_VERSION=\"\" -DPACKAGE_STRING=\"\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE=\"sslsniff\" -DVERSION=\"0.6\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -I. 
-ggdb -g -O2 -MT FirefoxAddonUpdater.o -MD -MP -MF .deps/FirefoxAddonUpdater.Tpo -c -o FirefoxAddonUpdater.o FirefoxAddonUpdater.cpp mv -f .deps/FirefoxAddonUpdater.Tpo .deps/FirefoxAddonUpdater.Po g++ -ggdb -g -O2 -lssl -lboost_filesystem -lpthread -lboost_thread -llog4cpp -o sslsniff SSLConnectionManager.o FirefoxUpdater.o Logger.o SessionCache.o SSLBridge.o HTTPSBridge.o sslsniff.o FingerprintManager.o AuthorityCertificateManager.o TargetedCertificateManager.o CertificateManager.o HttpBridge.o HttpConnectionManager.o HttpHeaders.o UpdateManager.o OCSPDenier.o FirefoxAddonUpdater.o SSLConnectionManager.o: In function `__static_initialization_and_destruction_0': /usr/local/include/boost/system/error_code.hpp:208: undefined reference to `boost::system::get_system_category()' /usr/local/include/boost/system/error_code.hpp:209: undefined reference to `boost::system::get_generic_category()' /usr/local/include/boost/system/error_code.hpp:214: undefined reference to `boost::system::get_generic_category()' /usr/local/include/boost/system/error_code.hpp:215: undefined reference to `boost::system::get_generic_category()' /usr/local/include/boost/system/error_code.hpp:216: undefined reference to `boost::system::get_system_category()' There's more, but I guess there's a post length limit.. Most of them appear related to boost::system so I added -lboost_system to the linker command and got farther: # g++ -ggdb -g -O2 -lssl -lboost_filesystem -lpthread -lboost_thread -llog4cpp -o sslsniff SSLConnectionManager.o FirefoxUpdater.o Logger.o SessionCache.o SSLBridge.o HTTPSBridge.o sslsniff.o FingerprintManager.o AuthorityCertificateManager.o TargetedCertificateManager.o CertificateManager.o HttpBridge.o HttpConnectionManager.o HttpHeaders.o UpdateManager.o OCSPDenier.o FirefoxAddonUpdater.o -lboost_system SSLConnectionManager.o: In function `thread<boost::_bi::bind_t<void, boost::_mfi::mf3<void, SSLConnectionManager, boost::shared_ptr<boost::asio::basic_stream_socket<boost::asio::ip::tcp, boost::asio::stream_socket_service<boost::asio::ip::tcp> > >, boost::asio::ip::basic_endpoint<boost::asio::ip::tcp>, bool>, boost::_bi::list4<boost::_bi::value<SSLConnectionManager*>, boost::_bi::value<boost::shared_ptr<boost::asio::basic_stream_socket<boost::asio::ip::tcp, boost::asio::stream_socket_service<boost::asio::ip::tcp> > > >, boost::_bi::value<boost::asio::ip::basic_endpoint<boost::asio::ip::tcp> >, boost::_bi::value<bool> > > >': /usr/local/include/boost/thread/detail/thread.hpp:191: undefined reference to `boost::thread::start_thread()' SSLConnectionManager.o: In function `~thread_data': /usr/local/include/boost/thread/detail/thread.hpp:40: undefined reference to `boost::detail::thread_data_base::~thread_data_base()' /usr/local/include/boost/thread/detail/thread.hpp:40: undefined reference to `boost::detail::thread_data_base::~thread_data_base()' /usr/local/include/boost/thread/detail/thread.hpp:40: undefined reference to `boost::detail::thread_data_base::~thread_data_base()' /usr/local/include/boost/thread/detail/thread.hpp:40: undefined reference to `boost::detail::thread_data_base::~thread_data_base()' Now the errors are related to boost::detail and boost::filesystem::detail. I've tried using boost 1.35 and 1.42 (latest). On my own Ubuntu system, I installed the libraries from Ubuntu repositories and I was able to compile+link sslsniff just fine. Thanks in advance.
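
One hedged thing to check before suspecting the libraries themselves: with static archives the GNU linker resolves symbols left to right, so -l flags that appear before the object files (as in the link line above) can be skipped entirely, and headers from /usr/local/include paired with libraries from a different prefix produce exactly these boost::system and boost::thread undefined references. A link line along these lines keeps the objects first, the libraries last, and both pointed at the same /usr/local install:

    g++ -ggdb -g -O2 -o sslsniff \
        SSLConnectionManager.o FirefoxUpdater.o Logger.o SessionCache.o \
        SSLBridge.o HTTPSBridge.o sslsniff.o FingerprintManager.o \
        AuthorityCertificateManager.o TargetedCertificateManager.o \
        CertificateManager.o HttpBridge.o HttpConnectionManager.o \
        HttpHeaders.o UpdateManager.o OCSPDenier.o FirefoxAddonUpdater.o \
        -L/usr/local/lib -Wl,-rpath,/usr/local/lib \
        -lssl -lboost_system -lboost_filesystem -lboost_thread -lpthread -llog4cpp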

    Read the article

  • Sampler referencing in HLSL - Sampler parameter must come from a literal expression

    - by user1423893
    The following method works fine when referencing a sampler in HLSL float3 P = lightScreenPos; sampler ShadowSampler = DPFrontShadowSampler; float depth; if (alpha >= 0.5f) { // Reference the correct sampler ShadowSampler = DPFrontShadowSampler; // Front hemisphere 'P0' P.z = P.z + 1.0; P.x = P.x / P.z; P.y = P.y / P.z; P.z = lightLength / LightAttenuation.z; // Rescale viewport to be [0, 1] (texture coordinate space) P.x = 0.5f * P.x + 0.5f; P.y = -0.5f * P.y + 0.5f; depth = tex2D(ShadowSampler, P.xy).x; depth = 1.0 - depth; } else { // Reference the correct sampler ShadowSampler = DPBackShadowSampler; // Back hemisphere 'P1' P.z = 1.0 - P.z; P.x = P.x / P.z; P.y = P.y / P.z; P.z = lightLength / LightAttenuation.z; // Rescale viewport to be [0, 1] (texture coordinate space) P.x = 0.5f * P.x + 0.5f; P.y = -0.5f * P.y + 0.5f; depth = tex2D(ShadowSampler, P.xy).x; depth = 1.0 - depth; } // [Standard Depth Calculation] float mydepth = P.z; shadow = depth + Bias.x < mydepth ? 0.0f : 1.0f; If I try and do anything with the sampler reference outside the if statement then I get the following error: Sampler parameter must come from a literal expression This code demonstrates that float3 P = lightScreenPos; sampler ShadowSampler = DPFrontShadowSampler; if (alpha >= 0.5f) { // Reference the correct sampler ShadowSampler = DPFrontShadowSampler; // Front hemisphere 'P0' P.z = P.z + 1.0; P.x = P.x / P.z; P.y = P.y / P.z; P.z = lightLength / LightAttenuation.z; } else { // Reference the correct sampler ShadowSampler = DPBackShadowSampler; // Back hemisphere 'P1' P.z = 1.0 - P.z; P.x = P.x / P.z; P.y = P.y / P.z; P.z = lightLength / LightAttenuation.z; } // Rescale viewport to be [0, 1] (texture coordinate space) P.x = 0.5f * P.x + 0.5f; P.y = -0.5f * P.y + 0.5f; // [Standard Depth Calculation] float depth = tex2D(ShadowSampler, P.xy).x; depth = 1.0 - depth; float mydepth = P.z; shadow = depth + Bias.x < mydepth ? 0.0f : 1.0f; How can I reference the sampler in this manner without triggering the error?

    Read the article

  • SQLAuthority News – Microsoft SQL Server Protocol Documentation Download

    - by pinaldave
    The Microsoft SQL Server protocol documentation provides detailed technical specifications for Microsoft proprietary protocols (including extensions to industry-standard or other published protocols) that are implemented and used in Microsoft SQL Server to interoperate or communicate with Microsoft products. The documentation includes a set of companion overview and reference documents that supplement the technical specifications with conceptual background, overviews of inter-protocol relationships and interactions, and technical reference information. Microsoft SQL Server Protocol Documentation Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Back to Basics: When does a .NET Assembly Dependency get loaded

    - by Rick Strahl
    When we work on typical day to day applications, it's easy to forget some of the core features of the .NET framework. For me personally it's been a long time since I've learned about some of the underlying CLR system level services even though I rely on them on a daily basis. I often think only about high level application constructs and/or high level framework functionality, but the low level stuff is often just taken for granted. Over the last week at DevConnections I had all sorts of low level discussions with other developers about the inner workings of this or that technology (especially in light of my Low Level ASP.NET Architecture talk and the Razor Hosting talk). One topic that came up a couple of times and ended up a point of confusion even amongst some seasoned developers (including some folks from Microsoft <snicker>) is when assemblies actually load into a .NET process. There are a number of different ways that assemblies are loaded in .NET. When you create a typical project assemblies usually come from: The Assembly reference list of the top level 'executable' project The Assembly references of referenced projects Dynamically loaded at runtime via AppDomain/Reflection loading In addition .NET automatically loads mscorlib (most of the System namespace) the boot process that hosts the .NET runtime in EXE apps, or some other kind of runtime hosting environment (runtime hosting in servers like IIS, SQL Server or COM Interop). In hosting environments the runtime host may also pre-load a bunch of assemblies on its own (for example the ASP.NET host requires all sorts of assemblies just to run itself, before ever routing into your user specific code). Assembly Loading The most obvious source of loaded assemblies is the top level application's assembly reference list. You can add assembly references to a top level application and those assembly references are then available to the application. In a nutshell, referenced assemblies are not immediately loaded - they are loaded on the fly as needed. So regardless of whether you have an assembly reference in a top level project, or a dependent assembly assemblies typically load on an as needed basis, unless explicitly loaded by user code. The same is true of dependent assemblies. To check this out I ran a simple test: I have a utility assembly Westwind.Utilities which is a general purpose library that can work in any type of project. Due to a couple of small requirements for encoding and a logging piece that allows logging Web content (dependency on HttpContext.Current) this utility library has a dependency on System.Web. Now System.Web is a pretty large assembly and generally you'd want to avoid adding it to a non-Web project if it can be helped. So I created a Console Application that loads my utility library: You can see that the top level Console app a reference to Westwind.Utilities and System.Data (beyond the core .NET libs). The Westwind.Utilities project on the other hand has quite a few dependencies including System.Web. I then add a main program that accesses only a simple utillity method in the Westwind.Utilities library that doesn't require any of the classes that access System.Web: static void Main(string[] args) { Console.WriteLine(StringUtils.NewStringId()); Console.ReadLine(); } StringUtils.NewStringId() calls into Westwind.Utilities, but it doesn't rely on System.Web. Any guesses what the assembly list looks like when I stop the code on the ReadLine() command? 
I'll wait here while you think about it… … … So, when I stop on ReadLine() and then fire up Process Explorer and check the assembly list I get: We can see here that .NET has not actually loaded any of the dependencies of the Westwind.Utilities assembly. Also not loaded is the top level System.Data reference even though it's in the dependent assembly list of the top level project. Since this particular function I called only uses core System functionality (contained in mscorlib) there's in fact nothing else loaded beyond the main application and my Westwind.Utilities assembly that contains the method accessed. None of the dependencies of Westwind.Utilities loaded. If you were to open the assembly in a disassembler like Reflector or ILSpy, you would however see all the compiled in dependencies. The referenced assemblies are in the dependency list and they are loadable, but they are not immediately loaded by the application. In other words the C# compiler and .NET linker are smart enough to figure out the dependencies based on the code that actually is referenced from your application and any dependencies cascading down into the dependencies from your top level application into the referenced assemblies. In the example above the usage requirement is pretty obvious since I'm only calling a single static method and then exiting the app, but in more complex applications these dependency relationships become very complicated - however it's all taken care of by the compiler and linker figuring out what types and members are actually referenced and including only those assemblies that are in fact referenced in your code or required by any of your dependencies. The good news here is: That if you are referencing an assembly that has a dependency on something like System.Web in a few places that are not actually accessed by any of your code or any dependent assembly code that you are calling, that assembly is never loaded into memory! Some Hosting Environments pre-load Assemblies The load behavior can vary however. In Console and desktop applications we have full control over assembly loading so we see the core CLR behavior. However other environments like ASP.NET for example will preload referenced assemblies explicitly as part of the startup process - primarily to minimize load conflicts. Specifically ASP.NET pre-loads all assemblies referenced in the assembly list and the /bin folder. So in Web applications it definitely pays to minimize your top level assemblies if they are not used. Understanding when Assemblies Load To clarify and see it actually happen what I described in the first example , let's look at a couple of other scenarios. To see assemblies loading at runtime in real time lets create a utility function to print out loaded assemblies to the console: public static void PrintAssemblies() { var assemblies = AppDomain.CurrentDomain.GetAssemblies(); foreach (var assembly in assemblies) { Console.WriteLine(assembly.GetName()); } } Now let's look at the first scenario where I have class method that references internally uses System.Web. 
In the first scenario lets add a method to my main program like this: static void Main(string[] args) { Console.WriteLine(StringUtils.NewStringId()); Console.ReadLine(); PrintAssemblies(); } public static void WebLogEntry() { var entry = new WebLogEntry(); entry.UpdateFromRequest(); Console.WriteLine(entry.QueryString); } UpdateFromWebRequest() internally accesses HttpContext.Current to read some information of the ASP.NET Request object so it clearly needs a reference System.Web to work. In this first example, the method that holds the calling code is never called, but exists as a static method that can potentially be called externally at some point. What do you think will happen here with the assembly loading? Will System.Web load in this example? No - it doesn't. Because the WebLogEntry() method is never called by the mainline application (or anywhere else) System.Web is not loaded. .NET dynamically loads assemblies as code that needs it is called. No code references the WebLogEntry() method and so System.Web is never loaded. Next, let's add the call to this method, which should trigger System.Web to be loaded because a dependency exists. Let's change the code to: static void Main(string[] args) { Console.WriteLine(StringUtils.NewStringId()); Console.WriteLine("--- Before:"); PrintAssemblies(); WebLogEntry(); Console.WriteLine("--- After:"); PrintAssemblies(); Console.ReadLine(); } public static void WebLogEntry() { var entry = new WebLogEntry(); entry.UpdateFromRequest(); Console.WriteLine(entry.QueryString); } Looking at the code now, when do you think System.Web will be loaded? Will the before list include it? Yup System.Web gets loaded, but only after it's actually referenced. In fact, just until before the call to UpdateFromRequest() System.Web is not loaded - it only loads when the method is actually called and requires the reference in the executing code. Moral of the Story So what have we learned - or maybe remembered again? Dependent Assembly References are not pre-loaded when an application starts (by default) Dependent Assemblies that are not referenced by executing code are never loaded Dependent Assemblies are just in time loaded when first referenced in code All of this is nothing new - .NET has always worked like this. But it's good to have a refresher now and then and go through the exercise of seeing it work in action. It's not one of those things we think about everyday, and as I found out last week, I couldn't remember exactly how it worked since it's been so long since I've learned about this. And apparently I'm not the only one as several other people I had discussions with in relation to loaded assemblies also didn't recall exactly what should happen or assumed incorrectly that just having a reference automatically loads the assembly. The moral of the story for me is: Trying at all costs to eliminate an assembly reference from a component is not quite as important as it's often made out to be. For example, the Westwind.Utilities module described above has a logging component, including a Web specific logging entry that supports pulling information from the active HTTP Context. Adding that feature requires a reference to System.Web. Should I worry about this in the scope of this library? Probably not, because if I don't use that one class of nearly a hundred, System.Web never gets pulled into the parent process. 
IOW, System.Web only loads when I use that specific feature and if I am, well I clearly have to be running in a Web environment anyway to use it realistically. The alternative would be considerably uglier: Pulling out the WebLogEntry class and sticking it into another assembly and breaking up the logging code. In this case - definitely not worth it. So, .NET definitely goes through some pretty nifty optimizations to ensure that it loads only what it needs and in most cases you can just rely on .NET to do the right thing. Sometimes though assembly loading can go wrong (especially when signed and versioned local assemblies are involved), but that's subject for a whole other post… © Rick Strahl, West Wind Technologies, 2005-2012. Posted in .NET, CSharp.

    Read the article

  • ASP.NET and HTML5 Local Storage

    - by Stephen Walther
    My favorite feature of HTML5, hands-down, is HTML5 local storage (aka DOM storage). By taking advantage of HTML5 local storage, you can dramatically improve the performance of your data-driven ASP.NET applications by caching data in the browser persistently. Think of HTML5 local storage like browser cookies, but much better. Like cookies, local storage is persistent. When you add something to browser local storage, it remains there when the user returns to the website (possibly days or months later). Importantly, unlike the cookie storage limitation of 4KB, you can store up to 10 megabytes in HTML5 local storage. Because HTML5 local storage works with the latest versions of all modern browsers (IE, Firefox, Chrome, Safari), you can start taking advantage of this HTML5 feature in your applications right now. Why use HTML5 Local Storage? I use HTML5 Local Storage in the JavaScript Reference application: http://Superexpert.com/JavaScriptReference The JavaScript Reference application is an HTML5 app that provides an interactive reference for all of the syntax elements of JavaScript (You can read more about the application and download the source code for the application here). When you open the application for the first time, all of the entries are transferred from the server to the browser (all 300+ entries). All of the entries are stored in local storage. When you open the application in the future, only changes are transferred from the server to the browser. The benefit of this approach is that the application performs extremely fast. When you click the details link to view details on a particular entry, the entry details appear instantly because all of the entries are stored on the client machine. When you perform key-up searches, by typing in the filter textbox, matching entries are displayed very quickly because the entries are being filtered on the local machine. This approach can have a dramatic effect on the performance of any interactive data-driven web application. Interacting with data on the client is almost always faster than interacting with the same data on the server. Retrieving Data from the Server In the JavaScript Reference application, I use Microsoft WCF Data Services to expose data to the browser. WCF Data Services generates a REST interface for your data automatically. Here are the steps: Create your database tables in Microsoft SQL Server. For example, I created a database named ReferenceDB and a database table named Entities. Use the Entity Framework to generate your data model. For example, I used the Entity Framework to generate a class named ReferenceDBEntities and a class named Entities. Expose your data through WCF Data Services. I added a WCF Data Service to my project and modified the data service class to look like this:   using System.Data.Services; using System.Data.Services.Common; using System.Web; using JavaScriptReference.Models; namespace JavaScriptReference.Services { [System.ServiceModel.ServiceBehavior(IncludeExceptionDetailInFaults = true)] public class EntryService : DataService<ReferenceDBEntities> { // This method is called only once to initialize service-wide policies. public static void InitializeService(DataServiceConfiguration config) { config.UseVerboseErrors = true; config.SetEntitySetAccessRule("*", EntitySetRights.All); config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2; } // Define a change interceptor for the Products entity set. 
[ChangeInterceptor("Entries")] public void OnChangeEntries(Entry entry, UpdateOperations operations) { if (!HttpContext.Current.Request.IsAuthenticated) { throw new DataServiceException("Cannot update reference unless authenticated."); } } } }     The WCF data service is named EntryService. Notice that it derives from DataService<ReferenceEntitites>. Because it derives from DataService<ReferenceEntities>, the data service exposes the contents of the ReferenceEntitiesDB database. In the code above, I defined a ChangeInterceptor to prevent un-authenticated users from making changes to the database. Anyone can retrieve data through the service, but only authenticated users are allowed to make changes. After you expose data through a WCF Data Service, you can use jQuery to retrieve the data by performing an Ajax call. For example, I am using an Ajax call that looks something like this to retrieve the JavaScript entries from the EntryService.svc data service: $.ajax({ dataType: "json", url: “/Services/EntryService.svc/Entries”, success: function (result) { var data = callback(result["d"]); } });     Notice that you must unwrap the data using result[“d”]. After you unwrap the data, you have a JavaScript array of the entries. I’m transferring all 300+ entries from the server to the client when the application is opened for the first time. In other words, I transfer the entire database from the server to the client, once and only once, when the application is opened for the first time. The data is transferred using JSON. Here is a fragment: { "d" : [ { "__metadata": { "uri": "http://superexpert.com/javascriptreference/Services/EntryService.svc/Entries(1)", "type": "ReferenceDBModel.Entry" }, "Id": 1, "Name": "Global", "Browsers": "ff3_6,ie8,ie9,c8,sf5,es3,es5", "Syntax": "object", "ShortDescription": "Contains global variables and functions", "FullDescription": "<p>\nThe Global object is determined by the host environment. In web browsers, the Global object is the same as the windows object.\n</p>\n<p>\nYou can use the keyword <code>this</code> to refer to the Global object when in the global context (outside of any function).\n</p>\n<p>\nThe Global object holds all global variables and functions. For example, the following code demonstrates that the global <code>movieTitle</code> variable refers to the same thing as <code>window.movieTitle</code> and <code>this.movieTitle</code>.\n</p>\n<pre>\nvar movieTitle = \"Star Wars\";\nconsole.log(movieTitle === this.movieTitle); // true\nconsole.log(movieTitle === window.movieTitle); // true\n</pre>\n", "LastUpdated": "634298578273756641", "IsDeleted": false, "OwnerId": null }, { "__metadata": { "uri": "http://superexpert.com/javascriptreference/Services/EntryService.svc/Entries(2)", "type": "ReferenceDBModel.Entry" }, "Id": 2, "Name": "eval(string)", "Browsers": "ff3_6,ie8,ie9,c8,sf5,es3,es5", "Syntax": "function", "ShortDescription": "Evaluates and executes JavaScript code dynamically", "FullDescription": "<p>\nThe following code evaluates and executes the string \"3+5\" at runtime.\n</p>\n<pre>\nvar result = eval(\"3+5\");\nconsole.log(result); // returns 8\n</pre>\n<p>\nYou can rewrite the code above like this:\n</p>\n<pre>\nvar result;\neval(\"result = 3+5\");\nconsole.log(result);\n</pre>", "LastUpdated": "634298580913817644", "IsDeleted": false, "OwnerId": 1 } … ]} I worried about the amount of time that it would take to transfer the records. 
According to Google Chome, it takes about 5 seconds to retrieve all 300+ records on a broadband connection over the Internet. 5 seconds is a small price to pay to avoid performing any server fetches of the data in the future. And here are the estimated times using different types of connections using Fiddler: Notice that using a modem, it takes 33 seconds to download the database. 33 seconds is a significant chunk of time. So, I would not use the approach of transferring the entire database up front if you expect a significant portion of your website audience to connect to your website with a modem. Adding Data to HTML5 Local Storage After the JavaScript entries are retrieved from the server, the entries are stored in HTML5 local storage. Here’s the reference documentation for HTML5 storage for Internet Explorer: http://msdn.microsoft.com/en-us/library/cc197062(VS.85).aspx You access local storage by accessing the windows.localStorage object in JavaScript. This object contains key/value pairs. For example, you can use the following JavaScript code to add a new item to local storage: <script type="text/javascript"> window.localStorage.setItem("message", "Hello World!"); </script>   You can use the Google Chrome Storage tab in the Developer Tools (hit CTRL-SHIFT I in Chrome) to view items added to local storage: After you add an item to local storage, you can read it at any time in the future by using the window.localStorage.getItem() method: <script type="text/javascript"> window.localStorage.setItem("message", "Hello World!"); </script>   You only can add strings to local storage and not JavaScript objects such as arrays. Therefore, before adding a JavaScript object to local storage, you need to convert it into a JSON string. In the JavaScript Reference application, I use a wrapper around local storage that looks something like this: function Storage() { this.get = function (name) { return JSON.parse(window.localStorage.getItem(name)); }; this.set = function (name, value) { window.localStorage.setItem(name, JSON.stringify(value)); }; this.clear = function () { window.localStorage.clear(); }; }   If you use the wrapper above, then you can add arbitrary JavaScript objects to local storage like this: var store = new Storage(); // Add array to storage var products = [ {name:"Fish", price:2.33}, {name:"Bacon", price:1.33} ]; store.set("products", products); // Retrieve items from storage var products = store.get("products");   Modern browsers support the JSON object natively. If you need the script above to work with older browsers then you should download the JSON2.js library from: https://github.com/douglascrockford/JSON-js The JSON2 library will use the native JSON object if a browser already supports JSON. Merging Server Changes with Browser Local Storage When you first open the JavaScript Reference application, the entire database of JavaScript entries is transferred from the server to the browser. Two items are added to local storage: entries and entriesLastUpdated. The first item contains the entire entries database (a big JSON string of entries). The second item, a timestamp, represents the version of the entries. Whenever you open the JavaScript Reference in the future, the entriesLastUpdated timestamp is passed to the server. Only records that have been deleted, updated, or added since entriesLastUpdated are transferred to the browser. 
The OData query to get the latest updates looks like this: http://superexpert.com/javascriptreference/Services/EntryService.svc/Entries?$filter=(LastUpdated%20gt%20634301199890494792L) If you remove URL encoding, the query looks like this: http://superexpert.com/javascriptreference/Services/EntryService.svc/Entries?$filter=(LastUpdated gt 634301199890494792L) This query returns only those entries where the value of LastUpdated > 634301199890494792 (the version timestamp). The changes – new JavaScript entries, deleted entries, and updated entries – are merged with the existing entries in local storage. The JavaScript code for performing the merge is contained in the EntriesHelper.js file. The merge() method looks like this:   merge: function (oldEntries, newEntries) { // concat (this performs the add) oldEntries = oldEntries || []; var mergedEntries = oldEntries.concat(newEntries); // sort this.sortByIdThenLastUpdated(mergedEntries); // prune duplicates (this performs the update) mergedEntries = this.pruneDuplicates(mergedEntries); // delete mergedEntries = this.removeIsDeleted(mergedEntries); // Sort this.sortByName(mergedEntries); return mergedEntries; },   The contents of local storage are then updated with the merged entries. I spent several hours writing the merge() method (much longer than I expected). I found two resources to be extremely useful. First, I wrote extensive unit tests for the merge() method. I wrote the unit tests using server-side JavaScript. I describe this approach to writing unit tests in this blog entry. The unit tests are included in the JavaScript Reference source code. Second, I found the following blog entry to be super useful (thanks Nick!): http://nicksnettravels.builttoroam.com/post/2010/08/03/OData-Synchronization-with-WCF-Data-Services.aspx One big challenge that I encountered involved timestamps. I originally tried to store an actual UTC time as the value of the entriesLastUpdated item. I quickly discovered that trying to work with dates in JSON turned out to be a big can of worms that I did not want to open. Next, I tried to use a SQL timestamp column. However, I learned that OData cannot handle the timestamp data type when doing a filter query. Therefore, I ended up using a bigint column in SQL and manually creating the value when a record is updated. I overrode the SaveChanges() method to look something like this: public override int SaveChanges(SaveOptions options) { var changes = this.ObjectStateManager.GetObjectStateEntries( EntityState.Modified | EntityState.Added | EntityState.Deleted); foreach (var change in changes) { var entity = change.Entity as IEntityTracking; if (entity != null) { entity.LastUpdated = DateTime.Now.Ticks; } } return base.SaveChanges(options); }   Notice that I assign Date.Now.Ticks to the entity.LastUpdated property whenever an entry is modified, added, or deleted. Summary After building the JavaScript Reference application, I am convinced that HTML5 local storage can have a dramatic impact on the performance of any data-driven web application. If you are building a web application that involves extensive interaction with data then I recommend that you take advantage of this new feature included in the HTML5 standard.

    Read the article

  • Integrate BING API for Search inside ASP.Net web application

    - by sreejukg
    As you might already know, Bing is the Microsoft Search engine and is getting popular day by day. Bing offers APIs that can be integrated into your website to increase your website functionality. At this moment, there are two important APIs available. They are Bing Search API Bing Maps The Search API enables you to build applications that utilize Bing’s technology. The API allows you to search multiple source types such as web; images, video etc. and supports various output prototypes such as JSON, XML, and SOAP. Also you will be able to customize the search results as you wish for your public facing website. Bing Maps API allows you to build robust applications that use Bing Maps. In this article I am going to describe, how you can integrate Bing search into your website. In order to start using Bing, First you need to sign in to http://www.bing.com/toolbox/bingdeveloper/ using your windows live credentials. Click on the Sign in button, you will be asked to enter your windows live credentials. Once signed in you will be redirected to the Developer page. Here you can create applications and get AppID for each application. Since I am a first time user, I don’t have any applications added. Click on the Add button to add a new application. You will be asked to enter certain details about your application. The fields are straight forward, only thing you need to note is the website field, here you need to enter the website address from where you are going to use this application, and this field is optional too. Of course you need to agree on the terms and conditions and then click Save. Once you click on save, the application will be created and application ID will be available for your use. Now we got the APP Id. Basically Bing supports three protocols. They are JSON, XML and SOAP. JSON is useful if you want to call the search requests directly from the browser and use JavaScript to parse the results, thus JSON is the favorite choice for AJAX application. XML is the alternative for applications that does not support SOAP, e.g. flash/ Silverlight etc. SOAP is ideal for strongly typed languages and gives a request/response object model. In this article I am going to demonstrate how to search BING API using SOAP protocol from an ASP.Net application. For the purpose of this demonstration, I am going to create an ASP.Net project and implement the search functionality in an aspx page. Open Visual Studio, navigate to File-> New Project, select ASP.Net empty web application, I named the project as “BingSearchSample”. Add a Search.aspx page to the project, once added the solution explorer will looks similar to the following. Now you need to add a web reference to the SOAP service available from Bing. To do this, from the solution explorer, right click your project, select Add Service Reference. Now the new service reference dialog will appear. In the left bottom of the dialog, you can find advanced button, click on it. Now the service reference settings dialog will appear. In the bottom left, you can find Add Web Reference button, click on it. The add web reference dialog will appear now. Enter the URL as http://api.bing.net/search.wsdl?AppID=<YourAppIDHere>&version=2.2 (replace <yourAppIDHere> with the appID you have generated previously) and click on the button next to it. This will find the web service methods available. You can change the namespace suggested by Bing, but for the purpose of this demonstration I have accepted all the default settings. 
Click on the Add reference button once you are done. Now the web reference to the Search service will be added to your project. You can find this under the Solution Explorer of your project. Now, in the Search.aspx page that you previously created, place a textbox, a button and a grid view. For the purpose of this demonstration, I have given the identifiers (ID) as txtSearch, btnSearch, gvSearch respectively. The idea is to search the text entered in the text box using the Bing service and show the results in the grid view. In the design view, the search.aspx looks as follows. In the search.aspx.cs page, add a using statement that points to net.bing.api. I have added the following code for the button click event handler. The code is very straightforward. It just calls the service with your AppID, a query to search and a source for searching. Let us run this page and see the output when I enter Microsoft in my textbox. If you want to search a specific site, you can include the site name in the query parameter. For example, the following query will search for the word Microsoft on the www.microsoft.com website. searchRequest.Query = "site:www.microsoft.com Microsoft"; The output of this query is as follows. Integrating the Bing Search API into your website is easy and there is no limit on the customization of the interface you can do. There is no Bing branding required, so I believe this is a great option for web developers when they plan for site search.
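
The post mentions the button click handler without showing the snippet, so here is a hedged reconstruction against the proxy generated from search.wsdl (the net.bing.api namespace used above); the type and member names (BingService, SearchRequest, SourceType.Web, response.Web.Results) follow the legacy Bing API 2.x WSDL and should be treated as assumptions if your generated proxy differs.

    using System;
    using net.bing.api;   // web reference namespace from the article

    public partial class Search : System.Web.UI.Page
    {
        protected void btnSearch_Click(object sender, EventArgs e)
        {
            SearchRequest request = new SearchRequest();
            request.AppId = "<your AppID>";
            request.Query = txtSearch.Text;                 // e.g. "site:www.microsoft.com Microsoft"
            request.Sources = new SourceType[] { SourceType.Web };

            using (BingService service = new BingService())
            {
                SearchResponse response = service.Search(request);
                gvSearch.DataSource = response.Web.Results;  // WebResult[]: Title, Description, Url, ...
                gvSearch.DataBind();
            }
        }
    }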

    Read the article

  • How to reproject a shapefile from WGS 84 to Spherical/Web Mercator projection.

    - by samkea
    Definitions: You will need to know the meaning of these terms below. I have given a small description of the acronyms, but you can Google and learn more about them. #1: WGS-84 (World Geodetic System 1984) is a standard reference coordinate system used for Cartography, Geodesy and Navigation. #2: EPSG (European Petroleum Survey Group) was a scientific organization with ties to the European petroleum industry, consisting of specialists working in applied geodesy, surveying, and cartography related to oil exploration. EPSG::4326 is a common coordinate reference system that refers to WGS84 as (latitude, longitude) pair coordinates in degrees with Greenwich as the central meridian. Any degree representation (e.g., decimal or DMSH: degrees minutes seconds hemisphere) may be used. Which degree representation is used must be declared for the user by the supplier of data. So, the Spherical/Web Mercator projection is referred to as EPSG::3785, which is renamed to EPSG:900913 by Google for use in Google Maps. The associated CRS (Coordinate Reference System) for this is the "Popular Visualisation CRS / Mercator". This is the kind of projection that is used by Google Maps, Bing Maps, OSM, Virtual Earth, DeepEarth, et cetera, to show interactive maps over the web with their nearly precise coordinates.  Reprojection: After reading a lot about reprojecting my coordinates from the DeepEarth project on CodePlex, I still could not do it. After some help from a colleague, I got the ball rolling. This is how I did it. #1 You need to download and open your shapefile using Q-GIS; it's the one with the biggest number of coordinate reference systems/projections. #2 Use the Plugins menu, and enable fTools and the WFS plugin. #3 Use the Vector menu --> Data Management Tools and choose Define current projection. Enable "use predefined reference system" and choose the WGS 84 coordinate system. I am personally in zone 36, so I chose WGS84-UTM Zone 36N under (Projected Coordinate Systems --> Universal Transverse Mercator) and click OK. #4 Now use the Vector menu --> Data Management Tools and choose Export to new projection. The same dialog will pop up. Now choose WGS 84 EPSG::4326 under Geodetic Coordinate Systems. My input user-defined Spatial Reference System looks like this: +proj=tmerc +lat_0=0 +lon_0=33 +k=0.9996 +x_0=500000 +y_0=200000 +ellps=WGS84 +datum=WGS84 +units=m +no_defs Your output user-defined Spatial Reference System should look like this: +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs Browse to the place where the shapefile is going to be and give the shapefile a name (like original_reprojected). If it prompts you to add the projected layer to the TOC, accept. There, you have your reprojected map with a latitude and longitude pair of coordinates. #5 Now, this is not the actual Spherical/Web Mercator projection, but don't worry, this is where you have to stop. All the other custom web-mapping portals will pick this projection and transform it into EPSG::3785 or EPSG:900913, but the coordinates will still remain as the LatLon pair of the projected shapefile. If you want to test a particular known point, Q-GIS has a lot of room for that. Go ahead and test it.

    Read the article

  • Upgraded Linux, now CMS Made Simple is spewing errors

    - by Paul Tomblin
    I upgraded my host from Debian Lenny to Debian Squeeze, and now my CMS Made Simple site is spewing PHP errors all over the screen. I thought I'd upgrade the CMS because I haven't done so in a while, but Google Chrome tells me that the CMS Made Simple site is infested with malware. What are my options now? Example errors: Deprecated: Assigning the return value of new by reference is deprecated in /www/danmurn/cms/include.php on line 73 Deprecated: Assigning the return value of new by reference is deprecated in /www/danmurn/cms/include.php on line 162 Deprecated: Assigning the return value of new by reference is deprecated in /www/danmurn/cms/include.php on line 240 Warning: session_start() [function.session-start]: Cannot send session cookie - headers already sent by (output started at /www/danmurn/cms/include.php:73) in /www/danmurn/cms/include.php on line 34 Warning: session_start() [function.session-start]: Cannot send session cache limiter - headers already sent (output started at /www/danmurn/cms/include.php:73) in /www/danmurn/cms/include.php on line 34 Deprecated: Function set_magic_quotes_runtime() is deprecated in /www/danmurn/cms/include.php on line 62 Deprecated: Assigning the return value of new by reference is deprecated in /www/danmurn/cms/lib/classes/class.global.inc.php on line 184 Deprecated: Assigning the return value of new by reference is deprecated in /www/danmurn/cms/lib/classes/class.global.inc.php on line 196
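
    For context, Squeeze ships PHP 5.3, and these notices come from two patterns that older CMS Made Simple releases rely on; a minimal sketch of what PHP is objecting to (not the actual CMSMS source, whose include.php line numbers appear in the errors) is below. Once the notices have been printed, session_start() can no longer send its headers, which explains the follow-on warnings; upgrading CMSMS to a release built for PHP 5.3 is the usual way out.

      <?php
      class Config {}

      $config =& new Config();     // Deprecated in PHP 5.3: assigning the return
                                   // value of new by reference
      $config = new Config();      // Fine: objects are handles in PHP 5, so =& buys nothing

      set_magic_quotes_runtime(0); // Deprecated in PHP 5.3
      ?>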

    Read the article

  • Recent ALM Rangers Summary Posts

    - by Enrique Lima
    Willy-Peter Schaub has been a machine, producing and posting content. There are real gems of information to be found in what he has been posting. He has created some Summary Posts on the specific topics the ALM Rangers are working on. Here is the list of quick access TOC Posts. TOC: “Tags” a la acronyms … what do they all mean? TOC: TFS Integration Tools Blog Posts and Reference Sites TOC: TFS Iteration Automation Blog Posts and Reference Sites TOC: Virtual Machine (VM) Factory TOC: Build Customization Guide Blog Posts and Reference Sites TOC: Lab Management Guide Blog Posts and Reference Sites

    Read the article
