Search Results

Search found 16189 results on 648 pages for 'document conversion'.

  • Flatten Word document

    - by user126389
    I have a document with some precise formatting, created in Word. This doc was converted to PDF for distribution. Now the original is lost, and converting back to Word using a PDF-to-Word add-in from Microsoft results in many text boxes in the new DOC file. How can I 'flatten' the result to remove the text boxes and retain most of the formatting, so that I can update the contents? Recreating the original formatting by hand would take a long time.

  • Magento Onepage Success Conversion Tracking Design Pattern

    - by user1734954
    My intent is to track conversions through multiple channels by inserting third-party JavaScript (for example Google Analytics, Optimizely, PriceGrabber, etc.) into the footer of onepage success. I've accomplished this by adding a block to the footer reference inside the checkout success node within local.xml, and everything works appropriately. My questions are more about efficiency and extensibility.

    It occurred to me that it would be better to combine all of the blocks into a single block reference, and then use various methods acting on a single call to the related models to provide the data needed for insertion into the JavaScript of each conversion-tracking script. Some examples of the common data that conversion tracking may rely on (pseudo): Order ID, Order Total, Order.LineItem.Name (foreach), and so on.

    Currently, for each of the scripts I make a call to the appropriate model, passing the customer's last order id as the load value, then call a get(), assign the return value to a variable, and iterate through the data to match the values with the expectations of the given third-party service. All of the data should be pulled once when checkout is complete; each third-party service may expect different data in different formats.

    Here is an example of one of the conversion-tracking template files, which loads in the footer of checkout success:

        <?php
        $order = Mage::getModel('sales/order')->loadByIncrementId(
            Mage::getSingleton('checkout/session')->getLastRealOrderId());
        $amount = number_format($order->getGrandTotal(), 2);
        $customer = Mage::helper('customer')->getCustomer()->getData();
        ?>
        <script type="text/javascript">
        popup_email = '<?php echo($customer['email']);?>';
        popup_order_number = '<?php echo $this->getOrderId() ?>';
        </script>
        <!-- PriceGrabber Merchant Evaluation Code -->
        <script type="text/javascript" charset="UTF-8" src="https://www.pricegrabber.com/rating_merchrevpopjs.php?retid=<something>"></script>
        <noscript><a href="http://www.pricegrabber.com/rating_merchrev.php?retid=<something>" target=_blank>
        <img src="https://images.pricegrabber.com/images/mr_noprize.jpg" border="0" width="272" height="238" alt="Merchant Evaluation"></a></noscript>
        <!-- End PriceGrabber Code -->

    Having just a single piece of code like this is not that big of a deal, but we are doing similar things with a number of different third-party services, and PriceGrabber is one of the simpler examples. A more sophisticated tracking service expects a comma-separated list of all of the product names, ids, prices, categories, order id, and so on. I would like to make it all more manageable, so my idea is to do the following: combine all of the template files into a single file, and develop a helper class or library to deliver the data to the conversion template. Goals include extensibility, minimal model calls, and minimal method calls.

    The questions: 1. Is a Mage helper the best route to take? 2. Is there any design pattern you would recommend for the "helper" class? 3. Why would the design pattern you've chosen be best for this instance?
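
    For illustration, a rough sketch of the sort of consolidated helper described above (the module and class names are hypothetical; only the Mage::* calls and the order/item getters are standard Magento 1 API):

        <?php
        // Hypothetical helper: load the last order once and expose one
        // normalized data array that every tracker template can reformat.
        class Mycompany_Tracking_Helper_Data extends Mage_Core_Helper_Abstract
        {
            protected $_order;

            public function getOrder()
            {
                if ($this->_order === null) {
                    $incrementId = Mage::getSingleton('checkout/session')->getLastRealOrderId();
                    $this->_order = Mage::getModel('sales/order')->loadByIncrementId($incrementId);
                }
                return $this->_order;
            }

            public function getConversionData()
            {
                $order = $this->getOrder();
                $items = array();
                foreach ($order->getAllVisibleItems() as $item) {
                    $items[] = array(
                        'name'  => $item->getName(),
                        'sku'   => $item->getSku(),
                        'price' => $item->getPrice(),
                        'qty'   => $item->getQtyOrdered(),
                    );
                }
                return array(
                    'order_id' => $order->getIncrementId(),
                    'total'    => number_format($order->getGrandTotal(), 2),
                    'items'    => $items,
                );
            }
        }

    A single template could then call Mage::helper('mycompany_tracking')->getConversionData() once and hand the array to each tracker-specific formatter.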

  • Update a document onto existing google document using Zend framework

    - by Ali
    Hi guys, I'm dabbling in Google Docs with the Zend GData library and have succeeded in uploading documents to Google Docs. However, I would like to know how it would be possible for me to upload a document and overwrite an existing document on Google Docs. Assume that I just have the docid which refers to the document on Google Docs. Thanks again. I'm using PHP and the Zend GData libraries.
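
    For reference, a rough sketch of the approach as I understand the Zend_Gdata API; the documents feed URL and the 'edit-media' link handling here are assumptions to verify against the GData documentation, not a confirmed recipe:

        <?php
        require_once 'Zend/Loader.php';
        Zend_Loader::loadClass('Zend_Gdata_Docs');
        Zend_Loader::loadClass('Zend_Gdata_ClientLogin');

        // Authenticate against the Docs service ('writely' in ClientLogin terms).
        $client = Zend_Gdata_ClientLogin::getHttpClient($user, $pass,
            Zend_Gdata_Docs::AUTH_SERVICE_NAME);
        $docs = new Zend_Gdata_Docs($client);

        // Fetch the existing entry by its docid (feed URL format assumed).
        $uri = 'https://docs.google.com/feeds/documents/private/full/document%3A' . $docId;
        $entry = $docs->getDocumentListEntry($uri);

        // Overwrite the document body by PUTting new content to the entry's
        // 'edit-media' link; this is the step that replaces rather than re-creates.
        $media = $docs->newMediaFileSource('updated.doc');
        $media->setContentType('application/msword');
        $editMediaUrl = $entry->getLink('edit-media')->getHref();
        $docs->put($media, $editMediaUrl);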

  • Currency Conversion in Oracle BI applications

    - by Saurabh Verma
    Authored by Vijay Aggarwal and Hichem Sellami

    A typical data warehouse contains Star and/or Snowflake schemas, made up of Dimensions and Facts. The facts store various numerical information, including amounts, for example Order Amount, Invoice Amount, etc. With the truly global nature of business nowadays, end users want to view reports in their own currency or in a global/common currency as defined by their business. This presents a unique opportunity in BI to provide the amounts in converted rates, either by pre-storing them or by doing on-the-fly conversions while displaying the reports to the users.

    Source Systems

    OBIA caters to various source systems like EBS, PSFT, Siebel, JDE, Fusion, etc. Each source has its own unique and intricate ways of defining and storing currency data, doing currency conversions, and presenting them to the OLTP users. For example, EBS stores conversion rates between currencies which can be classified by rate type, like Corporate rate, Spot rate, Period rate, etc.; Siebel stores exchange rates by rate types like Daily. EBS/Fusion stores the conversion rates for each day, whereas PSFT/Siebel store them for a range of days. PSFT has a Rate Multiplication Factor and a Rate Division Factor and the rate must be calculated from them, whereas the other source systems store the currency exchange rate directly.

    OBIA Design

    The consolidation of data from various disparate source systems poses the challenge of conforming various currencies, rate types, exchange rates, etc., and designing the best way to present the amounts to the users without affecting performance. When consolidating the data for reporting in OBIA, we have designed mechanisms in the Common Dimension to allow users to report based on their required currencies.

    OBIA facts store amounts in various currencies:

    Document Currency: This is the currency of the actual transaction. For a multinational company, this can be in various currencies.

    Local Currency: This is the base currency in which the accounting entries are recorded by the business. This is generally defined in the Ledger of the company.

    Global Currencies: OBIA provides five Global Currencies. Three are used across all modules; the last two are for CRM only. A Global Currency is very useful when creating reports where the data is viewed enterprise-wide. For example, a US-based multinational would want to see the reports in USD, so the company would choose USD as one of the global currencies. OBIA allows users to define up to five global currencies during the initial implementation.

    The term Currency Preference is used to designate the set of values: Document Currency, Local Currency, Global Currency 1, Global Currency 2, Global Currency 3; these are shared among all modules. There are four more currency preferences, specific to certain modules: Global Currency 4 (aka CRM Currency) and Global Currency 5, which are used in CRM; and Project Currency and Contract Currency, used in Project Analytics.

    When choosing Local Currency as the currency preference, the data will show in the currency of the Ledger (or Business Unit) in the prompt, so it is important to select one Ledger or Business Unit when viewing data in Local Currency. More on this can be found in the section Changing Currency Preferences in the Dashboard.

    Design Logic

    When extracting the fact data, the OOTB mappings extract and load the document amount and the local amount into the target tables. They also load the exchange rates required to convert the document amount into the corresponding global amounts.
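
    To make the mechanics concrete, the lookup implied by this design has roughly the following shape (illustrative SQL only: W_EXCH_RATE_G is the real table described in the next paragraphs, but the fact source and column names here are placeholders, and the production logic lives in the ETL mappings):

        -- Illustrative only: resolve a document-currency amount to local currency.
        SELECT f.doc_amount * r.exchange_rate AS local_amount
        FROM   some_fact_staging f          -- placeholder fact source
        JOIN   W_EXCH_RATE_G r
          ON   r.from_currency = f.doc_currency_code
         AND   r.to_currency   = f.local_currency_code
         AND   r.rate_type     = f.rate_type
         AND   f.transaction_date BETWEEN r.start_date AND r.end_date;
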
    If the source system only provides the document amount in the transaction, the extract mapping does a lookup to get the local currency code and the local exchange rate; the load mapping then uses the local currency code and rate to derive the local amount. The load mapping also fetches the Global Currencies and looks up the corresponding exchange rates.

    The lookup of exchange rates is done via the Exchange Rate Dimension, provided as a Common/Conforming Dimension in OBIA. The Exchange Rate Dimension stores the exchange rates between various currencies for a date range and Rate Type. Two physical tables, W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G, are used to provide the lookups and conversions between currencies; the data is loaded from the source system's Ledger tables. W_EXCH_RATE_G stores the exchange rates between currencies with a date range, whereas W_GLOBAL_EXCH_RATE_G stores the currency conversions between the document currency and the pre-defined five Global Currencies for each day. Based on the requirements, the fact mappings can decide to use one or both tables to do the conversion.

    Currency design in OBIA also taps into the MLS and Domain architecture, allowing users to map the currencies to a universal Domain during implementation. This is especially important for companies deploying and using OBIA with multiple source adapters.

    Some Gotchas to Look for

    It is necessary to think the currencies through during the initial implementation.

    1) Identify the various types of currencies that are used by your business. Understand what your Local (or Base) and Document currencies will be. Identify the various global currencies in which your users will want to see reports; this will be based on the global nature of your business. Changes to these currencies later in the project, while permitted, may cause full data loads and hence lost time.

    2) If you have a multi-source system, make sure that the Global Currencies and Global Rate Types chosen in Configuration Manager have the corresponding source-specific counterparts. In other words, make sure that for every DW-specific value chosen for Currency Code or Rate Type, there is a source Domain mapping already done.

    Technical Section

    This section briefly describes the technical scenarios employed in the OBIA adaptors to extract data from each source system. As explained in the previous sections, OBIA has two main tables which store the currency rate information: W_EXCH_RATE_G, which stores all the currency conversions present in the source system for a date range, and W_GLOBAL_EXCH_RATE_G, which stores global currency conversions at a daily level. The challenge for the latter is to store all 5 Global Currency exchange rates in a single record for each From Currency. Let's voyage further into the source-system extraction logic for each of these tables and understand the flow briefly.

    EBS

    In EBS, currency data is stored in the GL_DAILY_RATES table. As the name indicates, GL_DAILY_RATES has data at a daily level. In the warehouse, however, we store the data with a date range and insert a new range record only when the exchange rate changes for a particular From Currency, To Currency and Rate Type. Below are the main logical steps that we employ in this process:

    Step 0 (incremental flow only): Clean up the data in W_EXCH_RATE_G. Delete the records which have Start Date > minimum conversion date, and update the End Date of the existing records.
    Step 1: Compress the daily data from the GL_DAILY_RATES table into range records. The incremental map uses $$XRATE_UPD_NUM_DAY as an extra parameter.

    Step 2: Generate the Previous Rate, Previous Date and Next Date for each daily record from the OLTP.

    Step 3: Filter out the records whose Conversion Rate is the same as the Previous Rate, or whose Conversion Date lies within a single-day range.

    Step 4: Mark the records as 'Keep' or 'Filter', and also get the final End Date for the single range record (the unique combination of From Date, To Date, Rate and Conversion Date).

    Step 5: Filter the records marked as 'Filter' in the INFA map.

    Steps 1-5 load W_EXCH_RATE_GS; Step 0 updates/deletes W_EXCH_RATE_G directly. The SIL map then inserts/updates the GS data into W_EXCH_RATE_G. Together, these steps convert the daily records in GL_DAILY_RATES to range records in W_EXCH_RATE_G.

    We do not need such special logic for loading W_GLOBAL_EXCH_RATE_G, which stores data at a daily granular level. However, we do need to pivot the data, because data present in multiple rows in the source tables needs to be stored in different columns of the same row in the DW. We use GROUP BY and CASE logic to achieve this.

    Fusion

    Fusion has extraction logic very similar to EBS. The only difference is that the cleanup logic mentioned in Step 0 above does not use the $$XRATE_UPD_NUM_DAY parameter. In Fusion we bring in all the exchange rates in incremental runs as well and do the cleanup; the SIL then takes care of inserts/updates accordingly.

    PeopleSoft

    PeopleSoft does not have From Date and To Date explicitly in the source tables. Let's look at an example (note that this is achieved from PS1 onwards only):

        1 Jan 2010 - USD to INR - 45
        31 Jan 2010 - USD to INR - 46

    PSFT stores records in the above fashion, which means that the exchange rate of 45 for USD to INR is applicable from 1 Jan 2010 through 30 Jan 2010. We need to store the data in this range fashion in the DW. PSFT also stores the exchange rate as RATE_MULT and RATE_DIV, and we need to compute RATE_MULT/RATE_DIV to get the correct exchange rate.

    We generate the From Date and To Date while extracting data from the source, and this has certain assumptions: if a record gets updated/inserted in the source, it will be extracted in incremental. If this updated/inserted record falls between other dates, then we also extract the preceding and succeeding records (based on dates), because we need to regenerate the range records and there are now 3 records whose ranges have changed. Taking the same example as above, if a new record is inserted on 15 Jan 2010, the new ranges are 1 Jan to 14 Jan, 15 Jan to 30 Jan, and 31 Jan to the next available date. Even though the 1 Jan and 31 Jan records have not changed, we still extract them because their ranges are affected.

    Similar logic is used for the Global Exchange Rate extraction: we create the range records and load them into a temporary table, then join to the Day Dimension, create individual daily records, and pivot the data to get the 5 Global Exchange Rates for each From Currency, Date and Rate Type.
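
    The pivot just described can be pictured in plain SQL, setting aside the Informatica specifics (an illustrative sketch only: the staging table and column names are placeholders, and the five global currencies are whatever was configured, here assumed to be USD, EUR, GBP, JPY and AUD):

        -- Illustrative GROUP BY / CASE pivot into 5 global-rate columns.
        SELECT from_currency,
               conversion_date,
               rate_type,
               MAX(CASE WHEN to_currency = 'USD' THEN exchange_rate END) AS global_rate_1,
               MAX(CASE WHEN to_currency = 'EUR' THEN exchange_rate END) AS global_rate_2,
               MAX(CASE WHEN to_currency = 'GBP' THEN exchange_rate END) AS global_rate_3,
               MAX(CASE WHEN to_currency = 'JPY' THEN exchange_rate END) AS global_rate_4,
               MAX(CASE WHEN to_currency = 'AUD' THEN exchange_rate END) AS global_rate_5
        FROM   daily_rates_staging
        GROUP BY from_currency, conversion_date, rate_type;
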
    Siebel

    Siebel facts depend heavily on Global Exchange Rates, and almost none of them really use individual exchange rates. In other words, W_GLOBAL_EXCH_RATE_G is the main table used in Siebel from the PS1 release onwards.

    As of January 2002, the Euro Triangulation method for converting between currencies belonging to EMU members is no longer needed for present and future currency exchanges. However, the method is still available in Siebel applications, as are the old currencies, so that historical data can be maintained accurately. The following description applies only to historical data needing conversion prior to the 2002 switch to the Euro for the EMU member countries. If a country is a member of the European Monetary Union (EMU), you should convert its currency to other currencies through the Euro. This is called triangulation, and it is used whenever either currency being converted has EMU Triangulation checked. Because of this, there are multiple extraction flows in SEBL, i.e. EUR to EMU, EUR to NonEMU, EUR to DMC and so on, and we load W_EXCH_RATE_G through multiple flows with these data. This has been kept the same as in previous versions of OBIA. W_GLOBAL_EXCH_RATE_G, being a new table, does not have such needs. However, SEBL, like PSFT, does not have From Date and To Date columns in the source tables, so we use extraction logic similar to that explained in the PeopleSoft section.

    What if all 5 Global Currencies configured are the same?

    As mentioned in previous sections, from PS1 onwards we store Global Exchange Rates in the W_GLOBAL_EXCH_RATE_G table. The extraction logic for this table involves pivoting data from multiple rows into a single row with 5 Global Exchange Rates in 5 columns, using the CASE and GROUP BY logic described above. This approach poses a unique problem when all 5 Global Currencies chosen are the same. For example, if the user configures all 5 Global Currencies as 'USD', the extract logic will not be able to generate a record for From Currency = USD, because not all source systems will have a USD->USD conversion record. We have _Generated mappings to take care of this case: they generate a record with Conversion Rate = 1.

    Reusable Lookups

    Before PS1, we had a Mapplet for currency conversions. In PS1, we only have reusable lookups: LKP_W_EXCH_RATE_G and LKP_W_GLOBAL_EXCH_RATE_G. These lookups carry another layer of logic so that all the lookup conditions are met when they are used in the various fact mappings. Any user who wants to look up W_EXCH_RATE_G or W_GLOBAL_EXCH_RATE_G must use these lookups; a direct join or lookup on the tables might return wrong data.

    Changing Currency Preferences in the Dashboard

    In the 796x series, all amount metrics in OBIA showed the Global1 amount, and customers needed to change the metric definitions to show them in another currency preference. Project Analytics has supported currency preferences since the 7.9.6 release, though, and published a Tech Note for customers of other modules describing how to add toggling between currency preferences to the solution.

    List of Currency Preferences

    Starting from the 11.1.1.x release, the BI Platform added a new feature to support multiple currencies. A new session variable (PREFERRED_CURRENCY) is populated through a newly introduced currency prompt. This prompt can take its values from the file userpref_currencies_OBIA.xml, which is hosted in the BI Server installation folder under <home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1\userpref_currencies.xml. This file contains the list of currency preferences, like "Local Currency", "Global Currency 1", and so on, which customers can also rename to give them more meaningful business names. There are two options for showing the list of currency preferences to the user in the dashboard: Static and Dynamic.
    In Static mode, all users see the full list as it appears in the user preference currencies file. In Dynamic mode, the list shown in the currency prompt drop-down is the result of a dynamic query specified in the same file. Customers can build some security into the RPD so that the list of currency preferences is based on the user's roles; BI Applications built a subject area, "Dynamic Currency Preference", to run this query and give every user only the list of currency preferences required by his application roles.

    Adding Currency to an Amount Field

    When the user selects one of the items from the currency prompt, all the amounts on that page show in the currency corresponding to that preference. For example, if the user selects "Global Currency 1" from the prompt, all data shows in Global Currency 1 as specified in the Configuration Manager. If the user selects "Local Currency", all amount fields show in the currency of the Business Unit selected in the BU filter of the same page. If no particular Business Unit is selected in that filter, and the data selected by the query contains amounts in more than one currency (for example, one BU has USD as its functional currency and another has EUR), then subtotals will not be available (USD and EUR amounts cannot be added in one field), and depending on the setup (see the next paragraph) the user may receive an error.

    There are two ways to add the currency field to an amount metric:

    1. In the form of a currency code, like USD or EUR. For this, the user needs to add the field "Apps Common Currency Code" to the report. This field is available in every subject area, usually under the table "Currency Tag" or "Currency Code".

    2. In the form of a currency symbol ($ for USD, € for EUR, and so on). For this, the user needs to format the amount metrics in the report as a currency column by specifying the currency tag column in the Column Properties option of the Column Actions drop-down list. Typically this column should be the "BI Common Currency Code" available in every subject area. Select the Column Properties option in the Edit list of a metric; in the Data Format tab, select Custom as "Treat Number As", and enter the following syntax under Custom Number Format:

        [$:currencyTagColumn=Subjectarea.table.column]

    where column is the "BI Common Currency Code" defined to take the currency code value based on the currency preference chosen by the user in the currency preference prompt.
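
    For instance, with placeholder names (not a shipped subject area; exact quoting requirements should be checked against the platform documentation), a filled-in version of that syntax might look like:

        [$:currencyTagColumn="Financials - AP Transactions"."Currency Tag"."BI Common Currency Code"]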

  • javascript arrays and type conversion inconsistencies

    - by ForYourOwnGood
    I have been playing with JavaScript arrays and I have run into what I feel are some inconsistencies; I hope someone can explain them for me. Let's start with this:

        var myArray = [1, 2, 3, 4, 5];
        document.write("Length: " + myArray.length + "<br />");
        for (var i in myArray) {
            document.write("myArray[" + i + "] = " + myArray[i] + "<br />");
        }
        document.write(myArray.join(", ") + "<br /><br />");

    which outputs:

        Length: 5
        myArray[0] = 1
        myArray[1] = 2
        myArray[2] = 3
        myArray[3] = 4
        myArray[4] = 5
        1, 2, 3, 4, 5

    There is nothing special about this code, but I understand that a JavaScript array is an object, so properties may be added to it. The way these properties are added to an array seems inconsistent to me. Before continuing, let me note how string values are converted to number values in JavaScript: a nonempty string becomes the numeric value of the string, or NaN; an empty string becomes 0. So, since a JavaScript array is an object, the following is legal:

        myArray["someThing"] = "someThing";
        myArray[""] = "Empty String";
        myArray["4"] = "four";
        for (var i in myArray) {
            document.write("myArray[" + i + "] = " + myArray[i] + "<br />");
        }
        document.write(myArray.join(", ") + "<br /><br />");

    which outputs:

        Length: 5
        myArray[0] = 1
        myArray[1] = 2
        myArray[2] = 3
        myArray[3] = 4
        myArray[4] = four
        myArray[someThing] = someThing
        myArray[] = Empty String
        1, 2, 3, 4, four

    The output is unexpected. The nonempty string "4" is converted into its numeric value when setting the property myArray["4"]; this seems right. However, the empty string "" is not converted into its numeric value, 0; it is treated as an empty string. Also, the nonempty string "someThing" is not converted to its numeric value, NaN; it is treated as a string. So which is it? Is the expression inside myArray[] evaluated in numeric or string context? And why are the two non-numeric properties of myArray not included in myArray.length and myArray.join(", ")?
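
    A small experiment that bears on the question: in JavaScript, property keys are all strings under the hood, and only keys that are the canonical string form of an array index (0, 1, 2, ...) count toward length (easily verified in a browser console):

        var a = [];
        a["2"] = "x";             // "2" is the canonical string form of index 2
        console.log(a.length);    // 3: treated as an array index

        a["02"] = "y";            // "02" !== String(2), so it is not an index
        console.log(a.length);    // still 3: stored as a plain property

        a[""] = "z";              // "" is not an index form at all
        console.log(a.length);    // still 3
        console.log(a.join(",")); // ",,x": join only visits indices 0..length-1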

  • Sharepoint Document Library not receiving emails

    - by ria
    I have created a SharePoint document library which is email enabled. However, when I send email to the designated email address from anywhere, I don't receive the email and attachment in the list. I have done some R&D and I have found that in order to receive email from anywhere, I have to expose the DNS of the SharePoint site to the outside world. I don't know whether this applies to the email address designated to me in my Active Directory profile as well (my company domain email address). How can I test that email reception is working in the document library? I have tried sending an email from the SharePoint site and it works fine, so the SMTP settings are correctly done.

  • Using Default Document with Forms Authentication

    - by John Rabotnik
    I have a site hosted on IIS7 with the default document specified as default.aspx. This works fine, but my app uses Forms Authentication and I want to disable Anonymous Authentication completely. When I disable anonymous authentication for everything except the login page, everything works fine, but the default document setting stops working. With Anonymous Authentication switched on, if I visit http://mysite I get passed to http://mysite/default.aspx (which then redirects to the login page if the user hasn't already logged in). If I disable anonymous authentication (leaving only forms-based auth enabled) and visit http://mysite, I get a permission denied page from IIS. Yet if I visit http://mysite/default.aspx directly, the site works fine. I just want to disable anonymous authentication and have http://mysite go to http://mysite/default.aspx. Any ideas would be greatly appreciated.

  • General type conversion without risking Exceptions

    - by Mongus Pong
    I am working on a control that can take a number of different datatypes (anything that implements IComparable). I need to be able to compare these with another variable passed in. If the main datatype is a DateTime and I am passed a String, I need to attempt to convert the String to a DateTime to perform a date comparison; if the String cannot be converted to a DateTime, then do a String comparison. So I need a general way to attempt to convert from any type to any type. Easy enough, .NET provides us with the TypeConverter class. Now, the best I can work out to determine whether the String can be converted to a DateTime is to use exceptions: if ConvertFrom raises an exception, I know I can't do the conversion and have to do the string comparison. The following is the best I've got:

        string theString = "99/12/2009";
        DateTime theDate = new DateTime ( 2009, 11, 1 );
        IComparable obj1 = theString as IComparable;
        IComparable obj2 = theDate as IComparable;
        try
        {
            TypeConverter converter = TypeDescriptor.GetConverter ( obj2.GetType () );
            if ( converter.CanConvertFrom ( obj1.GetType () ) )
            {
                Console.WriteLine ( obj2.CompareTo ( converter.ConvertFrom ( obj1 ) ) );
                Console.WriteLine ( "Date comparison" );
            }
        }
        catch ( FormatException )
        {
            Console.WriteLine ( obj1.ToString ().CompareTo ( obj2.ToString () ) );
            Console.WriteLine ( "String comparison" );
        }

    Part of our standards at work states that exceptions should only be raised in an exceptional situation, i.e. when an error is encountered. But this is not an exceptional situation, so I need another way around it. Most variable types have a TryParse method which returns a boolean to let you determine whether the conversion succeeded, but there is no TryConvert method on TypeConverter. CanConvertFrom only determines whether it is possible to convert between the types and doesn't consider the actual data to be converted. The IsValid method is also useless. Any ideas?

    EDIT: I cannot use 'as' and 'is'. I do not know either data type at compile time, so I don't know what to 'as' and 'is' to!

    EDIT: OK, nailed the bastard. It's not as tidy as Marc Gravell's, but it works (I hope). Thanks for the inspiration, Marc. I'll work on tidying it up when I get the time, but I've got a big stack of bugfixes that I have to get on with.

        public static class CleanConverter
        {
            /// <summary>
            /// Stores the cache of all types that can be converted to all types.
            /// </summary>
            private static Dictionary<Type, Dictionary<Type, ConversionCache>> _Types =
                new Dictionary<Type, Dictionary<Type, ConversionCache>> ();

            /// <summary>
            /// Try parsing.
            /// </summary>
            public static bool TryParse ( IComparable s, ref IComparable value )
            {
                // First get the cached conversion method.
                Dictionary<Type, ConversionCache> type1Cache = null;
                ConversionCache type2Cache = null;

                if ( !_Types.ContainsKey ( s.GetType () ) )
                {
                    type1Cache = new Dictionary<Type, ConversionCache> ();
                    _Types.Add ( s.GetType (), type1Cache );
                }
                else
                {
                    type1Cache = _Types[s.GetType ()];
                }

                if ( !type1Cache.ContainsKey ( value.GetType () ) )
                {
                    // We haven't converted this type before, so create a new conversion
                    type2Cache = new ConversionCache ( s.GetType (), value.GetType () );

                    // Add to the cache
                    type1Cache.Add ( value.GetType (), type2Cache );
                }
                else
                {
                    type2Cache = type1Cache[value.GetType ()];
                }

                // Attempt the parse
                return type2Cache.TryParse ( s, ref value );
            }

            /// <summary>
            /// Stores the method to convert from Type1 to Type2
            /// </summary>
            internal class ConversionCache
            {
                internal bool TryParse ( IComparable s, ref IComparable value )
                {
                    if ( this._Method != null )
                    {
                        // Invoke the cached TryParse method.
                        object[] parameters = new object[] { s, value };
                        bool result = (bool)this._Method.Invoke ( null, parameters );
                        if ( result )
                            value = parameters[1] as IComparable;
                        return result;
                    }
                    else
                        return false;
                }

                private MethodInfo _Method;

                internal ConversionCache ( Type type1, Type type2 )
                {
                    // Use reflection to get the TryParse method from it.
                    this._Method = type2.GetMethod ( "TryParse", new Type[] { type1, type2.MakeByRefType () } );
                }
            }
        }

  • Middle East XML Currency Conversion

    - by Tim
    Hi, I'm using the following script to do currency conversion, which relies on an XML feed: http://www.white-hat-web-design.co.uk/articles/php-currency-conversion.php. It grabs the data from the following feed:

        var $xml_file = "www.ecb.int/stats/eurofxref/eurofxref-daily.xml";

    However, this XML feed has limited currencies, and I require currencies for the Middle East. Does anyone know where I can find an XML file with Middle East currencies, or have any better suggestions? Any help would be appreciated.
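
    For context, the script consumes the ECB feed along these lines (a minimal sketch of the parsing, matching the feed's published structure; any substitute feed would need to supply a similar currency/rate layout):

        <?php
        // Minimal sketch: read the ECB daily reference rates into an array.
        // The feed nests rates as <Cube><Cube time="..."><Cube currency="USD" rate="..."/>...
        $url = 'http://www.ecb.int/stats/eurofxref/eurofxref-daily.xml';
        $xml = simplexml_load_file($url);

        $rates = array('EUR' => 1.0); // rates are quoted against the Euro
        foreach ($xml->Cube->Cube->Cube as $cube) {
            $rates[(string)$cube['currency']] = (float)$cube['rate'];
        }

        // Convert 100 EUR to USD, for example.
        echo 100 * $rates['USD'];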

  • Can't find which row is causing conversion error

    - by Marwan
    I have the following table:

        CREATE TABLE [dbo].[Accounts1](
            [AccountId] [nvarchar](50) NULL,
            [ExpiryDate] [nvarchar](50) NULL
        )

    I am trying to convert nvarchar to datetime using this query:

        select convert(datetime, expirydate) from accounts1

    I get this error:

        Conversion failed when converting datetime from character string.

    The status bar says "2390 rows". I went to rows 2390, 2391 and 2392, and there is nothing wrong with the data there. I even tried to convert those particular rows and it works. How can I find out which row(s) is causing the conversion error?
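
    A diagnostic sketch that usually pins this down: ISDATE() returns 0 for strings SQL Server cannot interpret as dates, so selecting on it lists the offending rows directly (note that the failing row need not be anywhere near row 2390, since rows are not processed in display order, and ISDATE's verdict depends on the session's language/dateformat settings):

        -- List the rows whose ExpiryDate will not convert to datetime.
        SELECT AccountId, ExpiryDate
        FROM   dbo.Accounts1
        WHERE  ExpiryDate IS NOT NULL
          AND  ISDATE(ExpiryDate) = 0;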

  • Undetermined type conversion in VB.NET 2008

    - by user337501
    I figured this would be a quick Google, but extensive searching hasn't yielded any results; everything about type conversion seems to dance around this concept. I want to get the type of variable a and make a new variable named b of that type. Otherwise, I could have a as a type already declared and b simply as an Object, then try to cast b to the type of a:

        Dim a As Integer
        Dim b As Whatever a Is
        ' or
        TryCast(b, Whatever a Is)

    I would also like to make the conversion using a variable representation of the type, but can't find info on how to do that either. Sort of like:

        Dim a As Integer
        Dim b As Object
        Dim t As Type
        t = a.GetType()
        TryCast(b, t)

    Realizing I'm completely misusing TryCast here, I'm mostly trying to get my goal across. I figured it would be a quick, easy thing to do, but I can't really find any specific info on it. Any ideas?
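
    For comparison, a sketch of what the framework does offer here: Convert.ChangeType converts a value to a Type known only at runtime, and Activator.CreateInstance creates a new instance of one (both are real .NET APIs; they apply only when the values involved implement IConvertible or have usable constructors):

        Dim a As Integer = 42
        Dim b As Object = "123"

        ' The Type is a runtime value, not a compile-time name.
        Dim t As Type = a.GetType()

        ' Make a fresh instance of a's type (the "new variable of that type" idea).
        Dim fresh As Object = Activator.CreateInstance(t)

        ' Convert b to a's type; this throws if the value can't be represented,
        ' so guard it with Try/Catch or pre-validate in real code.
        Dim converted As Object = Convert.ChangeType(b, t)
        Console.WriteLine(converted) ' 123, now boxed as an Integer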

  • Data loss between conversion

    - by Alex Brooks
    Why is it that I lose data between the conversions below, even though both types take up the same amount of space? If the conversion were done bitwise, it should be true that x == z, unless data is being stripped during the conversion, right? Is there a way to do the two conversions without losing data (i.e. so that x == z)?

    main.cpp:

        #include <stdio.h>
        #include <stdint.h>

        int main()
        {
            double x = 5.5;
            uint64_t y = static_cast<uint64_t>(x);
            double z = static_cast<double>(y); // Desire: z = 5.5;

            printf("Size of double: %lu\nSize of uint64_t: %lu\n", sizeof(double), sizeof(uint64_t));
            printf("%f\n%lu\n%f\n", x, y, z);
        }

    Results:

        Size of double: 8
        Size of uint64_t: 8
        5.500000
        5
        5.000000
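
    For what it's worth, a sketch of the bit-preserving round trip: static_cast converts the value (truncating 5.5 to the integer 5), whereas copying the object representation keeps every bit, so x == z holds:

        #include <cstdint>
        #include <cstring>
        #include <cstdio>

        int main()
        {
            double x = 5.5;

            // Copy the 8 bytes of the double into the integer unchanged.
            uint64_t y;
            std::memcpy(&y, &x, sizeof x);

            // Copy them back: z gets exactly the same bit pattern, so z == x.
            double z;
            std::memcpy(&z, &y, sizeof z);

            // y holds the raw bit pattern of 5.5, not the value 5.
            std::printf("%f -> %llu -> %f\n", x, (unsigned long long) y, z);
            return 0;
        }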

  • Easy way for Crystal Reports to MS SQL Server Reporting Services conversion

    - by scoob
    Is there a way to easily convert Crystal Reports reports to the Reporting Services RDL format? We have quite a few reports that will need conversion soon. I know about the manual process (which is basically rebuilding all your reports from scratch in SSRS), but my searches pointed to a few possibilities for automatic conversion "acceleration" from several consulting firms (as described at http://www.microsoft.com/sql/technologies/reporting/partners/crystal-migration.mspx). Do any of you have any relevant experiences or recommendations regarding this particular issue? Are there any tools around that I don't know about?

  • Should I convert my AAC M4A files to MP3?

    - by j0rd4n
    Due to Apple, I have a large majority of my music files in the AAC M4A format. They do NOT have DRM, so I don't have to worry about that. I'm getting tired of Apple products and really want to switch to a different brand of player (and something more compatible with Linux). It appears most MP3 players support... well... MP3, and not AAC. Should I convert my library to be free of Apple and open to other players? Is this a lossless conversion? Can it be lossless? If I'll lose quality, I'm not interested. Am I even doing the right thing? AAC is the better format, but I'm not seeing a lot of support for it yet. I'll be honest and say that I need some education in this department. Any helpful advice is most welcome.

  • IIS7 default document for urlMapped url throws 403 error

    - by MorningZ
    Hopefully this all makes sense. I have a Web Application project on an IIS7 server that is "theme-able" using different master pages. As a result of what I am trying to do, the root of the project has no aspx files, so I am using the web.config's ability to rewrite "~/default.aspx" to "~/themes/a/default.aspx". This works great as long as I type in "http://www.mysite.com/default.aspx", but typing just "http://www.mysite.com" results in a "403 - Forbidden: Access is denied" error. I was hoping that the combination of urlMappings and the default document would be smart enough to handle this, but it's not:

        <system.webServer>
            <defaultDocument enabled="true">
                <files>
                    <clear />
                    <add value="default.aspx"/>
                </files>
            </defaultDocument>
        </system.webServer>

    I also tried:

        <system.webServer>
            <defaultDocument enabled="true">
                <files>
                    <clear />
                    <add value="~/themes/a/default.aspx"/>
                </files>
            </defaultDocument>
        </system.webServer>

    to no avail. I was hoping a browser would come in without a document defined, IIS7 would assume it was default.aspx, and the urlMapping would then map it accordingly, but nope. Any pointers? I've read a ton of posts here with similar issues, but not this exact issue.

  • Ms Excel Problem Linking Range in Source Document to Custom Function in Target Document

    - by user261935
    I have some custom MS Excel VBA code (MS Excel 2007) that takes a range as input and then does some work on it (it is quite a large range). I want to use a separate Excel document as the source of the range data. If I have both the source and target documents open, then the function works just fine. If I have only the target document open, I get #Value! returned, and stepping through in the debugger I see "Error 2023" in the data value passed in. Any ideas how I can make this work without having to open both spreadsheets simultaneously?

  • Would a model like this translate well to a document or graph database?

    - by Eric
    I'm trying to understand which of the models I have traditionally persisted relationally would translate well to some kind of NoSQL database. Suppose I have a model with the following relationships:

        Product  1-----0..N Order
        Customer 1-----0..N Order

    And suppose I need to frequently query things like all Orders, all Products, all Customers, all Orders for a given Customer, and all Orders for a given Product. My feeling is that this kind of model would not denormalize cleanly: if I had Product and Customer documents with embedded Orders, both documents would contain duplicate orders, so I think I'd need separate documents for all three entities. Does a characteristic like this typically indicate that a document database is not well suited to a given model? Generally speaking, would a document database perform as well as a relational database in this kind of situation? I know very little about graph databases, but I understand that a graph database handles relationships more performantly than a document database. Would a graph database be suited to this kind of model?
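
    To make the separate-documents option concrete, a sketch of the three-collection layout in a generic document store (field names invented for illustration); each order references its customer and product by id, so "all orders for a given customer" becomes a query on customerId rather than a scan of embedded lists:

        // products collection
        { "_id": "p1", "name": "Widget" }

        // customers collection
        { "_id": "c1", "name": "Acme Corp" }

        // orders collection: one document per order, holding references
        { "_id": "o1", "customerId": "c1", "productId": "p1", "qty": 3 }

        // e.g. all orders for a given customer (MongoDB-style shell syntax):
        // db.orders.find({ customerId: "c1" })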

  • Opening Word Document from IE

    - by Nalum
    Hello all, I'm opening a Word document through IE on a local network. It opens up fine, but if a document is password protected, it should prompt for the password, which it doesn't. Is there something I should be doing to get the password prompt? The way I'm opening the document is via a link on a web page, e.g.

        <a href="\\path\to\file.doc">Document</a>

    Thanks for any help.

  • doxilion document converter alternative

    - by Nrew
    Do you know of any alternative to the Doxillion document converter? When I try to convert .doc files into .pdf, the images are removed and the output .pdf file contains only text. Please don't suggest online converters, because I have slow internet.

  • Excel: Cell Value as Excel Document Metadata Property

    - by mjlefevre
    I know you can add custom Document Properties in Excel 2007 (see: http://office.microsoft.com/en-us/excel/HA100475241033.aspx#5), but I thought there was a way to pull a value from a cell as a custom metadata property without code. Maybe I'm searching for it with the wrong terminology; maybe it has to be done as a Named Range. I know this can be done. Anyone know how?
