Comments on Roland Bouman's blog: "When kettle's 'Get data From XML' is bombed by the BOM"

rpbouman (2017-05-21 19:46):

> Unfortunately, your github commit link is broken.

I'm sorry about that.

> but once it gets to Kettle it's no longer recognized as a valid gzip file.

I'm not entirely sure what you mean. But have you tried saving the response to a file and then uncompressing it?

rcgardne (2017-05-21 12:21):

Hi Roland, thanks for this info. You're the ONLY person I've been able to find who has even referenced a problem with gzip web response data in-stream:

"UPDATE2: Apart from the BOM, there seems to be a separate, independent problem when the XML is acquired from a URL and the server uses gzip compression."

Unfortunately, your github commit link is broken. I'm using the "HTTP POST" step to access JSON data from a server that comes back gzipped. I know the data response from the server is good (tested by routing the PDI request through Fiddler), but once it gets to Kettle it's no longer recognized as a valid gzip file. Have you had similar issues? Any suggestions?

Mauricio Murillo (2015-11-28 06:44):

Thanks for the guide, it helped me!
I had a similar issue with JSON, consuming a REST API that sent the data as application/json in UTF-8 with a BOM.

To fix it, I created a Java Expression with this code:

response.substring(1, response.length())

and then I was able to parse the JSON using the JSON Input step.
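Note that the expression above removes the first character unconditionally, so it would corrupt a response that arrives without a BOM. A minimal sketch of a more defensive variant (the `stripBom` helper and class name are illustrative, not part of Kettle; `response` plays the same role as the field in the Java Expression above): when a UTF-8 BOM survives decoding, it shows up as the single character U+FEFF at index 0, so we only strip when that character is actually there.

```java
// Hypothetical helper, not part of Kettle: strip a leading BOM only when it
// is actually present, so BOM-less responses pass through unchanged.
public class BomStrip {
    static String stripBom(String response) {
        // A UTF-8 BOM decoded into a Java String is the single char U+FEFF.
        if (response != null && !response.isEmpty() && response.charAt(0) == '\uFEFF') {
            return response.substring(1);
        }
        return response;
    }

    public static void main(String[] args) {
        System.out.println(stripBom("\uFEFF{\"ok\":true}")); // prints {"ok":true}
        System.out.println(stripBom("{\"ok\":true}"));       // prints {"ok":true}
    }
}
```

The same conditional can be written inline in a Kettle Java Expression if you prefer to stay inside the step.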
rpbouman (2014-08-01 22:55):
Hi unknown.

Thanks for your interest.

Look, questions like "do you think your fix will work?" are really not that smart; it's not like I'm trying hard to write about stuff that doesn't work.

Obviously I'm not guaranteeing anything; I simply documented what worked for me. If I were you, I'd try it and see where I got stuck. And if I found something specific to your particular problem, I'd blog about that, just like I did here.

So why don't you go ahead and try? I did my utmost best to explain exactly what I did. You only have to repeat it and see if it works for you.

Leo G (2014-08-01 21:20):

I have a similar issue. I'm using DI 4.4.2. My process reads a text file in UTF-8 with BOM encoding. I'm already specifying the UTF-8 encoding, and the BOM characters are gone when I preview the records, but the process errors out when the data gets inserted into a table. Do you think your fix would work? Thanks
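Returning to the gzip exchange at the top of the thread: one plausible way a gzip response stops being "a valid gzip file" is that the binary body was at some point decoded as text and re-encoded. This is a hypothetical illustration, not a diagnosis of Kettle internals: a gzip stream starts with the magic bytes 0x1f 0x8b, and 0x8b is not valid UTF-8, so a text round-trip mangles the header.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

// Illustration only: show that running gzip bytes through a UTF-8 text
// round-trip destroys the gzip magic bytes (0x1f 0x8b).
public class GzipRoundTrip {
    static boolean looksLikeGzip(byte[] b) {
        // Every gzip stream begins with the two magic bytes 0x1f 0x8b.
        return b.length > 2 && (b[0] & 0xff) == 0x1f && (b[1] & 0xff) == 0x8b;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write("{\"ok\":true}".getBytes(StandardCharsets.UTF_8));
        }
        byte[] compressed = bos.toByteArray();
        System.out.println(looksLikeGzip(compressed)); // prints true

        // Simulate a lossy text decode/re-encode of the binary body:
        // 0x8b is an invalid UTF-8 byte and becomes U+FFFD on decoding.
        byte[] mangled = new String(compressed, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);
        System.out.println(looksLikeGzip(mangled)); // prints false
    }
}
```

This is also why rpbouman's suggestion to save the raw response to a file and try to uncompress it outside Kettle is a good first diagnostic: if the saved file fails the magic-byte check, the bytes were already mangled in transit.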