I'm trying to build a custom LiDAR importer for Houdini with Python. So far the laspy module (https://pypi.org/project/laspy) does a nice, fast job of reading and writing *.las files and filtering by classification in Houdini. But this way I have to write out a *.las file and import it again, instead of getting the points directly inside Houdini.
Now I'm wondering whether I can grab the LiDAR points' xyz positions and feed them straight onto points inside Houdini. I tried to find something useful in the laspy manual, but couldn't find any example or function for this. I did something similar with a *.csv file containing xyz positions to build a simple GPS reader that outputs the positions as points in Houdini (using the csv module).
I've attached a screenshot with the original .las (grey), the filtered output.las (red roof), and the script example from the laspy manual.
Maybe there is a more elegant solution than laspy? I'm using Python 3 in Houdini, but 2.7 would be fine too.
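For reference, the CSV-based GPS reader mentioned above can be sketched roughly like this. The parsing part is plain Python; the hou calls at the bottom only run inside a Houdini Python SOP, and the file path and the x,y,z column order are assumptions:

```python
import csv
import io

def parse_xyz_rows(text):
    """Parse CSV text with x,y,z columns into a flat list of floats,
    the layout that hou.Geometry.setPointFloatAttribValues expects for "P"."""
    flat = []
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        x, y, z = (float(v) for v in row[:3])
        flat.extend([x, y, z])
    return flat

# Inside a Houdini Python SOP one could then write (not runnable outside Houdini):
# node = hou.pwd()
# geo = node.geometry()
# flat = parse_xyz_rows(open("/path/to/gps.csv").read())  # hypothetical path
# for _ in range(len(flat) // 3):
#     geo.createPoint()
# geo.setPointFloatAttribValues("P", flat)

print(parse_xyz_rows("1,2,3\n4,5,6\n"))
```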
Update: the answer from here works almost perfectly: https://forums.odforce.net/topic/46475-custom-lidar-importer-with-python/?tab=comments#comment-217104
```python
from laspy.file import File
import numpy as np

node = hou.pwd()
geo = node.geometry()

file_path = geo.attribValue("file_path")
inFile = File(file_path, mode='r')

# --- load point position
coords = np.vstack((inFile.x, inFile.y, inFile.z)).transpose()
scale = np.array(inFile.header.scale)
offset = np.array(inFile.header.offset)  # there is no offset in the simple.las example from the laspy library
# offset = np.array([1000, 20000, 100000])  # just for testing that the offset works
# geo.setPointFloatAttribValues("P", np.concatenate(coords))  # same as the Lidar Import SOP - seems that it ignores scale (and offset?)
geo.setPointFloatAttribValues("P", np.concatenate(coords * scale + offset))

# --- load color
color = np.vstack((inFile.red, inFile.green, inFile.blue)).transpose()
geo.addAttrib(hou.attribType.Point, "Cd", (1.0, 1.0, 1.0), False, False)  # add color attribute
geo.setPointFloatAttribValues("Cd", np.concatenate(color / 255.0))  # transform from 0-255 to 0.0-1.0 range
```
The only thing that doesn't work yet is inFile.classifications == x, which keeps crashing.
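The scale/offset line above relies on NumPy broadcasting: an (N, 3) coordinate array is multiplied by a length-3 per-axis scale, shifted by a length-3 offset, and then flattened into the x0, y0, z0, x1, ... layout that setPointFloatAttribValues expects. A small standalone check with made-up header values:

```python
import numpy as np

# two fake raw LAS coordinates, shaped (N, 3) like np.vstack(...).transpose()
coords = np.array([[100, 200, 300],
                   [400, 500, 600]], dtype=np.float64)
scale = np.array([0.01, 0.01, 0.001])      # per-axis scale from the LAS header
offset = np.array([1000.0, 2000.0, 50.0])  # per-axis offset from the LAS header

# broadcasting applies scale and offset per column (x, y, z)
world = coords * scale + offset
flat = np.concatenate(world)  # flatten row-wise: x0, y0, z0, x1, y1, z1, ...

print(flat.tolist())
```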
Posted on 2020-07-12 02:36:08
Petr from Odforce finished the LiDAR importer: https://forums.odforce.net/topic/46475-custom-lidar-importer-with-python/
The step that loads and reads the points:
```python
from laspy.file import File

node = hou.pwd()
geo = node.geometry()
geo.createPoint()  # dummy point for the point-generation stage

classification = hou.evalParm("classification")
file_path = hou.evalParm("lidar_file")
inFile = File(file_path, mode='r')

# store how many points the lidar file has for this classification
geo.addAttrib(hou.attribType.Global, "nb_of_points", 0, False, False)
geo.setGlobalAttribValue("nb_of_points", len(inFile.points[inFile.Classification == classification]))

# store file path
geo.addAttrib(hou.attribType.Global, "file_path", "", False, False)
geo.setGlobalAttribValue("file_path", file_path)

# store classification
geo.addAttrib(hou.attribType.Global, "classification", 0, False, False)
geo.setGlobalAttribValue("classification", classification)

# list the available point attributes
for spec in inFile.point_format:
    print(spec.name)
```
And the step that loads the attributes:
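The inFile.Classification == classification comparison above is an ordinary NumPy boolean mask, so counting and selecting points by class works the same way on any integer array (illustrated here with made-up class codes):

```python
import numpy as np

# fake per-point classification codes (e.g. 2 = ground, 6 = building in the LAS spec)
classification_codes = np.array([2, 2, 6, 6, 6, 9])

selection = classification_codes == 6      # boolean mask, one flag per point
print(int(selection.sum()))                # number of points in class 6
print(np.flatnonzero(selection).tolist())  # their indices
```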
```python
from laspy.file import File
import numpy as np

node = hou.pwd()
geo = node.geometry()

file_path = geo.attribValue("file_path")
inFile = File(file_path, mode='r')
classification = geo.attribValue("classification")
selection = inFile.Classification == classification
points = inFile.points[selection]

# --- load point position
coords = np.vstack((inFile.x, inFile.y, inFile.z)).transpose()
scale = np.array(inFile.header.scale)
offset = np.array(inFile.header.offset)  # there is no offset in the simple.las example from the laspy library
# offset = np.array([1000, 20000, 100000])  # just for testing that the offset works
# geo.setPointFloatAttribValues("P", np.concatenate(coords))  # same as the Lidar Import SOP - seems that it ignores scale (and offset?)
geo.setPointFloatAttribValues("P", np.concatenate(coords[selection] * scale + offset))

# --- load color (only if the point format actually stores red/green/blue)
missing_color = ["red", "green", "blue"]
for spec in inFile.point_format:
    if spec.name in missing_color:
        missing_color.remove(spec.name)
if not missing_color:
    color = np.vstack((inFile.red, inFile.green, inFile.blue)).transpose()
    geo.addAttrib(hou.attribType.Point, "Cd", (1.0, 1.0, 1.0), False, False)  # add color attribute
    geo.setPointFloatAttribValues("Cd", np.concatenate(color[selection] / 255.0))  # transform from 0-255 to 0.0-1.0 range

# --- load intensity
geo.addAttrib(hou.attribType.Point, "intensity", 0.0, False, False)  # add intensity attribute
geo.setPointFloatAttribValues("intensity", inFile.intensity[selection] / 512.0)  # scale the raw intensity values down
```
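In the attribute-loading step, the same boolean mask filters positions and colors before normalising. A self-contained version of that row slicing and the /255.0 rescale, with fake 8-bit colors:

```python
import numpy as np

coords = np.array([[0.0, 0.0, 0.0],
                   [1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])
color = np.array([[255, 0, 0],
                  [0, 255, 0],
                  [0, 0, 255]], dtype=np.float64)
selection = np.array([False, True, True])  # keep the last two points

# slice rows with the boolean mask, then flatten for the hou setters
flat_p = np.concatenate(coords[selection])
flat_cd = np.concatenate(color[selection] / 255.0)

print(flat_p.tolist())
print(flat_cd.tolist())
```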